The technology disclosed in the present specification relates to an information processing device executing a response operation to a user's input operation and a method of controlling the information processing device, and more particularly, to an information processing device used by being worn by a user and a method of controlling the information processing device.
In recent years, wearable devices used by being worn on various locations on users' bodies have become widespread. Wearable devices have been used to detect users' biological information, positional information, and states, to perform recording such as imaging or sound recording of the surrounding conditions of users, and to present various pieces of information to users by voice or the like. Wearable devices have been used in various fields, for example, the fields of life logs and athletic support.
For example, a neck band type wearable device which is a wearing unit that surrounds half of a user's neck from both the right and left sides of the neck to the rear side (back side) and that is worn around the user's neck has been proposed (see, for example, Patent Literature 1).
Such a type of wearable device includes an operation unit on which a touch operation and a slide operation can be performed so that a user can perform an input operation. In addition, a wearable device can also be configured to accept voice input using voice recognition. Through an input operation on the operation unit or a voice input, the user can give the wearable device various instructions, such as imaging, starting or stopping recording, starting or stopping voice reproduction, and requesting or stopping information presentation.
Patent Literature 1: WO2016/063587
An object of the technology disclosed in the present specification is to provide an excellent information processing device used by being worn on a user's body and capable of suitably executing a response operation with respect to the user's input operation, and a method of controlling the information processing device.
The technology disclosed herein has been devised in light of the problem described above, and a first aspect thereof is an information processing device including: a housing which is worn by a user; a detection unit which detects an operation performed on the housing; and a control unit which controls a response operation with respect to a result of the detection.
The detection unit detects one or more of the following: a deformation operation of the housing; an operation of touching the housing; an operation of grasping at least one of the right and left tip portions of the U-shaped housing; an operation of tracing the surface of the housing; an operation of shifting a wearing position of the housing; a rotation operation or a pressing operation performed on operators which are disposed at the right and left tip portions of the U-shaped housing and which accept rotation and pressing operations; and an operation of extending or contracting the right and left tip portions of the opening of the U-shaped housing.
In addition, a second aspect of the technology disclosed herein is a method of controlling an information processing device, the method including: a detection step of detecting an operation performed on a housing of the information processing device worn by a user; and a control step of controlling a response operation of the information processing device with respect to a result of the detection.
According to the technology disclosed in the present specification, it is possible to provide an excellent information processing device used by being worn on a user's body and capable of suitably executing a response operation with respect to the user's input operation, and a method of controlling the information processing device.
Note that the effects described in the present specification are merely examples, and the effects of the present invention are not limited thereto. The present invention may also provide additional effects other than those described above.
Other objects, features, and advantages of the technology disclosed in the present specification will become clearer from the detailed description based on the embodiment described later and the accompanying drawings.
Hereinafter, an embodiment of the technology disclosed in the present specification will be described in detail with reference to the accompanying drawings.
Naturally, a user can also use the information processing device 100 by attaching the housing 101 to any area of the body other than the neck. Alternatively, it is also possible to use the information processing device 100 by the same method as a normal information terminal by disposing it on, for example, a desk without the housing 101 being worn on a user's body.
The information processing device 100 includes a sound collecting unit such as a so-called microphone, and the sound collecting unit collects target sounds uttered by the user wearing the device as sound information. For example, sound collecting units 111 to 113 may be disposed at a plurality of locations on the housing 101 such as the tip portion of the housing 101 and near the right and left of the face of the user wearing the housing 101 on his or her own neck.
In addition, although not shown in the drawing, an imaging unit is disposed at the other tip portion of the housing 101.
Meanwhile, at least some of the sound collecting units 111 to 113 in the vicinity of the mouth of the user and the imaging unit can also be configured as external devices of the information processing device 100 which are separated from the housing 101, but it will be assumed in the following description that the sound collecting units 111 to 113 and the imaging unit are supported by the housing 101.
The information processing device 100 can recognize the user's utterance by analyzing the user's voice (sound information) collected by the sound collecting units 111 to 113 on the basis of a voice recognition technique or a natural language processing technique. For example, the information processing device 100 may recognize instruction content from the user's utterance and execute various processes (applications) in accordance with results of the recognition.
Meanwhile, it is assumed that the information processing device 100 is used outdoors as a wearable device carried by a user. In other words, the information processing device 100 is used in an environment where various types of noise, such as wind noise, noise accompanying vibration, and rustling accompanying the wearing of the device 100, are generated randomly. It is possible to extract the user's voice with high sensitivity even in such a random noise environment by disposing the plurality of sound collecting units 111 to 113 at appropriate locations and performing a beam forming process on the voice signals collected by the sound collecting units 111 to 113. However, details of the beam forming process will not be described here.
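Although the details of the beam forming process are outside the scope of the present specification, its principle can be illustrated with a minimal delay-and-sum sketch. The sampling rate, coordinates, and the use of Python with NumPy below are assumptions made purely for illustration and do not describe the device's actual implementation.

```python
# Illustrative delay-and-sum beam forming over several microphones.
# Geometry and sampling rate are assumptions for this sketch only.
import numpy as np

SPEED_OF_SOUND = 343.0  # [m/s]

def delay_and_sum(signals: np.ndarray, mic_positions: np.ndarray,
                  source_position: np.ndarray, fs: int = 16000) -> np.ndarray:
    """Align each channel on the assumed mouth position and average.

    signals: (num_mics, num_samples), one row per sound collecting unit.
    mic_positions, source_position: 3D coordinates in meters.
    """
    distances = np.linalg.norm(mic_positions - source_position, axis=1)
    # Delay closer microphones so all channels line up with the farthest one.
    delays = (distances.max() - distances) / SPEED_OF_SOUND
    shifts = np.round(delays * fs).astype(int)
    # np.roll wraps around; acceptable for a short illustrative window.
    aligned = np.stack([np.roll(s, k) for s, k in zip(signals, shifts)])
    return aligned.mean(axis=0)  # the voice from the mouth adds coherently
```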
In the information processing device 300, a pair of protrusion portions 302 and 303 protrude in the vicinity of the opening of the housing 301, that is, under the right and left tip portions. When the housing 301 is worn on the user's neck, the tips of the protrusion portions 302 and 303 abut against the vicinity of the user's collarbone. Therefore, it is possible to set an angle θ at which a plane formed by the ring of the housing 301 is inclined to a horizontal direction to a desired value by setting the protrusion portions 302 and 303 to appropriate lengths. That is, the protrusion portions 302 and 303 play a role of adjusting the inclination of the housing 301 in a state in which the user wears the housing 301 on his or her own neck.
In the information processing device 400, inclination adjustment portions 402 and 403 having a width that increases toward the front side are disposed in the vicinity of the opening of the housing 401, that is, at the right and left tip portions. When the housing 401 is worn on the user's neck, lower edges of the inclination adjustment portions 402 and 403 abut against the vicinity of the user's collarbone. Therefore, it is possible to set an angle θ at which a plane formed by the ring of the housing 401 is inclined to a horizontal direction to a desired value by setting the inclination adjustment portions 402 and 403 to appropriate widths.
When a beam forming process (described above) is performed on voice signals collected by the plurality of sound collecting units disposed on the neck band type housing, it is possible to optimize the process as long as it is possible to estimate the position of the mouth of the user serving as a sound source with a high level of accuracy.
A perpendicular line V descending from the user's mouth to the plane P intersects the circle C formed by the ring of the housing 501 when the angle θ at which the plane P formed by the ring of the housing 501 is inclined to a horizontal direction and the diameter d of the housing 501 are appropriately selected. When such a geometric relationship is formed, the perpendicular line V and the circle C intersect each other regardless of the degree of a wearing deviation of the housing 501, and thus it is possible to maintain high accuracy of the beam forming process. The inclination adjustment provided by the information processing devices 300 and 400 described above therefore also serves to maintain this geometric relationship.
Here, unlike the information processing devices 100, 300, and 400 described above, an information processing device 600 includes a housing 601 in which one tip portion is bent into an L-shape.
In the housing 601 of the information processing device 600, sound collecting units 611, 612, 613, and 614 such as so-called microphones are disposed at the above-described L-shaped bent portion and the above-described L-shaped tip portion, in the vicinity of the face of the user wearing the housing 601 on his or her own neck, and in the vicinity of the left tip of the housing 601. In addition, although not shown in the drawing, the housing 601 of the information processing device 600 also supports one or more imaging units, so that it is possible to image the front side of the user wearing the housing 601 and the surrounding scenery.
Here, additionally, the disposition of the plurality of sound collecting units 611, 612, 613, and 614 has the following effects (A) to (D) (see, for example, Patent Literature 1).
(A) Since the sound collecting units 611 and 612 are linearly disposed in the direction of the mouth of a user uttering a target sound, it is possible to efficiently emphasize the user's voice through a beam forming process.
(B) Since the sound collecting units 611 and 612 are linearly disposed in the direction of the ground, from which noise desired to be reduced arrives, it is possible to efficiently suppress the noise through a beam forming process.
(C) Since the plurality of sound collecting units 611 to 614 are three-dimensionally disposed, it is possible to suppress noise arriving from all directions through a beam forming process.
(D) Since the sound collecting unit 611 is disposed closest to the mouth of a user wearing the housing 601, it is possible to acquire a voice uttered by the user with a larger volume than other noises.
The information processing device 100 illustrated in the drawing includes the plurality of sound collecting units 111 to 113, an imaging unit 120, a voice output unit 130, an operation unit 140, a sensor unit 150, a communication unit 160, and a control unit 170.
The sound collecting units 111 to 113 acquire voice data, such as a voice uttered by a user wearing the housing 101 of the information processing device 100 on his or her own neck or a surrounding voice, for a beam forming process. Each of the sound collecting units 111 to 113 is constituted by, for example, a microphone. Here, the microphone may be an omnidirectional microphone (or a semi-directional microphone) rather than a directional microphone; being omnidirectional means that there is no insensitive region (direction) in the polar pattern. The sound collecting units 111 to 113 may each include a microphone amplifier circuit that amplifies an obtained voice signal and an A/D converter that converts the signal into a digital signal. The sound collecting units 111 to 113 output the acquired voice data to the control unit 170.
The imaging unit 120 includes a lens system which is constituted by an imaging lens, an aperture, a zoom lens, a focus lens, and the like, a driving system that causes the lens system to perform a focus operation and a zoom operation, an imaging element that performs photoelectric conversion of imaging light obtained by the lens system to generate an imaging signal, and the like (none of which is shown). The imaging element is constituted by, for example, a charge coupled device (CCD) sensor array or a complementary metal oxide semiconductor (CMOS) sensor array. The imaging unit 120 is disposed at a location where it can image the front side of the user, with the optical axis of its lens directed forward, in a state where the housing 101 of the information processing device 100 is worn on the user's neck. In this case, the imaging unit 120 can image, for example, a speaking partner who is in front of the user. Alternatively, the imaging unit 120 may be disposed at a location where the face of the user himself or herself can be imaged in a state where the housing 101 of the information processing device 100 is worn on the user's neck. The imaging unit 120 may automatically perform an imaging operation in accordance with the occurrence of an event such as the start of conversation between the user and a speaking partner, or may start or stop an imaging operation in accordance with an instruction input by the user. The imaging unit 120 outputs digitized image data to the control unit 170.
The voice output unit 130 is constituted by a speaker or the like, and for example, reproduces and outputs music during jogging of a user wearing the housing 101 of the information processing device 100 on his or her own neck or sends appropriate voice guidance when the user comes to a point of interest (POI). In addition, the voice output unit 130 may be constituted by, for example, a stereo type speaker and may localize a sound image.
The operation unit 140 can be operated by a user using a fingertip and has a function of receiving an input from the user. The operation unit 140 is constituted by a mechanical operator such as a button, a switch, a knob, or a slider, a touch sensor on which a fingertip operation such as tapping, swiping, flicking, or pinch-out can be performed, or the like. The user can give an instruction for turning on or turning off the power supply of the information processing device 100, starting or terminating imaging of the imaging unit 120, starting or terminating input of voices from the sound collecting units 111 to 113, starting or terminating output of voice information from the voice output unit 130, and the like through the operation unit 140. Meanwhile, in the technology disclosed in the present specification, a user can easily or intuitively perform an input operation by generating a predetermined event with respect to the sound collecting units 111 to 113, the imaging unit 120, the sensor unit 150, or the like without using the operation unit 140 (or using the operation unit 140 in combination), and details thereof will be described later.
The sensor unit 150 has various functions of detecting the state of a user wearing the housing 101 of the information processing device 100 on his or her own neck and the surrounding state of the user. The sensor unit 150 includes at least one sensor module among, for example, an acceleration sensor, a speed sensor, a gyro sensor, a geomagnetic sensor, a global positioning system (GPS) sensor, and a vibration sensor. In addition, not all of the sensor modules constituting the sensor unit 150 are required to be accommodated in the housing 101, and it is also possible to configure at least some of them as a device externally connected to the information processing device 100 and separated from the housing 101. For example, a wristband type pulsation sensor worn on a user's wrist or a vibration sensor built into a smartphone carried by a user may be utilized as a portion of the sensor unit 150. Meanwhile, the sensor unit 150 may include a biological sensor that acquires a user's biological information such as body temperature, pulsation, myoelectric potential, and perspiration.
The communication unit 160 is a communication module for the information processing device 100 to transmit and receive data to and from an external device in a wired or wireless manner or an input and output interface module for the information processing device 100 to input and output data to and from an external device. The communication unit 160 performs data communication with an external device directly or via an access point on a network in accordance with a wired local area network (LAN) such as Ethernet (registered trademark), a wireless LAN such as wireless fidelity (Wi-Fi) (registered trademark), Bluetooth (registered trademark) communication, near field communication (NFC), infrared communication such as IrDA, or other communication methods. The communication unit 160 wirelessly transmits voices collected by the sound collecting units 111 to 113 (or voice data further subjected to a beam forming process by the control unit 170 or voice recognition results) or image data such as a still image or a moving image captured by the imaging unit 120 to an external device in accordance with, for example, an instruction given from the control unit 170.
In addition, the communication unit 160 may include an interface and a receptacle for data transport such as a universal serial bus (USB), a high definition multimedia interface (HDMI) (registered trademark), a mobile high-definition link (MHL), and the like. The communication unit 160 outputs voices collected by the sound collecting units 111 to 113 and image data captured by the imaging unit 120 to an external device using such an interface in order to reproduce, output, and store the sounds and the image data. Naturally, it is also possible to take data such as an image from an external device into the information processing device 100 using such an interface.
In addition, the communication unit 160 may be equipped with a cellular communication protocol such as LTE-Advanced (LTE-A) to provide a conversation function. A user can have a conversation using the communication unit 160 while wearing the housing 101 of the information processing device 100 on his or her own body, such as on the neck, and being active outdoors.
Meanwhile, in a case where at least some of the functions of the control unit 170 to be described later are included in a smartphone carried by the user or another device such as a server on a cloud (none of which is shown), the communication unit 160 may transmit data acquired by at least one of the sound collecting units 111 to 113, the imaging unit 120, the operation unit 140, and the sensor unit 150 to the other device functioning as the control unit 170 and may receive processing results for the transmitted data (for example, results of a beam forming process and a voice recognition process) from the other device. In addition, the communication unit 160 may output voice data received from other devices as a voice from the voice output unit 130.
Further, in a case where at least one of the sound collecting units 111 to 113, the imaging unit 120, the operation unit 140, and the sensor unit 150 utilizes a function equipped by an external device (a smartphone carried by a user, or the like), the communication unit 160 may receive data acquired by the external device and output the received data to the control unit 170.
The control unit 170 is constituted by an integrated circuit such as a system-on-a-chip (SoC). A plurality of circuit modules for realizing functions such as a main controller, a main memory, a digital signal processing unit (video DSP) for video, and a digital signal processing unit (audio DSP) for sound are mounted on the SoC serving as the control unit 170. In addition, the above-described functions of the communication unit 160 can also be mounted on the same SoC as the control unit 170.
The control unit 170 (specifically, a main controller mounted on the SoC) controls the overall operation in the information processing device 100 in accordance with various programs. Specifically, the control unit 170 performs a beam forming process of voice signals collected by the plurality of sound collecting units 111 to 113, a voice recognition process based on voice data subjected to the beam forming process, imaging operation control (the start and termination of imaging, a developing process, and the like) of the imaging unit 120, output of acquired data (voice data, image data, sensor data, and the like) from the communication unit 160 to the outside, control of a device operation according to an instruction input by a user through the operation unit 140, and the like.
Meanwhile, at least some of the functions of the control unit 170 may be included in a smartphone carried by a user or another device such as a server on a cloud (none of which is shown). In such a case, the control unit 170 disposed on another device communicates with the main body side of the information processing device 100 through the communication unit 160, executes various processes such as a beam forming process and a voice recognition process on the basis of data received from the main body of the information processing device 100, and transmits processing results to the communication unit 160.
The neck band type information processing device 100 is used by being worn on a user's neck in various fields, for example, the fields of life logs and athletic support. The information processing device 100 provides, for example, services for reproducing and outputting music while the user is jogging and for sending appropriate voice guidance when the user comes to a POI. In addition, as described above, the information processing device 100 is equipped with the operation unit 140 constituted by a button, a switch, a knob, a touch sensor, or the like, and basically, a user can give an instruction for starting, stopping, or temporarily stopping the execution of a service, and the like, through the operation unit 140.
However, most of the housing 101 of the information processing device 100 worn on a user's neck is hidden by the face (chin) of the user, and thus it is difficult for the user to visually check the position and operation contents of the operation unit 140. In addition, since the surface area of the housing 101 is small and the operation unit 140 is configured as a small part, it is difficult to operate the operation unit accurately when it cannot be seen. Naturally, the configuration and disposition of the operation unit 140 are not standardized like a QWERTY keyboard layout, and it is difficult for a user to promptly and accurately touch and operate the operation unit 140. In addition, there is a limitation on the locations where parts such as a button, a switch, a knob, and a touch sensor can be disposed on the surface of the housing 101 having a small area, and thus it can also be said that it is difficult to receive all user input operations for the information processing device 100 using only the operation unit 140.
For example, when a user wearing the neck band type information processing device 100 on his or her own neck is acting while listening to music, and the user boards a public transportation vehicle or an elevator or someone suddenly speaks to the user in a loud voice, the user may feel embarrassed because it is not possible to operate the operation unit 140 accurately. In the first place, since the surface area of the housing 101 is small, it may be difficult in terms of design to dispose the plurality of parts (a button, a switch, a touch sensor, and the like) of the operation unit 140 on the surface of the housing 101 in a form with good operability.
Additionally, in a case where the information processing device 100 is equipped with a voice user interface (UI) function, voice input performance may deteriorate in a high-noise environment such as outdoors, and thus there is a concern that a user may be stressed.
Consequently, in the present specification, a technique for enabling a user to perform an input operation by utilizing the shape of the housing 101 of the neck band type information processing device 100 will be proposed below. As will be described later, a user can easily or intuitively perform an input operation by causing a predetermined event with respect to the sound collecting units 111 to 113, the imaging unit 120, the sensor unit 150, the housing 101, or the like without using the operation unit 140.
For example, when a user boards a public transportation vehicle or an elevator or when someone suddenly speaks to the user, the user does not need to feel embarrassed as long as the user can stop or temporarily stop music reproduction on the spot.
Hereinafter, operation examples of the neck band type information processing device 100 with respect to the sound collecting units 111 to 113, the imaging unit 120, and the housing 101, a method of measuring operation amounts in the operation examples, and the like will be described. Meanwhile, although illustration and description are omitted, it should be understood that the same housing operation and operation amount measurement method can be applied to other types of information processing devices 300, 400, 600, and the like.
(1) Housing Deformation Operation
In a state where a user is wearing the information processing device 100 on his or her own neck, the opening portion of the U-shaped housing 101 faces forward (described above). Therefore, the user can perform a deformation operation of closing the opening of the U-shaped housing 101 with a fingertip.
In addition, the user can perform a fingertip operation (pinch-out) of opening the thumb and the index finger after inserting the thumb and the index finger of the left hand (or the right hand) into both the right and left tip portions of the U-shaped housing 101, thereby widening the opening.
At least a portion (for example, the U-shaped bottom) of the housing 101 is formed of an elastic material, so that the opening and closing operations of the U-shaped housing 101 described above can be realized.
Examples of a method of detecting that a user has performed opening and closing operations of the U-shaped housing 101 include a method of measuring the distance of the gap of the opening portion, a method of measuring the movement of at least one of the right and left tip portions, a method of measuring a distortion amount of a bendable portion of the housing 101, and the like. It is assumed that the sensor unit 150 includes a sensor that detects that opening and closing operations have been performed by any one of these methods.
As an example of the method of measuring a distance of a gap of the opening portion, a method of disposing a distance measurement sensor in the vicinity of the tip of the U-shaped housing 101 is conceivable. For example, in a case of a distance measurement sensor using infrared light, an infrared light emitting element is disposed at either one of the right and left tips of the opening portion, and an infrared light receiving element is disposed at the other tip. First, features of changes in sensor values of the distance measurement sensor which occur when a user has performed an operation of deforming the housing 101 are recorded in advance in association with identification information of the deformation operation. Further, in a case where the recorded sensor values are obtained from the output of the distance measurement sensor while the user is using the information processing device 100, it is possible to detect that the user has performed an operation of deforming the housing 101.
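As a rough illustration of this record-then-match scheme, the following sketch reduces the recorded features to simple gap thresholds; the threshold values and the per-frame reading interface are assumptions for the example, not values from the specification.

```python
# Thresholding sketch of deformation detection from gap-distance readings.
from collections import deque

GAP_CLOSED_MM = 20.0  # assumed gap when the user pinches the tips together
GAP_OPENED_MM = 60.0  # assumed gap when the user pulls the tips apart

class DeformationDetector:
    def __init__(self, rest_gap_mm: float, window: int = 10):
        self.rest = rest_gap_mm          # gap measured with no operation
        self.history = deque(maxlen=window)

    def update(self, gap_mm: float) -> str | None:
        """Feed one distance reading; return a detected operation, if any."""
        self.history.append(gap_mm)
        if min(self.history) <= GAP_CLOSED_MM < self.rest:
            self.history.clear()
            return "close"
        if max(self.history) >= GAP_OPENED_MM > self.rest:
            self.history.clear()
            return "open"
        return None
```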
The distance of the gap of the opening portion hardly changes except when the user performs a deformation operation. Therefore, according to the method of measuring the distance of the gap of the opening portion, it is possible to stably detect that the user has performed an operation of deforming the housing 101.
However, the distance measurement sensor measuring a distance of a gap of the opening portion is hardly used for purposes other than a purpose of detecting an operation of deforming the housing 101 by a user and it is difficult to divert it to other uses. In addition, it is difficult to design the inside of the housing 101 having a narrow accommodation space because a location for installing the distance measurement sensor is limited in order to measure a distance with high accuracy to a certain extent. In addition, there is also a concern that the addition of the distance measurement sensor having no other uses may result in an increase in the burden on manufacturing costs due to an increase in the cost of parts.
In addition, as an example of the method of measuring the movement of the tip portions of the U-shaped housing 101, a method of disposing an acceleration sensor at at least one tip portion to measure the movement of the tip portion accompanying the user's opening and closing operations is conceivable.
In addition to a case where the user has performed a deformation operation of opening and closing the opening portion, the acceleration of the tip portion of the housing 101 continuously fluctuates as long as the user moves. For this reason, in the method of measuring the movement of the tip portion of the U-shaped housing 101 using the acceleration sensor, an algorithm for preventing changes in acceleration, occurring except when a desired deformation operation is performed, from being erroneously detected is required. For example, erroneous detection may be prevented by discriminating between changes in acceleration when the user has performed an operation of deforming the housing 101 and changes in acceleration which have occurred in other cases by machine learning. For example, the accuracy of detection may be improved by adopting deep learning.
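As one hedged illustration of such discrimination, the sketch below trains an off-the-shelf classifier on labeled acceleration windows; the feature set and the choice of scikit-learn's random forest are assumptions made for this example.

```python
# Illustrative discriminator for deformation-induced acceleration changes.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def window_features(window: np.ndarray) -> np.ndarray:
    """Summary statistics over one window of 3-axis acceleration samples."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           np.abs(np.diff(window, axis=0)).max(axis=0)])

def train(windows: list[np.ndarray], labels: list[int]) -> RandomForestClassifier:
    """labels: 1 for windows recorded during the deformation operation."""
    clf = RandomForestClassifier(n_estimators=100)
    clf.fit(np.stack([window_features(w) for w in windows]), labels)
    return clf

def is_deformation(clf: RandomForestClassifier, window: np.ndarray) -> bool:
    return bool(clf.predict(window_features(window)[None, :])[0])
```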
Meanwhile, the information processing device 100 is equipped with an acceleration sensor as a portion of the sensor unit 150 for the purpose of detecting the state of a user wearing the housing 101 on his or her neck and the surrounding state of the user (described above). Since the acceleration sensor can thus be shared with other uses, the method of measuring the movement of the tip portion of the U-shaped housing 101 neither increases the number of parts nor adds much cost, thereby reducing the burden on design.
In addition, as an example of the method of measuring a distortion amount of a bendable portion of the housing 101, a method of disposing a dynamic sensor such as a distortion gauge (strain gauge) at the bendable portion is conceivable. The distortion gauge is a device that uses a bridge circuit to measure changes in the electric resistance of a metal resistor caused by deformation, and the measured changes in electric resistance can be converted into a distortion amount. First, features of changes in electric resistance of the distortion gauge which occur when a user has performed an operation of deforming the housing 101 are recorded in advance in association with identification information of the deformation operation. Further, in a case where the recorded changes in electric resistance are obtained from the distortion gauge while the user is using the information processing device 100, it is possible to detect that the user has performed an operation of deforming the housing 101.
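For reference, the conversion from a bridge voltage to a distortion (strain) amount can be sketched as follows; the quarter-bridge configuration, gauge factor, and excitation voltage are illustrative assumptions, not values from the specification.

```python
# Converting a bridge voltage into a distortion (strain) amount.
GAUGE_FACTOR = 2.0   # typical metal-foil gauge factor (assumed)
V_EXCITATION = 3.3   # bridge excitation voltage [V] (assumed)

def strain_from_bridge(v_out: float) -> float:
    # First-order quarter-bridge relation: v_out / V_ex ~= GF * strain / 4.
    return 4.0 * (v_out / V_EXCITATION) / GAUGE_FACTOR
```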
When it can be assumed that the housing 101 is not bent (or bending does not occur at a location where the distortion gauge is disposed) except when the user has performed a deformation operation, it is possible to stably detect that the user has performed an operation of deforming the housing 101 by a method of measuring a distortion amount of the bendable portion of the housing 101.
However, the distortion gauge measuring a distortion amount of the bendable portion of the housing 101 is hardly used for purposes other than detecting an operation of deforming the housing 101 by a user and is hardly diverted to other uses, and thus there is a concern that an increase in the cost of parts may result in an increase in the burden on manufacturing costs. In addition, since the distortion gauge is used, for example, by being attached to the surface of the bendable portion of the housing 101, there is also a concern that the wiring design for pulling a signal line from the distortion gauge into the housing 101 may become complicated.
Meanwhile, the method of measuring the distance of the gap of the opening portion of the housing is based on the assumption that the housing is a structure having an opening portion, such as a U-shaped structure. Therefore, this method cannot be applied to information processing devices 700 and 800 constituted by a housing which has a closed shape with no opening portion, such as an O-shape.
(2) Operation of Touching Tip Portion of Housing
In a state where a user is wearing the information processing device 100 on his or her own neck, the opening portion of the U-shaped housing 101 faces forward (described above).
Since the U-shaped opening portion faces forward in a state where the user wears the housing 101, which surrounds the neck halfway from both the right and left sides to the rear side (back side), around the neck, the user can perform an operation of touching, holding, or covering either or both of the right and left tip portions of the U-shaped opening portion.
There are various methods of detecting that a user has touched the surface of the housing 101. An example of one of the methods is a method of disposing a proximity sensor or a touch sensor at a location where a user's touch operation is scheduled to be performed, such as the tip portion of the housing 101. In this case, first, features of changes in sensor values of the proximity sensor or the touch sensor which occur when a user has performed an operation of touching the location of the housing 101 are recorded in advance in association with identification information of the touch operation. Further, in a case where the recorded sensor values are obtained from the output of the proximity sensor or the touch sensor while the user is using the information processing device 100, it is possible to detect that the user has performed an operation of touching the location of the housing 101.
However, in a case where the proximity sensor or the touch sensor is hardly used for purposes other than a purpose of detecting an operation of a user touching the housing 101 and is hardly diverted to other uses, there is a concern that the cost of parts may increase and the design of wiring may become complicated.
Consequently, as another method of detecting that a user has touched the surface of the housing 101, a method of detecting an operation of the user touching the housing 101 by utilizing the plurality of sound collecting units 111 to 113 constituted by a microphone or the like and the imaging unit 120 will also be proposed.
First, a method of detecting that a user has touched the surface of the housing 101 by utilizing the sound collecting units 111 to 113 will be described.
The sound collecting units 111 to 113 are disposed at a plurality of locations including the tip portion of the U-shaped housing 101. When the hole for the microphone of any one of the sound collecting units 111 to 113 is blocked due to a user's operation of touching the surface of the housing 101 with a fingertip, the sound pressure observed by that sound collecting unit changes characteristically.
Therefore, first, features of sound pressure changes, occurring in the sound collecting units 111 to 113 due to the blocking of the holes for the microphones when the user has performed an operation of touching a specific location of the housing 101, are recorded in advance in association with identification information of the operation, such as the touched location. Further, in a case where the recorded sound pressure changes are obtained from any one of the sound collecting units 111 to 113 while the user is using the information processing device 100, it is possible to detect that the user has performed a touch operation at the corresponding location of the housing 101. Further, as in the present embodiment, in a configuration in which the sound collecting units 111 to 113 are disposed at a plurality of locations on the housing 101, it is possible not only to define a response operation on the information processing device 100 side for each location where a sound pressure change (that is, a touch) has been detected but also to define a response operation for each combination of two or more locations where a sound pressure change has occurred.
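A minimal sketch of this location mapping, assuming per-microphone sound pressure levels in decibels and a hypothetical location table, might look as follows.

```python
# Mapping per-microphone sound pressure drops to touched locations.
import numpy as np

PRESSURE_DROP_DB = 12.0  # assumed drop when a microphone hole is covered
LOCATION = {0: "left tip", 1: "right front", 2: "rear"}  # hypothetical layout

def touched_locations(levels_db: np.ndarray,
                      baseline_db: np.ndarray) -> set[str]:
    """Return the housing locations whose microphones show a large drop."""
    blocked = np.flatnonzero(baseline_db - levels_db >= PRESSURE_DROP_DB)
    return {LOCATION[i] for i in blocked}

# A response operation can then be bound per location or per combination,
# e.g. {"left tip"} or {"left tip", "rear"} (hypothetical bindings).
```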
There is a possibility that sound pressure changes of the sound collecting units 111 to 113 may also occur due to an event other than the blocking of the holes for the microphones of the sound collecting units 111 to 113 by a user's hand. For this reason, in the method of detecting that a user has touched the surface of the housing 101 using the sound collecting units 111 to 113, an algorithm for preventing a sound pressure change, occurring except when a user's touch operation is performed, from being erroneously detected is required. For example, erroneous detection may be prevented by discriminating between sound pressure changes of the sound collecting units 111 to 113 occurring when the user has performed an operation of touching the housing 101 and sound pressure changes of the sound collecting units 111 to 113 occurring in other cases by machine learning. For example, the accuracy of detection may be improved by adopting deep learning.
Next, the method of detecting that a user has touched the surface of the housing 101 by utilizing the imaging unit 120 will be described.
The imaging unit 120 is disposed at the tip portion of the U-shaped housing 101 in order to easily image a subject on the front side. Naturally, another imaging unit may be installed at a location other than the tip portion. When the front side of the imaging unit 120 (the optical axis direction of its lens) is blocked due to a user's operation of touching the surface of the housing 101 with a fingertip, the luminance of the image captured by the imaging unit 120 changes characteristically.
Therefore, first, features of luminance changes, occurring in the imaging unit 120 when the user has performed an operation of touching the imaging unit 120 at the tip portion of the housing 101, are recorded in advance in association with identification information of the operation, such as the touched location. Further, in a case where the recorded luminance changes are obtained from the imaging unit 120 while the user is using the information processing device 100, it is possible to detect that the user has performed a touch operation at the corresponding location of the housing 101. Further, in a configuration in which imaging units are disposed at a plurality of locations on the housing 101, it is possible not only to define a response operation on the information processing device 100 side for each location where a luminance change (that is, a touch) has been detected but also to define a response operation for each combination of two or more locations where a luminance change has occurred. Additionally, it is also possible to define a response operation on the information processing device 100 side with respect to combinations of a location where a luminance change has been detected by the imaging unit 120 and locations where sound pressure changes have been detected by the sound collecting units 111 to 113.
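A corresponding sketch for the luminance cue, assuming 8-bit grayscale frames and illustrative thresholds, is shown below; requiring both an absolute dark level and a large relative drop is one simple way to reduce misdetection in ordinarily dark scenes.

```python
# Detecting that the imaging unit has been covered from a luminance drop.
import numpy as np

LUMA_COVERED = 20.0  # assumed mean 8-bit luminance under a covering hand

def imaging_unit_covered(frame_gray: np.ndarray, baseline_luma: float) -> bool:
    """frame_gray: 2D array of 8-bit luminance values from the imaging unit."""
    mean_luma = float(frame_gray.mean())
    # Require both an absolute dark level and a large relative drop.
    return mean_luma < LUMA_COVERED and mean_luma < 0.25 * baseline_luma
```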
There is a possibility that a luminance change of the imaging unit 120 may also occur due to an event other than the touch of the imaging unit 120 by a user's hand. For this reason, in the method of detecting that a user has touched the surface of the housing 101 using the imaging unit 120, an algorithm for preventing a luminance change, occurring except when a user's touch operation is performed, from being erroneously detected is required. For example, erroneous detection may be prevented by discriminating between luminance changes of the imaging unit 120 occurring when the user has performed an operation of touching the housing 101 and luminance changes of the imaging unit 120 occurring in other cases by machine learning. For example, the accuracy of detection may be improved by adopting deep learning.
(3) Operation of Grasping Tip Portion of Housing
In a state where a user wears the information processing device 100 on his or her own neck, the opening portion of the U-shaped housing 101 faces forward (described above).
Since the U-shaped opening portion faces forward in a state where the user wears the housing 101, which surrounds the neck halfway from both the right and left sides to the rear side (back side), around the neck, the user can perform an operation of grasping either or both of the right and left tip portions of the U-shaped opening portion.
There are various methods of detecting that a user has grasped the tip portion of the U-shaped housing 101. For example, it is possible to detect that the left tip portion has been grasped on the basis of a sound pressure change of the sound collecting unit 111 disposed at the left tip portion of the housing 101. In addition, it is possible to detect that the right tip portion has been grasped on the basis of a luminance change of the imaging unit 120 disposed at the right tip portion of the housing 101. An algorithm such as machine learning may be introduced in order to prevent a sound pressure change and a luminance change from being erroneously detected (described above).
In the method of detecting a sound pressure change of the sound collecting unit 111 and a luminance change of the imaging unit 120, there is a problem that it is not possible to discriminate whether a tip portion has been touched or grasped. Consequently, the grasping of the tip portion may be detected on the basis of changes in pressure by disposing pressure-sensitive sensors at the right and left tip portions of the housing 101. In this case, first, features of changes in sensor values of the pressure-sensitive sensors which occur when a user has performed an operation of grasping the right and left tip portions of the housing 101 are recorded in advance. Further, in a case where the recorded sensor values are obtained from the output of the pressure-sensitive sensors while the user is using the information processing device 100, it is possible to detect that the user has performed an operation of grasping the tip portions of the housing 101. Naturally, it may be detected that the user has grasped the tip portions by combining the sensor values of the pressure-sensitive sensors and detection results of a sound pressure change of the sound collecting unit 111 and a luminance change of the imaging unit 120 with each other.
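Such a combination can be sketched as a simple fusion rule; the force threshold and the three-way classification below are assumptions made purely for illustration.

```python
# A simple fusion rule separating grasping from a light touch.
GRASP_FORCE_N = 2.0  # assumed force separating a grasp from a touch

def classify_tip_event(force_n: float, mic_blocked: bool,
                       camera_covered: bool) -> str:
    if force_n >= GRASP_FORCE_N:
        return "grasp"           # pressure-sensitive sensor fired strongly
    if mic_blocked or camera_covered:
        return "touch"           # only the acoustic or optical cue fired
    return "none"
```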
(4) Operation of Tracing Surface of Housing
In a state where a user wears the information processing device 100 on his or her own neck, the opening portion of the U-shaped housing 101 faces forward (described above).
Since the U-shaped opening portion faces forward in a state where the user wears the housing 101, which surrounds the neck halfway from both the right and left sides to the rear side (back side), around the neck, the user can perform a tracing operation on either or both of the right and left side surfaces of the housing 101.
Here, each of the right and left side surfaces of the housing 101 can be traced in two directions, frontward and backward. Therefore, a user can perform the following eight input operations by combining the operations performed on the right and left side surfaces of the housing 101. In addition, it is possible to define a response operation on the information processing device 100 side with respect to each of the operations.
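One plausible reading of this count of eight, offered purely as an illustration, is that each side surface is either left untouched or traced in one of the two directions, excluding the case where neither side is operated; the sketch below enumerates the combinations under that assumption.

```python
# Enumerating the eight combinations under the stated assumption.
from itertools import product

SIDE_STATES = (None, "frontward", "backward")  # per side: idle or one trace

combos = [(left, right) for left, right in product(SIDE_STATES, repeat=2)
          if not (left is None and right is None)]
assert len(combos) == 8
# Each combination can be bound to its own response operation, e.g.
# ("frontward", None) -> next track (hypothetical binding).
```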
There are various methods of detecting that a user has traced the side surface of the housing 101. For example, it is possible to detect a contact position and movement of a fingertip by disposing capacitance type touch sensors on the right and left side surfaces of the housing 101. In this case, first, features of changes in sensor values of the touch sensors which occur when a user has performed an operation of tracing the right and left side surfaces of the housing 101 are recorded in advance in association with an operation surface and an operation direction. Further, in a case where the recorded sensor values are obtained from the output of the touch sensors while the user is using the information processing device 100, it is possible to detect that the user has performed an operation of tracing either the right or left side surface of the housing 101.
Meanwhile, in addition to the case where the user traces the side surface of the housing 101 with a fingertip, the touch sensor also reacts when the user touches the side surface of the housing 101 with a fingertip or with a portion of the body other than a fingertip. For this reason, an algorithm for preventing touch sensor outputs other than those of a user's tracing operation with a fingertip from being erroneously detected is required. For example, erroneous detection may be prevented by performing machine learning on the output of the touch sensor obtained when the user has performed an operation of tracing the side surface of the housing 101. For example, the accuracy of detection may be improved by adopting deep learning.
(5) Operation of Shifting Wearing Position of Housing
In a state where a user wears the information processing device 100 normally (or usually) on his or her own neck, the opening portion of the U-shaped housing 101 faces in the front direction of the user. Meanwhile, the user can rotate the housing 101 around his or her own neck to shift the opening so that it faces rightward or leftward from the front side.
Since the U-shaped opening portion faces forward in a state where the user wears the housing 101, which surrounds the neck halfway from both the right and left sides to the rear side (back side), around the neck, the user can perform an operation of rotating the housing 101 around his or her own neck to shift the wearing position so that the opening faces either rightward or leftward from the front side.
There are various methods of detecting that a user has shifted a wearing position of the housing 101 and detecting a shifting direction. For example, a method of measuring an acceleration generated when a user shifts a wearing position of the housing 101 by using an acceleration sensor included in the information processing device 100 as the sensor unit 150 is conceivable. First, features of changes in sensor values of the acceleration sensor which occur when a user has performed an operation of shifting a wearing position of the housing 101 worn around his or her own neck are recorded in advance in association with identification information of the operations of shifting the wearing position to the right and left. Further, in a case where the recorded sensor values are obtained from the output of the acceleration sensor while the user is using the information processing device 100, it is possible to detect that the user has performed an operation of shifting a wearing position of the housing 101 either rightward or leftward. Meanwhile, the movement of the tip portion may be measured with higher accuracy by combining a speed sensor, a gyro sensor, a vibration sensor, or the like with the acceleration sensor.
In addition to a case where the user has performed an operation of shifting a wearing position of the housing 101, the acceleration of the tip portion of the housing 101 continuously fluctuates as long as the user moves. For this reason, in the method of measuring an operation of shifting a wearing position of the housing 101 by the acceleration sensor, an algorithm for preventing changes in acceleration, occurring except when an operation of shifting a wearing position has been performed, from being erroneously detected is required. For example, erroneous detection may be prevented by discriminating between changes in acceleration when the user has performed an operation of shifting a wearing position of the housing 101 and changes in acceleration which have occurred in other cases by machine learning. For example, the accuracy of detection may be improved by adopting deep learning.
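A heuristic sketch of the direction detection, assuming a tangential acceleration axis and an illustrative peak threshold, is given below; in practice the learned discrimination discussed above would replace such a fixed threshold.

```python
# Heuristic reading of the shift direction from tip acceleration.
import numpy as np

SHIFT_PEAK = 3.0  # assumed tangential-acceleration peak [m/s^2]

def detect_shift(accel_tangential: np.ndarray) -> str | None:
    """accel_tangential: one window of acceleration along the housing's
    left-right tangential direction (positive taken as rightward here
    by assumption)."""
    peak = accel_tangential[np.abs(accel_tangential).argmax()]
    if abs(peak) < SHIFT_PEAK:
        return None
    return "right" if peak > 0 else "left"
```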
Meanwhile, the information processing device 100 is equipped with an acceleration sensor as a portion of the sensor unit 150 for the purpose of detecting the state of a user wearing the housing 101 on his or her neck and the surrounding state of the user (described above). Since the acceleration sensor can thus be shared with other uses, the method of measuring an operation of shifting the wearing position neither increases the number of parts nor adds much cost, thereby reducing the burden on design.
(6) Operation on Operator
Here, the user can perform eleven input operations in total by combining rotation operations and pressing operations performed on the left and right protrusion portions 302 and 303. In addition, it is possible to define a response operation on the information processing device 300 side with respect to each of the operations.
(7) Extension and Contraction Operations of Housing with Respect to Tip Portion
For example, the tip portions of the housing 101 are formed of an extensible material, so that the extension and contraction operations of the tip portions can be realized.
Since the U-shaped opening portion faces forward in a state where the user wears the housing 101, which surrounds the neck halfway from both the right and left sides to the rear side (back side), around the neck, the user can perform an operation of extending or contracting either or both of the right and left tip portions.
Examples of the input operations (1) to (7) that can be performed mainly on the housing 101 of the information processing device 100 by a user have been described so far. It should be understood that the information processing device 100 can accept not only any one of the input operations (1) to (7) alone but also any two or more of the input operations (1) to (7) in combination.
When the above-described input operations are performed by a user, the information processing device 100 returns a response operation to the user.
(11) Response Operation with Respect to Operation of Deforming Housing
When a user has performed the operations of deforming the housing described above, the information processing device 100 executes a response operation allocated in advance to the detected deformation operation.
As another example of a response operation, an index of a list in an application being currently executed by the information processing device 100 may be changed by the number of deformation operations. For example, the information processing device 100 performs fast-forwarding or rewinding by the number of pieces of music equivalent to the number of deformation operations during music reproduction. Alternatively, the reproduction position of content may be jumped forward or backward by the number of chapters equivalent to the number of deformation operations.
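A sketch of this index change, with a hypothetical playlist interface, is as follows.

```python
# Changing a list index by the number of detected deformation operations.
# The playlist interface is a hypothetical placeholder.
def jump_index(current: int, num_operations: int,
               direction: int, list_length: int) -> int:
    """direction: +1 for fast-forwarding, -1 for rewinding."""
    return max(0, min(list_length - 1, current + direction * num_operations))
```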
A deformation operation of closing the opening of the housing 101 can be performed promptly, and thus an urgent processing command for an application being currently executed, such as stopping or temporarily stopping music reproduction, may be allocated to the operation.
(12) Response Operation with Respect to Operation of Touching Tip Portion of Housing
When a user continuously touches either one of the right and left tip portions of the opening of the housing 101, the information processing device 100 executes a response operation allocated in advance to the touched location. An operation of touching the left tip portion of the opening of the housing 101 and an operation of touching the right tip portion can be allocated to different response operations.
In a configuration example in which the imaging unit 120 is disposed at the right tip portion of the opening of the housing 101, an imaging operation of the imaging unit 120 may be started as a response operation of the information processing device 100 when an operation of touching only the left tip portion of the opening of the housing 101 for a fixed period of time (the right tip portion is left released) is detected. When an operation of touching the right tip portion of the opening of the housing 101 has been performed during an imaging operation of the imaging unit 120 or when a reduction in luminance due to the blocking of a field of view of the imaging unit 120 by a user's hand or other foreign substances, or the like is detected, the imaging operation of the imaging unit 120 may be stopped or temporarily stopped as a response operation of the information processing device 100.
(13) Response Operation with Respect to Operation of Grasping Tip Portion of Housing
When a user performs an operation of grasping either one of the right and left tip portions of the opening of the housing 101, the information processing device 100 executes a response operation allocated in advance to the grasping operation.
It is easy for a user to promptly perform an operation of grasping the tip portion of the housing 101. Consequently, an urgent processing command for an application being currently executed may be allocated as a response operation of the information processing device 100 with respect to an operation of grasping the tip portion of the housing 101.
For example, when a user promptly grasps the tip portion of the opening at a timing at which the user does not want music to be reproduced during music reproduction using the information processing device 100, the music reproduction is stopped, temporarily stopped, or muted as a response operation. Therefore, when the user boards a public transportation vehicle or an elevator, the user does not have to feel embarrassed by resounding reproduced sounds. In addition, even when someone suddenly speaks to the user, the user can respond to the conversation without being disturbed by the reproduced sounds.
Although an operation of grasping either one of the right and left tip portions of the opening of the housing 101 by a user is promptly performed, it is not easy to promptly perform an operation of simultaneously grasping both the right and left tip portions. Consequently, when a user performs an operation of simultaneously grasping both the right and left tip portions of the opening of the housing 101, an emergency operation such as the transition of the system to a sleep state or a shutdown state may be allocated as a response operation of the information processing device 100.
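The allocation described above can be sketched as a simple dispatch rule; the command names are hypothetical placeholders, not commands defined in the specification.

```python
# Dispatching grasp events following the single-tip / both-tips distinction.
def respond_to_grasp(left_grasped: bool, right_grasped: bool) -> str | None:
    if left_grasped and right_grasped:
        return "enter_sleep"      # emergency operation: both tips grasped
    if left_grasped or right_grasped:
        return "pause_playback"   # urgent command: one tip grasped
    return None
```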
(14) Response Operation with Respect to Operation of Tracing Side Surface of Housing
When a user performs an operation of tracing either one of the right and left side surfaces of the housing 101, the information processing device 100 executes a response operation allocated in advance to the combination of the traced side surface and the tracing direction.
An operation of tracing the side surface of the housing 101 can be easily detected simply by disposing, for example, capacitance type touch sensors on the right and left side surfaces of the housing 101, and can be utilized as an input method for a user. In addition, a user can perform an input operation simply by "tracing" the side surface, and can perform eight input operations by combining the right and left side surfaces of the housing 101 and the tracing directions. Therefore, it is also possible to set a rule preferred by a user and to execute personal authentication, credit settlement, and the like by means of such operations.
(15) Response Operation with Respect to Operation of Shifting Wearing Position of Housing
It is easy for a user to promptly perform an operation of shifting the wearing position of the housing 101 to the right or left. Consequently, an urgent processing command for an application being currently executed may be allocated as a response operation of the information processing device 100 with respect to an operation of shifting the wearing position.
For example, when a user promptly shifts the wearing position of the housing 101 to the right or left at a timing at which the user does not want music to be reproduced during music reproduction using the information processing device 100, the music reproduction is stopped, temporarily stopped, or muted as a response operation. Therefore, when the user boards a public transportation vehicle or an elevator, the user does not have to feel embarrassed by resounding reproduced sounds. In addition, even when someone suddenly speaks to the user, the user can respond to the conversation without being disturbed by the reproduced sounds.
In addition, when a user promptly shifts a wearing position of the housing 101 to the right or left during conversation using the information processing device 100, a response process such as the termination of conversation or the setting of a holding state may be executed.
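Configuration (16) below states that such a shift and its direction can be detected from an acceleration sensor. As a minimal sketch under assumed conventions (the axis orientation, threshold value, and handler objects are all assumptions introduced here), the detection and dispatch could look like this:

```python
# Minimal sketch: classifying a prompt lateral acceleration spike as a
# shift of the wearing position, then dispatching an urgent response.
# The axis convention, threshold value, and handler objects are assumed.

from typing import Optional

SHIFT_THRESHOLD = 3.0  # m/s^2 along the wearer's left-right axis (assumed)

def detect_shift(lateral_accel: float) -> Optional[str]:
    """Return 'right' or 'left' when the spike exceeds the threshold."""
    if lateral_accel > SHIFT_THRESHOLD:
        return "right"
    if lateral_accel < -SHIFT_THRESHOLD:
        return "left"
    return None

def on_shift(direction: str, player, call):
    # The same urgent allocations as in the text: mute or pause playback,
    # or terminate/hold an ongoing conversation.
    if player.is_playing():
        player.pause()
    elif call.is_active():
        call.hold()
```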
(16) Response Operation with Respect to Operation for Operator
Operators capable of performing at least one of a rotation operation and a pressing operation are disposed at the right and left tip portions of the U-shaped opening of the housing 101. When a user performs a rotation operation or a pressing operation on either of the right and left operators, the information processing device 100 executes a response operation allocated to the operation.
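For illustration, such operator events might be dispatched as sketched below. The specific allocations shown (volume for rotation, playback toggling for pressing) and all names are assumptions introduced here, not allocations stated in the specification.

```python
# Hypothetical sketch: dispatching rotation and press events from the
# operators at the right and left tip portions. The allocations shown
# (volume adjustment, playback toggling) are assumed examples.

def on_operator_event(side: str, kind: str, amount: float, device):
    """Handle an event from the operator at one tip portion."""
    if kind == "rotate":
        # A rotation amount suits a continuous quantity, e.g. volume.
        device.adjust_volume(amount)
    elif kind == "press":
        # A press suits a discrete command, e.g. start/stop reproduction.
        device.toggle_playback()
```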
(17) Response Operation with Respect to Extension and Contraction Operations of Tip Portion of Housing
The right and left tip portions of the U-shaped opening of the housing 101 are configured to be capable of extension and contraction. When a user performs an extension operation or a contraction operation on either of the right and left tip portions, the information processing device 100 executes a response operation allocated to the operation.
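As one way of picturing the detection, a linear position reading per tip portion could be thresholded into extension and contraction events. This sketch is illustrative only; the sensor interface, dead band, and input names are assumptions introduced here.

```python
# Hypothetical sketch: turning length changes of the right and left tip
# portions into input operations. The 2 mm dead band and the input names
# are assumed values for illustration.

def on_tip_length_change(side: str, prev_mm: float, curr_mm: float, device):
    """Treat a sufficiently large length change as an input operation."""
    delta = curr_mm - prev_mm
    if abs(delta) < 2.0:  # ignore jitter below the assumed dead band
        return
    if delta > 0:
        device.handle_input(f"{side}_tip_extended")
    else:
        device.handle_input(f"{side}_tip_contracted")
```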
As described above, according to the technology disclosed in the present specification, a user can perform various input operations by utilizing the shape of the housing 101 of the neck band type information processing device 100. As a result, the user can select the UI that is most comfortable for the situation.
The technology disclosed in the present specification has been described in detail above with reference to a specific embodiment. However, it will be obvious to those skilled in the art that modifications and replacements of the embodiment can be made without departing from the scope of the technology disclosed in the present specification.
In the present specification, the description has focused on an embodiment in which the technology disclosed in the present specification is applied to a neck band type wearable device, but the scope of the technology disclosed in the present specification is not limited thereto. The technology disclosed in the present specification can be similarly applied to various forms of wearable devices other than the neck band type. In addition, the technology disclosed in the present specification can also be applied to various types of information processing devices other than wearable devices, including tablet terminals and notebook computers.
In addition, the information processing device to which the technology disclosed in the present specification is applied is used by being worn around a user's neck, for example, as a neck band type wearable device. Naturally, the information processing device can be used by being worn on any area of the user's body other than the neck, or can also be used without being worn on the body, for example, by being placed on a desk.
In addition, the information processing device to which the technology disclosed in the present specification is applied can be used in various fields, such as the fields of life logs and athletic support, for example, as a neck band type wearable device. The information processing device can provide a user with various services, such as music reproduction while the user wearing the device is jogging, capturing of moving images or still images, and provision of appropriate voice guidance.
In short, the technology disclosed in the present specification has been described in an illustrative form, and the description in the present specification should not be interpreted restrictively. The claims should be taken into account in judging the gist of the technology disclosed in the present specification.
Additionally, the present technology may also be configured as below.
(1)
An information processing device including:
a housing which is worn by a user;
a detection unit which detects an operation performed on the housing; and
a control unit which controls a response operation with respect to a result of the detection.
(2)
The information processing device according to (1),
in which the detection unit detects a deformation operation of the housing.
(3)
The information processing device according to (1) or (2),
in which the housing has a U-shape, and
the detection unit detects at least one of a closing operation and an opening operation of a U-shaped opening of the housing.
(4)
The information processing device according to (3),
in which the detection unit detects an operation performed on the housing using at least one of measurement of a distance of a gap of the U-shaped opening of the housing, measurement of movement of at least one tip portion of the opening, and measurement of distortion of the housing.
(5)
The information processing device according to any of (1) to (4),
in which the detection unit detects an operation of touching the housing.
(6)
The information processing device according to (5),
in which the housing has a U-shape, and
the detection unit detects an operation of touching at least one of right and left sides of the U-shaped opening of the housing.
(7)
The information processing device according to (5) or (6),
in which the detection unit detects an operation of touching the housing on the basis of a detection result of a proximity sensor or a touch sensor.
(8)
The information processing device according to (5) or (6), further including:
at least one of a sound collecting unit and an imaging unit,
in which the detection unit detects an operation of touching a location where the sound collecting unit is disposed in response to a change in sound pressure in the sound collecting unit or detects an operation of touching a location where the imaging unit is disposed in response to a change in luminance in the imaging unit.
(9)
The information processing device according to any of (1) to (8),
in which the housing has a U-shape, and
the detection unit detects an operation of grasping at least one of the right and left tip portions of the U-shaped opening of the housing.
(10)
The information processing device according to (9), further including:
at least one of a sound collecting unit and an imaging unit,
in which the detection unit detects an operation of grasping a location where the sound collecting unit is disposed in response to a change in sound pressure in the sound collecting unit or detects an operation of grasping a location where the imaging unit is disposed in response to a change in luminance in the imaging unit.
(11)
The information processing device according to (9),
in which the detection unit detects an operation of grasping the housing on the basis of a detection result of a pressure-sensitive sensor.
(12)
The information processing device according to any of (1) to (11),
in which the detection unit detects an operation of tracing a surface of the housing.
(13)
The information processing device according to (12),
in which the housing has a U-shape, and
the detection unit detects which one of right and left surfaces of the U-shaped opening of the housing has been traced and detects a tracing direction.
(14)
The information processing device according to (12) or (13),
in which the detection unit detects an operation of tracing the surface of the housing on the basis of a detection result of a touch sensor.
(15)
The information processing device according to any of (1) to (14),
in which the detection unit detects an operation of shifting a wearing position of the housing.
(16)
The information processing device according to (15),
in which the detection unit detects an operation of shifting the wearing position of the housing and a shifting direction on the basis of a detection result of an acceleration sensor.
(17)
The information processing device according to any of (1) to (16),
in which the housing has a U-shape,
the information processing device further comprises operators capable of performing at least one of a rotation operation and a pressing operation on right and left tip portions of the U-shaped opening of the housing, and
the detection unit detects a rotation operation or a pressing operation of the right and left operators.
(18)
The information processing device according to any of (1) to (17),
in which the housing has a U-shape, and
the detection unit detects extension and contraction operations of right and left tip portions of the U-shaped opening of the housing.
(19)
The information processing device according to any of (1) to (18),
in which features of a change in a detection value obtained when an operation is performed on the housing are recorded in advance, and the detection unit detects an operation on the housing when a detection value matching the recorded features is obtained while a user is using the information processing device.
(20)
A method of controlling an information processing device, the method including:
a detection step of detecting an operation performed on a housing of the information processing device worn by a user; and
a control step of controlling a response operation of the information processing device with respect to a result of the detection.
Number | Date | Country | Kind
---|---|---|---
2016-248786 | Dec 2016 | JP | national

Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/JP2017/040237 | Nov. 8, 2017 | WO | 00