The present disclosure relates to an information processing apparatus, an information processing method, and a computer program product.
In recent years, a multifunctional information processing apparatus such as a mobile phone has been developed. For this reason, a user can obtain a variety of information using various functions mounted to the information processing apparatus. For example, the user displays a map on a screen of the information processing apparatus and obtains specific information (for example, peripheral information of a position shown by a keyword) by searching for the keyword.
Japanese Patent Application Publication No. 2009-150839 discloses a navigation apparatus that enables path display, map display, and other display adaptable to a change in the movement speed.
However, in the related art, when specific information on a map is selected, it is necessary for the user to perform an operation for setting a selection range. For this reason, a method of automatically setting a selection range in accordance with the intention of the user, without requiring the user to perform the setting operation, should be realized to improve the convenience of the user.
It is desirable to provide a method of automatically setting a selection range for selecting a selection object on a map in accordance with the intention of a user.
In an information processing apparatus embodiment, the apparatus includes a control unit that determines content to be displayed within an object range on a map; and an action recognition processing unit that detects a user-related action, wherein
Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
The following description will be made in the order described below.
3. Functional Configuration of Information Processing Apparatus according to Present Disclosure
An outline of an information processing apparatus 100 according to the present disclosure will be described with reference to
The information processing apparatus 100 is a portable terminal such as a mobile phone or a PDA that is used by a user. The information processing apparatus 100 has a function of displaying map information on a screen and selecting peripheral information.
As illustrated in
The input unit 110 has a function of receiving an input of operation information from a user of the information processing apparatus 100. The input unit 110 is configured using an input device such as a switch, a button, a touch panel, a keyboard, and a mouse.
The display unit 120 has a function of displaying a variety of information, on the basis of control from the control unit 160. For example, the display unit 120 displays map information. The display unit 120 is configured using a display device such as a liquid crystal display, a plasma display, and an organic EL display.
The communication unit 130 is a communication interface that has a function as a transmitting unit and a receiving unit performing communication with an external apparatus, on the basis of the control from the control unit 160. The communication unit 130 is configured using a communication device such as a wired or wireless LAN, a communication card for Bluetooth, a router for communication, and a modem for communication.
The position information detecting unit 140 that is a global positioning system (GPS) receiver performs wireless communication with the external apparatus and detects position information (information such as latitude and longitude) of the information processing apparatus 100. For example, the GPS receiver receives data showing orbit information and time information from a plurality of GPS satellites and detects a position of the information processing apparatus 100, on the basis of the information shown by the data or a lag of an arrival time of the data. The position information detecting unit 140 is not limited to the GPS receiver. For example, the position information detecting unit 140 may receive information from an access point of Wi-Fi or a radio frequency identification (RFID) system and detect the position of the information processing apparatus 100. The position information detecting unit 140 may receive data showing information regarding a cell where the information processing apparatus 100 is positioned, from a base station of a mobile phone network of the information processing apparatus 100 (mobile phone) and detect the position of the information processing apparatus 100.
The motion sensor 142 detects information regarding motion or a state of the information processing apparatus 100. For example, a triaxial acceleration sensor (including an acceleration sensor, a gravity detection sensor, and a fall detection sensor) or a triaxial gyro sensor (including an angular velocity sensor, a blurring correction sensor, and a geomagnetic sensor) is used as the motion sensor 142.
The storage unit 150 has a function of storing a variety of information used by the control unit 160. For example, the storage unit 150 stores position information that is acquired by the position information detecting unit 140 and motion information that is acquired by the motion sensor 142. The storage unit 150 is configured using a storage device such as a magnetic storage device, a semiconductor storage device, and an optical storage device.
The control unit 160 has a function of controlling an entire operation of the information processing apparatus 100. For example, the control unit 160 can control the operation of the information processing apparatus 100, on the basis of the operation information output from the input unit 110, the position information acquired by the position information detecting unit 140, and the motion information acquired by the motion sensor 142. The control unit 160 includes a CPU, a ROM, and a RAM.
As described above, the information processing apparatus 100 has the function of displaying the map information and selecting the peripheral information.
Before describing the display of the map information according to the present disclosure, display of map information according to examples will be described with reference to
As such, in the examples illustrated in
In order to meet the request for realizing the method, the information processing apparatus 100 according to the present disclosure automatically sets a selection range for selecting a selection object (peripheral information) on the map, according to the motion information acquired by the motion sensor 142 and current position information acquired by the position information detecting unit 140.
A functional configuration of the information processing apparatus 100 will be described with reference to
As illustrated in
The position data acquiring unit 210 acquires position data of the information processing apparatus 100 (user). The position data acquiring unit 210 acquires the position data of the information processing apparatus 100, on the basis of the position information detected by the position information detecting unit 140. For example, the position data acquiring unit 210 acquires current position information of the information processing apparatus 100 (user). The position data acquiring unit 210 outputs the acquired position data to the movement information calculating unit 230 and the motion recognition processing unit 240.
The operation data acquiring unit 220 acquires operation data regarding motion or a state of the information processing apparatus 100. The operation data acquiring unit 220 acquires the operation data regarding the motion or the state of the information processing apparatus 100, on the basis of the information detected by the motion sensor 142. The operation data acquiring unit 220 outputs the acquired operation data to the motion recognition processing unit 240.
The movement information calculating unit 230 calculates the movement information of the information processing apparatus 100 (user), on the basis of the position data input from the position data acquiring unit 210. Specifically, when the user who carries the information processing apparatus 100 moves, the movement information calculating unit 230 calculates a movement direction or a movement speed (that is, movement vector) of the user, on the basis of the position data. The movement information calculating unit 230 outputs the calculated movement information to an object range determining unit 250.
The motion recognition processing unit 240 recognizes the motion of the user who carries the information processing apparatus 100. The motion recognition processing unit 240 recognizes the motion of the user of the information processing apparatus 100, on the basis of the position data input from the position data acquiring unit 210 and the operation data input from the operation data acquiring unit 220, and acquires motion information of the user. The motion recognition processing unit 240 outputs the acquired motion information to the object range determining unit 250 or the peripheral information selecting unit 270.
The motion recognition processing unit 240 can identify a type of each motion, when the user takes a plurality of motions. For example, the motion recognition processing unit 240 identifies whether the user is walking, running, or is on a vehicle such as a train. The motion recognition processing unit 240 can calculate the duration of one motion as motion information of the user. For example, the motion recognition processing unit 240 calculates a time when the user is walking. When the user who carries the information processing apparatus 100 moves in an elevator in a structure having a plurality of floors, the motion recognition processing unit 240 can recognize that the elevator ascends or descends and acquire information regarding ascending or descending of the elevator.
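The disclosure does not specify the internal logic of this recognition. As a rough, hypothetical illustration, a minimal speed-threshold classifier (all names and threshold values below are assumptions, not part of the disclosure) could distinguish the motion types as follows:

```python
# Hypothetical speed thresholds (m/s); the actual recognition logic of the
# motion recognition processing unit 240 is not specified in the disclosure.
STILL_MAX_MPS = 0.3
WALK_MAX_MPS = 2.5
RUN_MAX_MPS = 6.0

def classify_motion(speed_mps):
    """Classify a user motion type from the movement speed alone."""
    if speed_mps < STILL_MAX_MPS:
        return "still"
    if speed_mps < WALK_MAX_MPS:
        return "walking"
    if speed_mps < RUN_MAX_MPS:
        return "running"
    return "vehicle"
```

A practical recognizer would also use the output of the motion sensor 142 (for example, acceleration variance or step detection) rather than speed alone.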
The object range determining unit 250 sets an object range (corresponding to a selection range) to select a selection object on the map, according to the motion information acquired by the motion recognition processing unit 240 and the current position information acquired by the position data acquiring unit 210. In this case, the object range is automatically set on the basis of the motion information and the current position information of the user. For this reason, even though the user does not perform the operation for setting the object range, a useful object range is set that corresponds to the context in which the user is placed (information such as the position information and the motion information, in which the intention of the user is reflected).
For example, when the object range determining unit 250 sets the object range, the object range determining unit 250 can change a shape showing an area of the object range. In this case, the object range determining unit 250 can select the area of the object range, according to the motion of the user.
The object range determining unit 250 can change the shape showing the area of the object range, according to the movement vector. Thereby, because the object range according to a movement direction or a movement speed of the user is set, an object range mismatched with a movement aspect of the user can be prevented from being set.
The object range determining unit 250 can set the object range such that the current position shown by the acquired current position information and the center of the object range become different positions. Thereby, even when the movement speed of the user is fast, an object range based on the spot of the movement destination can be set according to the movement speed.
The object range determining unit 250 can change the shape showing the area of the object range, according to the duration of the motion of the user. Thereby, an object range suitable for the motion can be set by changing the width of the object range according to the motion of the user.
For example, as the duration of one motion lengthens, the object range determining unit 250 sets the area of the object range more narrowly. Thereby, when the user does not move, information suitable for the state of the user can be provided by limiting (or excluding) the peripheral information little by little.
Alternatively, as the duration of one motion lengthens, the object range determining unit 250 may set the area of the object range more widely. That is, the area of the object range is not widened until the motion has continued for a certain time. Thereby, as compared with the case in which the object range is set widely by only slight movement of the user, peripheral information regarding remote regions can be prevented from being selected even though the user moves.
When the elevator is ascending, the object range determining unit 250 can set floors higher than a floor where the elevator is positioned as the object range, and when the elevator is descending, the object range determining unit 250 can set floors lower than the floor where the elevator is positioned as the object range. In this case, because peripheral information of floors corresponding to a movement direction of the user in a structure is provided to the user, peripheral information of floors in a direction opposite to the movement direction of the user can be prevented from being provided.
The object range determining unit 250 can set the object range such that the current position shown by the acquired current position information is not included in the object range. Thereby, even when the movement speed of the user is fast, the object range based on the spot of the movement destination can be set according to the movement speed.
The object range determining unit 250 can set the object range such that the position of the movement destination of the user becomes the center of the object range. When the user gets on the vehicle and the movement speed is fast, the object range determining unit 250 can set the position of the movement destination as the center of the object range and set the object range to include peripheral information of the spot matched with the intention of the user.
For example, if the movement speed of the user increases, the object range determining unit 250 can set the area of the object range widely and if the movement speed of the user decreases, the object range determining unit 250 can set the area of the object range narrowly. Thereby, because the object range can be set adaptively according to the movement speed of the user, the object range determining unit 250 can set the object range to include peripheral information of the spot matched with the intention of the user.
The peripheral information acquiring unit 260 acquires the peripheral information on the map to be displayed on the screen of the display unit 120. For example, the peripheral information acquiring unit 260 acquires data regarding the peripheral information from an external apparatus (server), through the communication unit 130. The peripheral information acquiring unit 260 outputs the acquired peripheral information to the peripheral information selecting unit 270.
The peripheral information selecting unit 270 selects the selection object from the object range set by the object range determining unit 250. The peripheral information selecting unit 270 can select a selection object that corresponds to a type of the motion of the user. For example, the peripheral information selecting unit 270 selects the selection object from the object range, to confine the peripheral information matched with a category of the motion of the user. The peripheral information selecting unit 270 outputs the confined peripheral information to the peripheral information providing unit 280.
The peripheral information providing unit 280 provides, on the screen, the selection object matched with the intention of the user among the selection objects displayed on the display unit 120, on the basis of the selection information input from the peripheral information selecting unit 270. When the user is moving, the peripheral information providing unit 280 automatically updates the object range and switches the display.
Next, specific examples of setting of the object range by the object range determining unit 250 will be described using the following embodiments.
A first embodiment of setting of the object range will be described with reference to
In the first embodiment, the width of the object range differs according to the motion of the user (i.e., a type of user-related action). Another type of user-related action is interpretation of information (such as text-based messaging sent over wired or wireless networks) contained in messages sent via the communication unit 130 (
When the user is walking or running, the user can move in all directions. Therefore, the object range determining unit 250 sets a shape of the object range to a shape of a circle that spreads equally from the center. In this case, the center corresponds to the current position P of the user. For this reason, the object range mismatched with the motion of the user can be prevented from being set.
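As a sketch, selecting peripheral information from such a circular object range reduces to a distance test against the current position P; the function and parameter names below are assumptions:

```python
import math

def in_circular_range(center, point, radius):
    """True when a peripheral-information point lies inside the circular
    object range that spreads equally from the center (current position P)."""
    return math.hypot(point[0] - center[0], point[1] - center[1]) <= radius
```

A point exactly on the boundary is treated as inside, which is an arbitrary design choice.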
According to the first embodiment described above, the object range suitable for the motion is set by changing the width of the object range according to the motion of the user and the user can perceive the peripheral information suitable for the motion.
In the above description, the area of the object range has the circular shape. However, the present disclosure is not limited thereto. The object range determining unit 250 can set the shape of the area of the object range to various shapes, according to the motion of the user.
A method of adjusting the elliptical object range 340 according to the modification will be described with reference to
Q = P + kV (k: any constant)  Expression (1)
a = c1|V| (c1: any constant)  Expression (2)
b = c2|V| (c2: any constant)  Expression (3)
k|V| < b  Expression (4)
θ = tan⁻¹(Vy/Vx) (Vx, Vy: X and Y components of V)  Expression (5)
In this case, a shows the minor axis of the ellipse, b shows the major axis of the ellipse, and θ shows its inclination. In addition, the vector P shows the current position of the user, the vector Q shows the center of the object range, and the vector V shows the movement vector of the user.
Thereby, the object range that includes the current position and is matched with the movement direction or the movement speed of the user can be calculated.
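A minimal Python sketch of Expressions (1) through (5) follows; the constants k, c1, and c2 are the arbitrary constants of the expressions, with purely illustrative default values, and atan2 is used as a robust form of the arctangent in Expression (5):

```python
import math

def elliptical_range(P, V, k=0.5, c1=30.0, c2=60.0):
    """Elliptical object range per Expressions (1)-(5).

    P: current position (x, y); V: nonzero movement vector (Vx, Vy).
    k, c1 and c2 are the arbitrary constants of the expressions; the
    default values here are illustrative only.
    """
    speed = math.hypot(V[0], V[1])
    Q = (P[0] + k * V[0], P[1] + k * V[1])  # center,      Expression (1)
    a = c1 * speed                          # minor axis,  Expression (2)
    b = c2 * speed                          # major axis,  Expression (3)
    assert k * speed < b                    # Expression (4): P stays inside
    theta = math.atan2(V[1], V[0])          # inclination, Expression (5)
    return Q, a, b, theta
```

With these defaults, Expression (4) holds automatically for any nonzero movement vector, so the current position always remains inside the ellipse.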
The object range determining unit 250 may adjust the width of the object range, according to the duration of one motion.
As such, adjusting the object range according to the duration of the motion is effective in the following respect. As compared with the case in which the object range is set widely by only slight movement of the user, peripheral information regarding remote regions can be prevented from being selected, because the object range is not widened until a certain time passes even though the user moves. As a result, excessive information can be prevented from being provided to the user.
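One hedged way to realize this duration-dependent adjustment (all constants are assumed tuning values, not taken from the disclosure) is a clamped scale factor applied to a base radius:

```python
def duration_adjusted_radius(base_radius, duration_s, widen=True,
                             rate=0.5, limit=3.0):
    """Adjust the object-range radius with the duration of one motion.

    widen=True widens the range only gradually as the motion continues,
    so a brief movement does not pull in remote regions; widen=False
    narrows it little by little while the user remains still.
    """
    factor = min(1.0 + rate * (duration_s / 60.0), limit)  # grows per minute
    return base_radius * factor if widen else base_radius / factor
```

The clamp at `limit` keeps a very long motion from expanding (or shrinking) the range without bound.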
A second embodiment of setting of the object range will be described with reference to
Even in the second embodiment, the width of the object range is different according to the motion of the user. Although the object range set when the user moves has been described in the first embodiment, an object range set when the user is stopped will be described in the second embodiment. As illustrated in
As such, the reason why the object range of the specific radius is set even though the user is stopped in the second embodiment is that the user who is stopped may move somewhere thereafter. Because the possibility of the user moving when the user is standing is higher than the possibility of the user moving when the user is seated, the radius R3 is set to be smaller than the radius R4 (which is smaller than the radius R1 illustrated in
In the second embodiment described above, the object range suitable for the motion is set by changing the width of the object range according to the motion of the user and the user can perceive the peripheral information suitable for the motion.
However, even in the second embodiment, the object range determining unit 250 may adjust the width of the object range, according to the duration of one motion.
That is, as illustrated in
As such, the object range is narrowed when the duration of the motion is lengthened and the peripheral information is limited (or excluded) little by little when the user stops and does not move anywhere, such that information suitable for a state of the user can be provided.
A third embodiment of setting of the object range will be described with reference to
In the third embodiment, setting of the object range in a case where the user gets on an elevator, which is an example of an elevating object, and ascends and descends in a structure will be described. In this case, it is assumed that the structure has nine floors from a first floor (1F) to a ninth floor (9F).
When the motion recognition processing unit 240 recognizes that the user is on the elevator and is ascending in the structure (ascending state), the object range determining unit 250 sets all of floors higher than a floor where the elevator is positioned as an object range 410. For example, as shown in
Meanwhile, when the motion recognition processing unit 240 recognizes that the user is on the elevator and is descending in the structure (descending state), the object range determining unit 250 sets all of floors lower than a floor where the elevator is positioned as an object range 420. For example, as shown in
The object range determining unit 250 can update the object range, according to the position of the moving elevator. Thereby, the user can perceive peripheral information suitable for the position of the elevator in real time.
According to the third embodiment described above, because the peripheral information of the floors corresponding to the ascending/descending direction of the elevating object the user is on in the structure is provided to the user, the peripheral information of the floors in the direction opposite to the movement direction of the user can be prevented from being provided. As a result, peripheral information according to the intention of the user can be provided. In the above description, the elevator is used as the elevating object. However, even when the elevating object is an escalator, the same effect can be achieved.
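For the nine-floor structure of this embodiment, the floor selection can be sketched as follows; the function name and the direction encoding are assumptions:

```python
def elevator_object_range(current_floor, direction, bottom_floor=1, top_floor=9):
    """Floors forming the object range while the user rides an elevating object.

    While ascending, all floors above the current floor are selected;
    while descending, all floors below it.
    """
    if direction == "up":
        return list(range(current_floor + 1, top_floor + 1))
    if direction == "down":
        return list(range(bottom_floor, current_floor))
    return [current_floor]  # not ascending or descending
```

Re-running this function each time the elevator passes a floor realizes the real-time update of the object range described above.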
A fourth embodiment of setting of the object range will be described with reference to
In the fourth embodiment, setting of the object range in a case where the user gets on a train and moves will be described. For example, when the motion recognition processing unit 240 recognizes that the user is on a train T and is moving, the object range determining unit 250 sets an object range 510 of a shape of a circle of a radius R5 around a next stop S1 in a route, as illustrated at a left side of
The example of the case in which the movement direction of the train T can be specified has been described. However, when the movement direction of the train T cannot be specified, the object range determining unit 250 may set an object range 520 of a circular shape of the radius R5 around the stops S2 and S3 on both sides adjacent to the next stop S1, as illustrated at a middle side of
However, when the speed is constantly fast, the train is more likely to be a Shinkansen bullet train, a limited express train, or an express train; a station where such a train stops may be positioned in a relatively big city, and the range of peripheral information which the user desires to obtain may be widened. Therefore, in this case, the object range determining unit 250 may set a large radius for an object range 530 based on a next stop S4, as illustrated at a right side of
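This speed-dependent choice of radius around the next stop can be sketched as follows; the speed threshold and both radii are assumed values, not taken from the disclosure:

```python
def train_stop_range(next_stop, speed_kmh, local_radius_m=500.0,
                     express_radius_m=2000.0, express_speed_kmh=90.0):
    """Circular object range centered on the next stop of the train.

    A constantly fast train is treated as an express/bullet train and
    given a larger radius, since its stops tend to be bigger cities.
    """
    fast = speed_kmh > express_speed_kmh
    return {"center": next_stop,
            "radius_m": express_radius_m if fast else local_radius_m}
```

When the movement direction cannot be specified, the same function could simply be applied to each adjacent stop in turn.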
The setting of the object range according to the fourth embodiment described above is effective when it is known that the train T the user is on stops at a specific stop, and the user can appropriately perceive peripheral information of the stop.
The train has been described as the example. However, the present disclosure is not limited thereto. Any vehicle that stops at a specific place may be applied. For example, as illustrated in
A fifth embodiment of setting of the object range will be described with reference to
In the fifth embodiment, setting of the object range when the user gets on a car will be described. For example, when the motion recognition processing unit 240 recognizes that the user is on a car C and is moving, the object range determining unit 250 sets an object range 550 having a long rectangular shape toward the front side of a movement direction of the car C, as illustrated at the left side of
In this case, a method of setting the object range 550 is the same as the method of setting the object range 430 in
A rectangular object range has been described above. However, when the movement direction of the car cannot be specified, the object range determining unit 250 may set a square object range. As illustrated at the right side of
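The long rectangle extending ahead of the car can be sketched as four corner points computed from the current position and heading; the dimensions and names below are assumptions:

```python
import math

def car_object_range(P, heading_rad, length_m=1000.0, width_m=200.0):
    """Corners of a long rectangular object range extending ahead of a car.

    The rectangle starts at the current position P and stretches
    length_m toward the movement direction; dimensions are assumed.
    """
    ux, uy = math.cos(heading_rad), math.sin(heading_rad)  # forward unit vector
    nx, ny = -uy, ux                                       # left normal
    half = width_m / 2.0
    rear_l = (P[0] + nx * half, P[1] + ny * half)
    rear_r = (P[0] - nx * half, P[1] - ny * half)
    front_l = (rear_l[0] + ux * length_m, rear_l[1] + uy * length_m)
    front_r = (rear_r[0] + ux * length_m, rear_r[1] + uy * length_m)
    return [rear_l, front_l, front_r, rear_r]
```

When the heading is unknown, a square centered on P (the alternative described above) avoids favoring any direction.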
The setting of the object range according to the fifth embodiment described above is effective when it is known that the car the user is on moves only in a specific place, and the user can appropriately perceive peripheral information of a specific spot.
The car has been described as the example. However, the present disclosure is not limited thereto. For example, as illustrated in
The setting of the object range when the motion recognition processing unit 240 recognizes that the user is moving or is stopped (such motion recognition is called low-level motion recognition to simplify the description) has been described above. Hereinafter, confinement of a selection object within a set object range, performed when the motion recognition processing unit 240 executes high-level motion recognition, will be described.
For example, when it is recognized that the user is eating by the high-level motion recognition by the motion recognition processing unit 240, the peripheral information selecting unit 270 selects the meal-related facilities in the object range 610 and emphasizes and displays the meal-related facilities, as illustrated at a left side of
As such, peripheral information in which the intention of the user is reflected can be provided by selecting only the peripheral information associated with the types of motions of the user in the object range. That is, because a large amount of peripheral information may be included in the object range set by the low-level motion recognition, excessive information can be prevented from being provided to the user, by confining the peripheral information by the high-level motion recognition.
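The confinement by high-level motion recognition then reduces to filtering the peripheral information in the object range through a motion-to-category mapping; the data model below is hypothetical, as the disclosure does not define one:

```python
# Hypothetical mapping from high-level motions to facility categories.
MOTION_CATEGORY = {"eating": "restaurant", "shopping": "shop"}

def confine_peripheral_info(poi_list, motion):
    """Keep only the peripheral information matched with the recognized motion."""
    category = MOTION_CATEGORY.get(motion)
    return [poi for poi in poi_list if poi["category"] == category]
```

An unrecognized motion yields an empty selection here; a real system might instead fall back to showing all peripheral information in the object range.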
An operation of the information processing apparatus 100 when an object range is set and a selection object is confined will be described with reference to
This processing is realized by executing a program stored in a ROM by a CPU of the control unit 160. The executed program may be stored in recording media such as a compact disk (CD), a digital versatile disk (DVD), and a memory card and may be downloaded from a server through the Internet.
First, the motion recognition processing unit 240 executes low-level motion recognition processing (step S102). For example, the motion recognition processing unit 240 recognizes whether the user is moving or is stopped, on the basis of the position data input from the position data acquiring unit 210 and the operation data input from the operation data acquiring unit 220.
Next, the motion recognition processing unit 240 determines whether the motion has changed to the given motion (step S104). When it is determined in step S104 that the motion has changed to the given motion (Yes), the object range determining unit 250 recalculates an object range (step S106). That is, the object range determining unit 250 changes the magnitude of a radius of the object range.
Next, the peripheral information acquiring unit 260 acquires peripheral information (step S108). The peripheral information acquiring unit 260 acquires data regarding the peripheral information from an external apparatus (server), through the communication unit 130.
When it is determined in step S104 that the motion has not changed to the given motion (No), the object range determining unit 250 maintains the previously set object range. Then, processing of step S108 is executed.
After the processing of step S108, the motion recognition processing unit 240 executes high-level motion recognition processing (step S110). For example, the motion recognition processing unit 240 recognizes the detailed motion of the user such as whether the user is eating or shopping.
Next, the peripheral information selecting unit 270 confines the peripheral information (step S112). The peripheral information selecting unit 270 selects the peripheral information corresponding to the motion identified by the high-level motion recognition processing, from the set object range. For example, when the user is shopping, the peripheral information selecting unit 270 selects the shopping-related facilities from the object range.
The peripheral information providing unit 280 displays the peripheral information confined by the peripheral information selecting unit 270 on the map displayed on the screen. Thereby, the user can separately perceive the peripheral information in which the motion of the user is reflected in the object range.
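The overall flow of steps S102 through S112 can be sketched as a sequence of placeholder callbacks; only the control flow follows the flowchart, and every callback stands in for the corresponding unit of the apparatus:

```python
def update_peripheral_display(recognize_low, recognize_high,
                              recalc_range, acquire_info, confine, display,
                              previous_range, previous_motion):
    """Control-flow sketch of steps S102-S112 of the flowchart."""
    motion = recognize_low()                 # step S102: low-level recognition
    if motion != previous_motion:            # step S104: motion changed?
        object_range = recalc_range(motion)  # step S106: recalculate range
    else:
        object_range = previous_range        # keep previously set range
    info = acquire_info(object_range)        # step S108: acquire peripheral info
    detail = recognize_high()                # step S110: high-level recognition
    selected = confine(info, detail)         # step S112: confine selection
    display(selected)                        # show on map
    return object_range, motion
```

Calling this function periodically while the user moves realizes the automatic update and display switching described above.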
As described above, the information processing apparatus 100 acquires the motion information of the user and acquires the current position information. In addition, the information processing apparatus 100 sets the object range for selecting the selection object on the map, according to the acquired motion information and the acquired current position information.
According to this configuration, the object range is automatically set on the basis of the motion information and the current position information of the user. For this reason, even though the user does not perform the operation for setting the selection range, a useful selection range is set that corresponds to the context in which the user is placed (information such as the position information and the motion information, in which the intention of the user is reflected).
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
The steps illustrated in the flowchart according to the embodiments may include processing executed sequentially in the described order as well as processing executed in parallel or individually. Even for the steps processed sequentially, the order may be appropriately changed in some cases.
Additionally, the following configurations are also within the technical scope of the present disclosure.
In an information processing apparatus embodiment, the apparatus includes
According to one aspect of the embodiment,
According to another aspect of the embodiment, the apparatus further includes
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment, the apparatus further includes
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to an information processing method embodiment, the method includes
According to one aspect of the embodiment, the method further includes
According to another aspect of the embodiment, the method further includes
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment,
According to another aspect of the embodiment, the method further includes
According to a non-transitory computer readable storage medium embodiment, the storage medium has computer readable instructions stored therein that when executed by a processing circuit perform a method, the method includes
Number | Date | Country | Kind |
---|---|---|---|
2011-223848 | Oct 2011 | JP | national |
This application is a continuation of U.S. patent application Ser. No. 13/591,276, filed Aug. 22, 2012, which contains subject matter related to that disclosed in Japanese Priority Patent Application JP 2011-223848 filed in the Japan Patent Office on Oct. 11, 2011. The entire contents of these applications are hereby incorporated by reference.
Number | Date | Country | |
---|---|---|---|
Parent | 13591276 | Aug 2012 | US |
Child | 15287482 | US |