This application is based upon and claims the benefit of priority from Japanese Patent Application No. 2013-177877, filed Aug. 29, 2013, the entire contents of which are incorporated herein by reference.
Embodiments described herein relate generally to a technique for controlling display and operation on a touch panel.
There is a computer that adopts, as an input device, a multi-touch panel that detects a plurality of touches. There is also a tabletop computer that adopts, as a tabletop, such a touch panel increased in size. The tabletop computer allows simultaneous operation by a large number of people and enables the people to hold meetings and presentations.
A user brings a fingertip or a nib into contact with an image region displayed on the touch panel and slides the fingertip or the nib. The image moves according to the operation. The user can perform rotation, enlargement, reduction, and the like of the image by bringing a plurality of fingers or nibs into contact with an image and performing predetermined gesture operation.
In such a tabletop computer, the number of people who can stand in a regular position with respect to a displayed image (a position from which the vertical orientation of an image, characters, and the like is recognized as the correct direction) is limited. In a small meeting of about two people, users often sit facing each other across the tabletop computer. When the users hold a meeting in this facing state, if the image display of the tabletop computer is set in the regular direction for one user, the image display is vertically reversed for the other user. A user not present in the regular direction with respect to the image suffers poor visibility and poor operability.
The embodiments have been devised to solve the above problems, and it is an object of the embodiments to provide a technique for preventing the visibility of an image from deteriorating irrespective of the direction from which the image is viewed.
An information processing apparatus according to an embodiment includes a touch panel display and a control section. The touch panel display includes a polarizing filter. The control section controls display of the touch panel display such that display content visually recognized via the polarizing filter is displayed in a regular direction in each of positions of a plurality of users present around side surfaces of the touch panel display when a display operation surface of the touch panel display is directed in an upward direction.
The information processing apparatus (a computer) in this embodiment displays, for each user, an image oriented as if viewed from directly in front, according to the position where the user is present. The information processing apparatus likewise reflects operation performed on the displayed image according to the standing position of the operating user.
A form of a first embodiment is explained below with reference to the drawings.
In the touch panel display 50, a multi-touch sensor (an input section), which simultaneously detects a plurality of contact positions, is stacked and arranged on a panel-type display section. An image on a screen can be controlled by a fingertip or a nib. The touch panel display 50 enables display of various content images. The touch panel display 50 also plays a role of a user interface for an operation input.
On a surface layer of an operation surface of the touch panel display 50, a lenticular lens 51 is arranged.
The processor 10 is an arithmetic processing unit such as a CPU (Central Processing Unit). The processor 10 loads computer programs stored in the ROM 30, the HDD 40, and the like into the DRAM 20 and arithmetically executes them to perform various kinds of processing according to the computer programs. The DRAM 20 is a volatile main storage device. The ROM 30 is a nonvolatile storage device that permanently stores computer programs. A BIOS (Basic Input Output System) and the like used during a system start are stored in the ROM 30. The HDD 40 is a nonvolatile auxiliary storage device capable of permanently storing computer programs. The HDD 40 stores data and computer programs to be used by a user.
The touch panel display 50 includes an input section of a touch panel and a display section of a flat panel. The touch panel is adapted to multi-touch for detecting a plurality of simultaneous contacts. The touch panel can obtain coordinate values (an x value and a y value) corresponding to a contact position. The flat panel includes light-emitting elements for display over an entire panel surface. The touch panel display 50 includes the lenticular lens 51 on an upper layer thereof.
The network I/F 60 is a unit that performs communication with an external apparatus. The network I/F 60 includes a LAN (Local Area Network) board. The network I/F 60 includes a device conforming to a short-range radio communication standard and a connector conforming to a USB (Universal Serial Bus) standard.
The sensor unit 70 includes sensors 70A to 70D explained below. The sensor unit 70 is a unit that detects an ID (Identification) card owned by the user and reads information stored in the ID card. The read information is used for login authentication and the like for the tabletop information processing apparatus 100. The ID card is a noncontact IC card. At least identification information of the user is stored in the ID card.
The camera 80 is located above the touch panel display 50 and arranged to set a downward direction as an image pickup direction. The camera 80 picks up an image of the entire surface of the touch panel display 50. The arrangement of the camera 80 is explained below.
The tabletop information processing apparatus 100 displays a screen for holding a meeting or the like to the users who have finished the authentication. The users perform document editing, browsing of materials and Web pages, and the like on the screen. Movement, enlargement, reduction, rotation, selection, deletion, and the like of these displayed objects (a displayed image and the aggregate of data tied to the image are referred to as an object) can be performed according to predetermined operation by the users using a publicly-known technology.
In
It is explained below how the touch panel display 50 is seen when visually recognized via the lenticular lens 51. A control method of the touch panel display 50 is also explained.
The users can view only the black-shaded lines B when visually recognizing the touch panel display 50 from one direction (e.g., a solid line arrow shown in
The processor 10 controls the display of the touch panel display 50 such that displayed content visually recognized via the lenticular lens 51 is displayed in the regular direction for each of the user A and the user B.
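The line-interleaved display control described above can be sketched as follows. This is a minimal illustration, assuming that even panel rows carry the lines B (the frame for the user B) and odd rows carry the lines A (the frame for the user A, already rotated 180 degrees); the actual stripe pitch and row assignment of the lenticular lens 51 are not specified in the embodiment.

```python
def compose_frame(rows_b, rows_a):
    """Interleave two equal-sized frames (lists of pixel rows) line by line.

    rows_b: frame shown in the regular direction (user B).
    rows_a: frame for the facing user A, already rotated 180 degrees.
    Even rows become the lines B, odd rows the lines A.
    """
    assert len(rows_b) == len(rows_a)
    return [rows_b[i] if i % 2 == 0 else rows_a[i]
            for i in range(len(rows_b))]
```

Under this assumption, each user's eye position selects only the rows aimed at that user through the lens, so each user perceives a complete frame in the regular direction.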
If a reference point and the directions of the x axis and the y axis of the touch panel display 50 are as shown in
The lines B are displayed in the regular direction when displayed as they are. However, the lines A need to be subjected to coordinate conversion before being displayed. The coordinate conversion is explained with reference to
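The coordinate conversion for the lines A can be sketched as a 180-degree rotation about the panel center: with the reference point at the origin, a coordinate is vertically and horizontally reversed. The panel width `w` and height `h` are hypothetical parameters for illustration.

```python
def convert_for_facing_user(x, y, w, h):
    """Map a panel coordinate to its vertically reversed position.

    Displaying an object at (w - x, h - y) instead of (x, y) makes it
    appear in the regular direction for the facing user; applying the
    conversion twice returns the original coordinate.
    """
    return w - x, h - y
```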
Problems that occur when the display control is performed such that the objects A and B are displayed in the regular direction for each of the users, but only the display is controlled, are explained below. As an example, a situation in which the user A is about to move the object B is shown in
A reason for the above is explained with reference to
Actually, the substantial object A is present in the position that the user A touches believing it to be the object B (a broken line rectangle). Therefore, if the user A moves the touch position, the substantial object A moves following the touch position. Consequently, the user B, who visually recognizes the substantial object A, sees the object A move rather than the object B. In this case, according to the display conversion, the unsubstantial object A moves whenever the substantial object A moves. Consequently, the user A also sees the object A move. Therefore, even though the user A performs operation for moving the object B, both the user A and the user B see the object A move. In the case illustrated above, the substantial object A happens to be present in the position that the user A touches believing it to be the object B. However, if no object is present in that position, neither the object A nor the object B moves.
In this embodiment, to solve this problem, not only the display conversion but also conversion of a touch position (a contact position) is performed. An example of a conversion method for the touch position is explained with reference to
If the user B touches a substantial object, the user B touches the object directly. Therefore, both the display conversion and the coordinate conversion of the touch position are unnecessary. Accordingly, the processor 10 needs to determine which of the users present at the four sides of the table of the tabletop information processing apparatus 100 touches the object, and to control, according to a result of the determination, whether the touch position is to be converted. To determine at which position of the table the operating user is present, in this embodiment, an image of the touch panel display 50 is picked up using the camera 80. This configuration example is shown in
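The decision of whether to convert the touch position can be sketched as follows. The side names and the restriction to the two facing users are assumptions for illustration; 90-degree and 270-degree analogues for the users C and D would follow the same pattern.

```python
def convert_touch(x, y, w, h, side):
    """Convert a detected contact coordinate according to the arm's entry side.

    side == "regular": the user (e.g. user B) touches substantial objects
    directly, so no conversion is applied (ACT 008 in the flowchart).
    side == "facing": the contact point is vertically reversed so that it
    lands on the substantial object (ACT 009).
    """
    if side == "regular":
        return x, y
    if side == "facing":
        return w - x, h - y
    raise ValueError("unhandled side: " + side)
```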
The camera 80 picks up an image of the touch panel display 50 on a real-time basis. If the processor 10 detects contact on the touch panel display 50, the processor 10 acquires a picked-up image at the time of detection of the contact from the camera 80. The processor 10 specifies the touch position in the acquired picked-up image. The processor 10 determines, using conventional image processing such as edge detection processing, from which of the four sides of the table an edge line extends to the touch position. Consequently, the processor 10 can specify from which of the four sides of the table the arm enters. Depending on a state of the picked-up image, the processor 10 may detect entering of a plurality of arms. However, since the processor 10 can specify the touch position, by specifying, out of the plurality of arms (edge lines), the edge line extending from the detected touch position (or the vicinity of the touch position), the processor 10 can specify from which of the four sides of the table the arm that makes the contact enters.
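As an illustrative stand-in for the edge-line tracing (not the embodiment's actual image processing), the following sketch follows the arm region in a binary foreground mask from the touch position and reports which border of the picked-up image it reaches. The mask is assumed to be extracted beforehand, e.g. by conventional edge detection, and the border names are hypothetical labels for the four sides of the table.

```python
from collections import deque

def arm_entry_side(mask, touch):
    """Trace the arm region from the touch position to a picture border.

    mask : 2D list of 0/1 values, 1 marking arm (foreground) pixels.
    touch: (row, col) of the detected contact position, assumed on the arm.
    Returns the first border reached, i.e. the table side the arm enters from.
    """
    rows, cols = len(mask), len(mask[0])
    seen, queue = {touch}, deque([touch])
    while queue:
        r, c = queue.popleft()
        if r == rows - 1:
            return "near"   # bottom border of the picked-up image
        if r == 0:
            return "far"
        if c == 0:
            return "left"
        if c == cols - 1:
            return "right"
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and mask[nr][nc] and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return "unknown"        # arm region does not reach any border
```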
In the example explained above, the method of vertically reversing the coordinate value for the user B, who is present in the position where the objects can be regularly visually recognized, and converting the coordinate value such that the user A can also regularly visually recognize the objects is explained. However, it is also possible to cause users C and D shown in
The processor 10 determines whether a display switching mode is ON (ACT 001). The processing stays on standby until the display switching mode is turned on (ACT 001, a loop of No). The switching of the mode is performed by pressing a predetermined button displayed on the touch panel display 50. If the mode is turned on (ACT 001, Yes), the processor 10 specifies a position of a user (ACT 002). The user position is specified according to, for example, which of the sensors 70A to 70D detects an ID card owned by the user.
The processor 10 switches, according to the method explained above and using the coordinate conversion formula explained above, display of objects to be displayed in the regular direction respectively in specified directions (ACT 003).
If the touch panel display 50 detects contact (a touch) (ACT 004), the processor 10 acquires a current picked-up image from the camera 80 (ACT 005) and acquires a coordinate value of the position where the contact is performed on the touch panel display 50 (ACT 006). The processor 10 determines, on the basis of the picked-up image and the contact coordinate value, whether an arm enters from a regular position (ACT 007). The regular position is a position from which the vertical direction can be correctly visually recognized even in the normal display state before the switching of the mode. If the arm enters from the regular position (ACT 007, Yes), the processor 10 does not convert the touch position (ACT 008). On the other hand, if the arm does not enter from the regular position (ACT 007, No), the processor 10 further determines from which direction the arm enters, and carries out conversion of the touch position according to the direction (ACT 009).
The processor 10 determines whether a substantial object is present at the coordinate after the conversion (or, if the conversion is unnecessary, at the touch position) (ACT 010). If a substantial object is present (ACT 010, Yes), the processor 10 renders the substantial object again so that it moves to the contact position, and also renders the corresponding unsubstantial object again (ACT 011). If the substantial object is absent (ACT 010, No), the processor 10 proceeds to ACT 012.
ACT 006 to ACT 011 are repeatedly performed until the user releases the fingertip or the nib, that is, until the contact with the touch panel display 50 is released (ACT 012, a loop of No). According to this repeated processing, if the user moves the fingertip or the nib, the objects also move following the moving position of the fingertip or the nib. The processor 10 repeatedly executes ACT 004 to ACT 012 until the mode is turned off (ACT 013, a loop of No). If the mode is turned off, the processing ends.
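The core of ACT 005 to ACT 011 can be compressed into a single function as a sketch. The object representation, the hit radius of 10, and the side labels are assumptions for illustration, not the embodiment's data model.

```python
def handle_touch(touch, entry_side, objects, w, h):
    """One pass of the touch-handling steps of the flowchart.

    touch     : (x, y) contact coordinate on the panel (ACT 006).
    entry_side: result of the arm-entry determination (ACT 007).
    objects   : dict mapping object name -> (x, y) of the substantial
                object's center; mutated in place when an object moves.
    Returns the name of the moved object, or None if nothing was hit.
    """
    x, y = touch
    if entry_side != "regular":        # ACT 007 No -> ACT 009: convert
        x, y = w - x, h - y            # facing user: vertical reversal
    for name, (ox, oy) in objects.items():
        # ACT 010: is a substantial object present at the coordinate?
        if abs(ox - x) < 10 and abs(oy - y) < 10:
            objects[name] = (x, y)     # ACT 011: move substantial object
            return name
    return None                        # ACT 010 No: nothing moves
```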
In the first embodiment, the objects are displayed in the regular direction with respect to the users. The display positions of the objects are controlled to coincide with each other with respect to the users. In a second embodiment, only the directions of the objects are controlled to be the regular direction with respect to the users.
How the objects are seen from the user A facing the user B in a state of such arrangement is shown in
In the second embodiment, the processor 10 controls the display of the lines A (see
In a form of the second embodiment, the coordinate values of the display positions of the objects do not change. Therefore, if operation for moving the objects is performed, the conversion of the touch position explained in the first embodiment is unnecessary. Accordingly, the processing for detecting which of the users touches an object, the mechanism for the detection, and the like are also unnecessary. For the user C and the user D, who are not present in the facing positions, each of the objects is rotated by 90 degrees or 270 degrees and displayed. Therefore, the objects are displayed in the regular direction for these users as well.
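The display rule of the second embodiment can be sketched as follows: the display position is left unchanged and only the object's glyph is rotated toward the viewing user. The angle table and the 2D-list glyph representation are assumptions for illustration.

```python
# Hypothetical rotation angles per viewing side: the facing user gets a
# 180-degree rotation; users on the remaining two sides get 90 or 270.
ROTATION_FOR_SIDE = {"regular": 0, "facing": 180, "left": 90, "right": 270}

def rotate_cells_180(cells):
    """Rotate a small 2D glyph (list of rows) by 180 degrees.

    The glyph keeps its display position; only its orientation changes,
    so no touch-position conversion is needed.
    """
    return [row[::-1] for row in cells[::-1]]
```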
The tabletop information processing apparatus 100 may switch and carry out the form of the first embodiment and the form of the second embodiment according to switching of a mode.
In the embodiments, the movement of the objects is mentioned. However, the embodiments can also be applied to rotation, enlargement, and reduction of the objects.
As in the embodiments, by setting the vertical direction of the display to the regular direction, not only the display but also character input can be performed in the regular direction. For example, in an input of the number of print copies of an object, if a user in the opposite direction inputs by hand, for example, "print 16 copies", it is possible to prevent a situation in which the input is read upside down and 91 copies are printed by mistake.
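The misreading avoided here can be illustrated with a hypothetical digit-inversion table: a digit string viewed upside down is its reverse, with each digit replaced by its 180-degree counterpart (only digits that remain digit-like when inverted are mapped).

```python
# Digits that still look like digits when rotated 180 degrees.
UPSIDE_DOWN = {"0": "0", "1": "1", "6": "9", "8": "8", "9": "6"}

def read_upside_down(digits):
    """How a digit string appears to a user facing the writer."""
    return "".join(UPSIDE_DOWN[d] for d in reversed(digits))
```

Under this illustration, "16" written by one user reads as "91" to the facing user, which is exactly the mistake the regular-direction input prevents.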
In the embodiments, forms of the tabletop information processing apparatus are explained. However, the embodiments are not limited to the tabletop form and only have to be applied to a computer including a touch panel display, such as a tablet computer.
In the embodiments, an implementation example is explained in which the camera is set above the touch panel display and the position of the touching user is specified using the camera. Besides, various implementations are conceivable such as a method of setting the users as image pickup targets and detecting motions of the users and a configuration including a human body communication function. In the case of the human body communication function, the human body communication function is imparted to an ID card owned by a user, a chair on which the user is seated, and the like. If a fingertip of the user comes into contact with the touch panel display 50, identification information of the ID card owned by the user can be acquired using the human body as a transmission medium. The processor 10 specifies a position of the touching user on the basis of the identification information and information detected by the sensors 70A to 70D. The ID card may be hung from a neck or may be stored in a pocket. Naturally, the tabletop information processing apparatus 100 needs to include a unit that enables the human body communication.
A control section is equivalent to a component including at least the processor 10, the DRAM 20, and a communication bus 90 in the embodiment. A computer program operating in cooperation with the respective kinds of hardware such as the processor 10, the DRAM 20, and the communication bus 90 is stored in the HDD 40 (or the ROM 30) beforehand, loaded to the DRAM 20 by the processor 10, and arithmetically executed. A detecting section is equivalent to the sensor unit 70. A polarizing filter is equivalent to the lenticular lens 51.
A computer program for causing a computer to execute the functions explained in the embodiments may be provided. The computer program may be referred to by any name such as display control program, user interface program, device control program, and the like.
In the embodiments, the function for carrying out the invention is recorded in the apparatus in advance. However, the same function may be downloaded to the apparatus from a network, or the same function stored in a recording medium may be installed in the apparatus. The recording medium may take any form as long as it can store a computer program and is readable by the apparatus, such as a CD-ROM. The function obtained by the installation or the download in advance may be realized in cooperation with an OS (operating system) or the like in the apparatus.
As explained above in detail, irrespective of the direction in which a user is present, it is possible to perform display in the regular direction and prevent the visibility for the user from deteriorating.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the inventions. Indeed, the novel embodiments described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions, and changes in the form of the embodiments described herein may be made without departing from the spirit of the inventions. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the inventions.
Number | Date | Country | Kind |
---|---|---|---|
2013-177877 | Aug 2013 | JP | national |