Embodiments of the present disclosure relate to a handheld terminal.
With the development of electronic technology and wireless communication technology, the mobile phone has evolved from a single-function terminal, capable only of phone calls and short messages, into a multi-function terminal with further capabilities for Internet surfing, office work and entertainment. At present, mobile phone users can download various apps from an app store and install them. These apps cover various aspects of life, thereby greatly enriching the functions of a mobile phone. In addition to mobile phones, various other handheld terminals, such as tablet PCs and personal digital assistants, are also playing an increasingly important role in people's lives. Therefore, there is a demand to constantly enrich the functions of a handheld terminal.
The present disclosure provides a handheld terminal having a novel construction, thereby enriching the functions of the handheld terminal.
According to a first aspect of the present disclosure, a handheld terminal is provided which enables a user to see the opposite side through a portion of the handheld terminal. The handheld terminal includes: a first transparent display unit disposed on one of a front side and a back side of the handheld terminal; a second transparent display unit disposed on the other of the front side and the back side, or first and second gesture detection units disposed on the front side and the back side respectively; a drive module including a first display drive module for the first transparent display unit and one of the following two constituent parts: a second display drive module for the second transparent display unit, or first and second gesture detection drive modules for the first and second gesture detection units respectively; and a processing module configured to cause an image to be displayed on the first transparent display unit, and to perform one of the following two operations: causing the image or another image different from the image to be displayed on the second transparent display unit, or processing a gesture input detected via at least one of the first and second gesture detection units.
According to the above configuration, the handheld terminal can have a double-side display function or a double-side manipulation function, thereby improving the usability of the handheld terminal.
Optionally, according to a second aspect of the present disclosure, the handheld terminal includes the second transparent display unit and the first and second gesture detection units, the drive module further includes the other of the two constituent parts, and the processing module is further configured to perform the other of the two operations.
According to the above configuration, the handheld terminal can have both the double-side display function and the double-side manipulation function, thereby improving the usability of the handheld terminal.
Optionally, according to a third aspect of the present disclosure, the first and second transparent display units include first and second OLED light emitting layers respectively. The first and second transparent display units share one substrate, or include first and second substrates respectively, the first and second substrates being combined together.
Optionally, according to a fourth aspect of the present disclosure, the first and second transparent display units each include an OLED display panel, an LCD display panel, or a projection type display panel.
Optionally, according to a fifth aspect of the present disclosure, the first and second gesture detection units each include a capacitive touch panel, a resistive touch panel, an infrared touch panel, or a somatosensory unit.
Optionally, according to a sixth aspect of the present disclosure, the processing module is further configured to cause an image to be displayed on a corresponding transparent display unit according to one of a front display mode, a back display mode, and a double-side display mode.
According to the above configuration, the handheld terminal can flexibly display an image on either of the two transparent display units, thereby improving the usability of the handheld terminal.
Optionally, according to a seventh aspect of the present disclosure, the processing module is further configured to, according to a gesture input detected via at least one of the first and second gesture detection units, cause an appearance of an object in a to-be-displayed image to change correspondingly.
According to the above configuration, the handheld terminal enables a user to manipulate the object in the displayed image with different fingers on the front side and the back side of the handheld terminal simultaneously, to change the appearance of the object, thereby enhancing the interactivity between the handheld terminal and the user and expanding the use of the handheld terminal in, for example, entertainment games.
Optionally, according to an eighth aspect of the present disclosure, the handheld terminal further includes a sensor module for sensing information related to a rotation of the handheld terminal relative to a horizontal direction. The processing module is further configured to rotate a to-be-displayed image according to the information such that the rotated image does not rotate relative to the horizontal direction, or cause a motion state of an object in a to-be-displayed image to change correspondingly according to the information.
According to the above configuration, the handheld terminal can exhibit the same image display effect no matter how the handheld terminal rotates, or the user can interact with the object in the image displayed on the handheld terminal by rotating the handheld terminal, thereby improving the usability of the handheld terminal.
Optionally, according to a ninth aspect of the disclosure, the sensor module includes an acceleration sensor, or includes an acceleration sensor and a gyroscope.
Optionally, according to a tenth aspect of the present disclosure, the handheld terminal further includes a camera module for viewing and photographing an object, and a communication module for performing wireless communication. The camera module is configured to, under the control of the processing module, take an image of a front object when the handheld terminal is facing the object. The communication module is configured to, under the control of the processing module, transmit data which is related to the image of the object taken by the camera module to a server having an augmented reality function, and to receive related information of the object from the server. The processing module is configured to cause the related information to be displayed on a corresponding transparent display unit.
According to the above configuration, the handheld terminal enables a user to see the opposite side through a portion of the handheld terminal. The portion can be directly used as a view finder to observe the front object, and the related information of the object can be directly superimposed on the corresponding transparent display unit, such that the handheld terminal has a novel augmented reality (AR) function.
Optionally, according to an eleventh aspect of the present disclosure, the processing module includes a graphics processor for generating the image, and an image processing module configured to cause the image to be displayed on the corresponding transparent display unit, according to one of the front display mode, the back display mode, and the double-side display mode.
Optionally, according to a twelfth aspect of the present disclosure, the processing module includes a graphics processor for generating the image, and an image processing module configured to, according to the gesture input detected via at least one of the first and second gesture detection units, cause the appearance of the object in the to-be-displayed image to change correspondingly.
Optionally, according to a thirteenth aspect of the present disclosure, the processing module includes a graphics processor for generating the image, and an image processing module configured to rotate the to-be-displayed image according to the information such that the rotated image does not rotate relative to the horizontal direction, or cause the motion state of the object in the to-be-displayed image to change correspondingly according to the information.
Optionally, according to a fourteenth aspect of the present disclosure, the processing module includes a central processor configured to control the camera module and the communication module, and to cause the related information to be displayed on the corresponding transparent display unit.
In order to more clearly illustrate the technical solution of embodiments of the present disclosure, the drawings are simply introduced below. Apparently, the schematic structure diagrams in the following drawings are not necessarily drawn to scale, but exhibit various features in a simplified form. Moreover, the drawings in the following description relate merely to some embodiments of the present disclosure, and are not intended to limit the present disclosure.
In order to make the technical solutions and advantages of embodiments of the present disclosure apparent, the technical solutions of embodiments of the present disclosure will be described clearly and completely hereinafter in conjunction with the drawings. Apparently, embodiments described herein are merely a part of but not all embodiments of the present disclosure. Based on embodiments of the present disclosure described herein, those skilled in the art can obtain other embodiments without any creative work, which should be within the scope of the present disclosure.
A handheld terminal of the present disclosure enables a user to see the opposite side through a portion of the handheld terminal, and includes: a first transparent display unit disposed on one of a front side and a back side of the handheld terminal; a second transparent display unit disposed on the other of the front side and the back side, or first and second gesture detection units disposed on the front side and the back side respectively; a drive module including a first display drive module for the first transparent display unit and one of the following two constituent parts: a second display drive module for the second transparent display unit, or first and second gesture detection drive modules for the first and second gesture detection units respectively; and a processing module configured to cause an image to be displayed on the first transparent display unit, and to perform one of the following two operations: causing the image or another image different from the image to be displayed on the second transparent display unit, or processing a gesture input detected via at least one of the first and second gesture detection units.
In the specification of the present disclosure, the handheld terminal includes, but is not limited to, a mobile phone, a tablet PC, a personal digital assistant (PDA), and the like. Moreover, for a handheld terminal having a physical keyboard, the front side of the handheld terminal may be a side provided with the physical keyboard. For a handheld terminal having no physical keyboard, the front side of the handheld terminal may be a side provided with a home key. However, the present disclosure is not limited thereto. The front side and the back side of a handheld terminal may also be arbitrarily set. Hereinafter, the handheld terminal of the present disclosure will be specifically described with reference to four embodiments.
Embodiment 1
1. Structure of Transparent Portion
However, Embodiment 1 of the present disclosure is not limited to the illustrated example.
In the illustrated example, the forward transparent display unit 102 includes a substrate 106 and a forward OLED light emitting layer 108 disposed on the substrate 106. The forward OLED light emitting layer 108 includes an anode 110, a hole transport layer 112, an organic light emitting layer 114, an electron transport layer 116, and a cathode 118.
The backward transparent display unit 102′ includes the substrate 106 and a backward OLED light emitting layer 108′ disposed on the substrate 106. That is, in this example, the forward and backward transparent display units 102, 102′ share one substrate 106. The backward OLED light emitting layer 108′ includes an anode 110′, a hole transport layer 112′, an organic light emitting layer 114′, an electron transport layer 116′, and a cathode 118′. These components may be identical to the corresponding components of the forward OLED light emitting layer 108, or may differ from them in terms of preparation material, and will not be detailed here again.
The method for manufacturing an assembly of the forward and backward transparent display units 102, 102′ includes the following steps. Firstly, the substrate 106 is prepared. Then, the forward and backward OLED light emitting layers 108, 108′ are formed on the front side and the back side of the substrate 106 respectively. The forward and backward OLED light emitting layers 108, 108′ may be formed simultaneously, or may also be formed one after another. This step may be implemented by using any existing techniques for forming a transparent OLED light emitting layer. For example, for a small molecule organic light emitting material, a vacuum thermal evaporation process may be employed. For a high molecule organic light emitting material, a spin coating or spray printing process may be employed.
In another example, the forward and backward transparent display units 102, 102′ include a first and a second substrate respectively, the forward and backward OLED light emitting layers 108, 108′ are formed on the first and second substrates respectively, and the first and second substrates are then combined together.
2. System Composition of Handheld Terminal
The forward display drive module 210 may drive the forward transparent display unit 202 to display an image, and the backward display drive module may likewise drive the backward transparent display unit 202′. The image to be displayed may come from the processing module 214. In the illustrated example, the forward and backward transparent display units 202, 202′ each include an OLED display panel, and the corresponding display drive modules may be implemented by using any existing techniques for driving an OLED display panel to display an image.
However, the disclosure is not limited to the example described above. As described above, the forward and backward transparent display units may each include an OLED display panel, an LCD display panel, or a projection type display panel, and may be the same type or different types of display panels. Accordingly, when at least one of the forward and backward transparent display units is an LCD display panel or a projection type display panel, its corresponding display drive module may be implemented by using any existing techniques for driving an LCD display panel or a projection type display panel to display an image.
The forward gesture detection drive module 212 may drive the forward gesture detection unit 204 to detect a gesture input by a user, and the backward gesture detection drive module may likewise drive the backward gesture detection unit 204′. In the illustrated example, the forward and backward gesture detection units 204, 204′ each include a touch panel, and the corresponding gesture detection drive modules may be implemented by using any existing techniques for driving a touch panel to detect a gesture input.
However, the disclosure is not limited to the example described above. As described above, the forward and backward gesture detection units may each include a capacitive touch panel, a resistive touch panel, an infrared touch panel, or a somatosensory unit, and they may be of the same type or different types. Accordingly, when at least one of the forward and backward gesture detection units is a somatosensory unit, its corresponding gesture detection drive module may be implemented by using any existing techniques for driving a somatosensory unit. For example, the corresponding gesture detection drive module may be a drive module for an optical sensor used in somatosensory techniques such as Sony's EyeToy and Microsoft's Kinect.
The processing module 214 may be configured to cause an image to be displayed on one of the forward and backward transparent display units 202, 202′, to cause the image or another image different from the image to be displayed on the other of the forward and backward transparent display units 202, 202′, and to process a gesture input detected via at least one of the forward and backward gesture detection units 204, 204′. The other functions of the processing module 214 will be further described in the following sections "Double-side display function", "Double-side manipulation function", "Rotation function" and "Augmented reality function". In addition, the processing module 214 may be located in the non-transparent portion of the handheld terminal.
As an example, the processing module 214 may include a graphics processor 216 for generating an image, and an image processing module 218. The graphics processor 216 may employ, for example, a commercially available graphics processor for a handheld terminal. The image processing module 218 may be configured to cause the image generated by the graphics processor 216 to be displayed on one of the forward and backward transparent display units 202, 202′, to cause the image, or another image generated by the graphics processor 216 and different from the image, to be displayed on the other of the forward and backward transparent display units 202, 202′, and to process a gesture input detected via at least one of the forward and backward gesture detection units 204, 204′. The image processing module 218 may be implemented as an application specific integrated circuit (ASIC), a field programmable gate array (FPGA), or the like. In this example, optionally, the processing module 214 may further include a central processor for controlling the optional modules mentioned below, e.g., the sensor module, the camera module, the communication module, or the like.
As another example, the processing module 214 may be a central processor configured to perform the functions of the graphics processor 216 and the image processing module 218. The central processor may employ, for example, a commercially available central processor for a handheld terminal.
As still another example, the processing module 214 may include a graphics processor 216 for generating an image, and a central processor. The central processor is configured to perform the functions of the image processing module 218, and may employ, for example, a commercially available central processor for a handheld terminal.
Optionally, the handheld terminal may further include a sensor module 220 for sensing information related to a rotation of the handheld terminal relative to a horizontal direction. The sensor module 220 may include an N-axis acceleration sensor (N is for example 2, 3, or 6). The acceleration sensor may employ, for example, a commercially available accelerometer chip such as accelerometer chips from Freescale, Invensense, or the like. Optionally, the sensor module 220 may also include an N-axis gyroscope (N is for example 3). The gyroscope may employ, for example, a commercially available gyroscope chip such as a gyroscope chip from STMicroelectronics, Freescale, or the like. The use of the sensor module 220 will be further described in the following “Rotation function” section. The sensor module 220 may be located in the non-transparent portion of the handheld terminal.
Optionally, the handheld terminal may also include a camera module 222 for viewing and photographing an object, and a communication module 224 for performing wireless communication. The camera module 222 may employ, for example, a commercially available camera module for a handheld terminal. The communication module 224 may employ, for example, a commercially available communication module for a mobile phone. The use of the camera module 222 and the communication module 224 will be further described in the following “Augmented reality function” section. The camera module 222 and the communication module 224 may be located in the non-transparent portion of the handheld terminal.
However, the system composition of Embodiment 1 of the present disclosure is not limited thereto. Optionally, depending on the requirement, the handheld terminal may also include other modules (e.g., audio processing modules, speakers, or the like) which are typically used for mobile phones, tablet PCs, or the like.
3. Double-Side Display Function
Optionally, the processing module 214 may be configured to cause an image to be displayed on a corresponding transparent display unit according to one of a front display mode, a back display mode, and a double-side display mode. As described hereinbefore, this function may be performed by the image processing module 218 or the central processor.
In one example, the three display modes may be set by a physical switch installed on the handheld terminal. The user can set a corresponding display mode by moving the physical switch to one of three positions. In another example, the three display modes may be set in a setup menu of the operating system of the handheld terminal. In still another example, both of the above setting approaches may be provided at the same time.
The above functions may be implemented by the processing module 214 (e.g., the image processing module 218 or the central processor) by performing the following process. Firstly, the current image display mode is detected. For example, the current image display mode may be represented by a mode flag stored in a memory of the handheld terminal. Then, the image data to be displayed is transmitted to a corresponding display drive module according to the current image display mode, such that the image is displayed on the corresponding transparent display unit.
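By way of non-limiting illustration only, the dispatch process described above may be sketched in Python as follows; the mode constants, the drive-module interface, and all names are assumptions introduced solely for this sketch and are not prescribed by the disclosure:

```python
# Illustrative sketch of the display-mode dispatch. FRONT_MODE/BACK_MODE/
# DOUBLE_SIDE_MODE and the .show() drive interface are assumed names.
FRONT_MODE, BACK_MODE, DOUBLE_SIDE_MODE = 0, 1, 2

def dispatch_image(image_data: bytes, mode_flag: int, front_drive, back_drive) -> None:
    """Route the to-be-displayed image data according to the current mode flag."""
    if mode_flag == FRONT_MODE:
        front_drive.show(image_data)
    elif mode_flag == BACK_MODE:
        back_drive.show(image_data)
    elif mode_flag == DOUBLE_SIDE_MODE:
        # Double-side display: the same image goes to both drive modules.
        front_drive.show(image_data)
        back_drive.show(image_data)
```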
Optionally, the double-side display mode may be subdivided into a double-side synchronous display mode and a double-side asynchronous display mode. In the double-side synchronous display mode, different users located respectively on the front side and the back side of the handheld terminal can see the same image displayed by the handheld terminal. In the double-side asynchronous display mode, different users located respectively on the front side and the back side of the handheld terminal can see different images displayed by the handheld terminal. For example, two different users located respectively on the front side and the back side can simultaneously touch the touch panel on the side corresponding thereto, to manipulate different apps installed on the handheld terminal.
In the double-side asynchronous display mode described above, the processing module 214 (e.g., the graphics processor 216 or the central processor) may generate an image for a corresponding transparent display unit according to gesture inputs detected via different gesture detection units, and add a corresponding flag to the image data of the image to identify the corresponding transparent display unit. Then, the processing module 214 (e.g., the image processing module 218 or the central processor) may transmit the image data of the to-be-displayed image to a corresponding display drive module according to the flag.
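A corresponding non-limiting sketch of the flag-based routing is given below; the FrameBuffer record standing in for the image data and its flag, and the drive-module interface, are hypothetical names introduced only for this sketch:

```python
# Illustrative sketch of flag-based routing in the double-side asynchronous
# display mode; FrameBuffer and .show() are assumed, not from the disclosure.
from dataclasses import dataclass

FRONT, BACK = "front", "back"

@dataclass
class FrameBuffer:
    pixels: bytes
    target: str  # flag identifying the transparent display unit to use

def route_frame(frame: FrameBuffer, front_drive, back_drive) -> None:
    """Transmit the image data to the display drive module named by its flag."""
    (front_drive if frame.target == FRONT else back_drive).show(frame.pixels)
```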
4. Double-Side Manipulation Function
Optionally, the processing module 214 may be configured to, according to a gesture input detected via at least one of the forward and backward gesture detection units 204, 204′, cause an appearance of an object in a to-be-displayed image to change correspondingly. As described hereinbefore, this function may be performed by the image processing module 218 or the central processor.
Similar to the double-side display function, the double-side manipulation function can also be divided into a front manipulation mode, a back manipulation mode and a double-side manipulation mode. A user can set a corresponding manipulation mode by a physical switch, or may also set a corresponding manipulation mode in a setup menu of the operating system of the handheld terminal.
The object in the to-be-displayed image may include, but is not limited to, an object in an image of a game app, an object in a user interface, or the like. The appearance of the object may include, but is not limited to, the shape of the object, its concave/convex state, the positions of some of its constituent elements, or the like. As an example, in a game app, the user can press a balloon displayed on the handheld terminal with different fingers on both sides of the handheld terminal simultaneously, such that a corresponding recess appears at each pressed position of the balloon, and the recess appearing on the back side of the balloon may be shown in a perspective view, thereby increasing the interactivity between the user and the game app and expanding the use of the handheld terminal in entertainment games. As another example, in a poker game app, the user may use different fingers to swipe a set of displayed playing cards in two opposite directions on both sides of the handheld terminal respectively, such that the set of playing cards is fanned out as in a real environment, thereby enhancing the realism and enjoyment of the game. As a further example, in a user interface for adjusting the volume of the handheld terminal, the user can move the pointer of a circular knob displayed on the handheld terminal with different fingers on both sides of the handheld terminal simultaneously, thereby adjusting the volume to a desired magnitude. The portions of the circular knob other than the pointer may be in a transparent state, so as to facilitate the user's manipulation from both sides simultaneously.
The functions described above may be implemented by the processing module 214 (e.g., the image processing module 218 or the central processor) by performing the following process. Firstly, the processing module 214 (e.g., the image processing module 218 or the central processor) may determine which preset input condition in the targeted application is met by the gesture input detected via at least one of the forward and backward gesture detection units 204, 204′. Then, the processing module 214 (e.g., the graphics processor 216 or the central processor) may generate a corresponding image specified in the application according to the determined input condition. For example, the application may prepare a corresponding image material in advance for each preset input condition. When a preset input condition is satisfied, its corresponding image material can be used to generate a corresponding image.
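By way of non-limiting illustration, the condition-matching step may be sketched as follows; the Gesture record, the predicate functions, and the material keys are illustrative assumptions rather than elements of the disclosure:

```python
# Illustrative sketch of matching detected gestures against an application's
# preset input conditions and selecting the prepared image material.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Gesture:
    side: str                  # "front" or "back" (assumed encoding)
    kind: str                  # e.g. "press", "swipe"
    position: tuple[int, int]  # panel coordinates

Condition = Callable[[Optional[Gesture], Optional[Gesture]], bool]

def select_material(front: Optional[Gesture], back: Optional[Gesture],
                    conditions: list[tuple[Condition, str]]) -> Optional[str]:
    """Return the image-material key of the first satisfied preset condition."""
    for predicate, material_key in conditions:
        if predicate(front, back):
            return material_key
    return None

# Example condition for the balloon game described above: simultaneous
# presses on the front side and the back side.
def both_press(front: Optional[Gesture], back: Optional[Gesture]) -> bool:
    return (front is not None and back is not None
            and front.kind == "press" and back.kind == "press")
```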
5. Rotation Function
Optionally, the processing module 214 may be configured to rotate a to-be-displayed image according to the information which is related to a rotation of the handheld terminal relative to a horizontal direction and is sensed by the sensor module 220, such that the rotated image does not rotate relative to the horizontal direction, or to cause a motion state of an object in a to-be-displayed image to change correspondingly according to the information.
5.1 Image Rotation Function
Corresponding to the double-side display function, the image rotation function may implement a front image rotation, a back image rotation, and a double-side image rotation.
The image rotation function may be implemented by the processing module 214 (e.g., the image processing module 218 or the central processor) by performing the following process. Firstly, the rotation angle and the rotation direction may be determined based on the information. When the sensor module 220 includes an N-axis acceleration sensor (N is for example 2, 3 or 6), the sensed information related to the rotation of the handheld terminal relative to the horizontal direction includes the accelerations of the handheld terminal in the N axes. The processing module 214 (e.g., the image processing module 218 or the central processor) may perform low pass filtering on the acceleration data of the N axes to obtain the components of the gravitational acceleration on the N axes, and calculate the rotation angle of the x-axis or y-axis of the display plane of the handheld terminal relative to the horizontal direction according to these components. For example, the algorithms for determining the rotation angle of the handheld terminal in the Android operating system may be employed. Optionally, when the sensor module 220 further includes an N-axis gyroscope (N is for example 3), the sensed information related to the rotation of the handheld terminal relative to the horizontal direction further includes the angular velocities of the handheld terminal about the N axes. The processing module 214 (e.g., the image processing module 218 or the central processor) may then calculate the rotation angle of the x-axis or y-axis of the display plane of the handheld terminal relative to the horizontal direction more accurately, by using any existing accelerometer/gyroscope fusion algorithm (e.g., the relevant algorithms in the Android or iOS operating systems). The step of determining the rotation angle may be performed periodically at a certain time interval. The processing module 214 (e.g., the image processing module 218 or the central processor) may determine the rotation direction by determining the change in the rotation angle between adjacent time intervals. However, the present disclosure is not limited thereto. The rotation direction may also be determined by using any existing algorithm for determining the rotation direction of the handheld terminal (e.g., the relevant algorithms in the Android or iOS operating systems).
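By way of non-limiting illustration, the angle and direction determination just described may be sketched as follows; the smoothing factor is a typical value chosen for the sketch, not a value fixed by the disclosure:

```python
# Illustrative sketch of rotation-angle estimation: a low-pass filter isolates
# gravity from the raw accelerations, and the angle of the display plane
# relative to the horizontal direction follows from the gravity components.
import math

ALPHA = 0.8  # low-pass smoothing factor (assumed typical value)

def lowpass(gravity: list[float], accel: list[float]) -> list[float]:
    """Update the gravity estimate from a new N-axis acceleration sample."""
    return [ALPHA * g + (1.0 - ALPHA) * a for g, a in zip(gravity, accel)]

def rotation_angle(gravity: list[float]) -> float:
    """Angle (degrees) of the display plane's x-axis relative to horizontal."""
    gx, gy = gravity[0], gravity[1]
    return math.degrees(math.atan2(gx, gy))

def rotation_direction(previous_angle: float, angle: float) -> int:
    """Sign of the angle change between adjacent sampling intervals."""
    return (angle > previous_angle) - (angle < previous_angle)
```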
Then, the processing module 214 (e.g., the image processing module 218 or the central processor) may rotate the to-be-displayed image according to the rotation angle and the rotation direction, such that the rotated image does not rotate relative to the horizontal direction. The to-be-displayed image includes, but is not limited to, a desktop of a handheld terminal, an interface of a web browser, an interface of a text browser, an interface of a media player, or the like. For example, the processing module 214 (e.g., the image processing module 218 or the central processor) may rotate the to-be-displayed image by an angle opposite to the rotation angle. This step may be implemented by using any existing image rotation algorithms. Likewise, this step may be performed periodically at a certain time interval. In this way, no matter how the handheld terminal rotates, the to-be-displayed image can be kept in real time in a state without rotation relative to the horizontal direction, thereby facilitating the user's viewing.
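A non-limiting sketch of this counter-rotation step is given below, using the Pillow library as one possible image rotation implementation; the sign convention of the sensed angle is an assumption and may need flipping on real hardware:

```python
# Illustrative sketch of the counter-rotation step; Pillow is one of many
# possible image rotation implementations, not one named by the disclosure.
from PIL import Image

def counter_rotate(frame: Image.Image, terminal_angle_deg: float) -> Image.Image:
    """Rotate the to-be-displayed image opposite to the terminal's rotation,
    so the displayed content stays fixed relative to the horizontal direction."""
    # Image.rotate() turns the image counterclockwise for positive angles;
    # the sign below assumes a clockwise-positive sensed angle.
    return frame.rotate(-terminal_angle_deg, expand=False)
```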
5.2 Rotation Manipulation Function
The rotation manipulation function may be implemented by the processing module 214 (e.g., the image processing module 218 or the central processor) by performing the following process. Firstly, the rotation angle and/or rotation direction may be determined based on the information. The rotation angle and/or rotation direction may be determined as described in section 5.1, and will not be detailed here again.
Then, the motion state of the object in the to-be-displayed image may be changed correspondingly according to the rotation angle and/or rotation direction. The object may include, but is not limited to, an object in an image of a game application, an object in a user interface, or the like. The change in the motion state may include, but is not limited to, a change from a static state to a moving state, a change in the moving direction/velocity, or the like. As an example, in a game app, the rolling direction/velocity of an object (e.g., a water drop or a ball) in a game image may be changed by rotating the handheld terminal.
For example, the processing module 214 (e.g., the image processing module 218 or the central processor) may determine which preset input condition in the targeted application is met by the rotation angle and/or rotation direction, and then generate a corresponding image specified in the application according to the determined input condition. For example, the application may prepare a corresponding image material in advance for each preset input condition. When a preset input condition is satisfied, its corresponding image material may be used to generate a corresponding image, such that the motion state of the object in the displayed image is changed.
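By way of non-limiting illustration, the mapping from the sensed tilt to the object's motion state may be sketched as follows, as in the rolling-ball example above; the threshold and gain are assumed tuning constants, not values given by the disclosure:

```python
# Illustrative sketch of changing an object's motion state from the sensed
# tilt; TILT_THRESHOLD_DEG and VELOCITY_GAIN are assumed tuning constants.
from dataclasses import dataclass

TILT_THRESHOLD_DEG = 5.0  # below this the object remains static
VELOCITY_GAIN = 0.02      # velocity added per degree of tilt per update

@dataclass
class Ball:
    vx: float = 0.0  # horizontal velocity in the game image

def update_motion(ball: Ball, tilt_deg: float) -> None:
    """Switch the ball from static to moving and scale its velocity with tilt."""
    if abs(tilt_deg) >= TILT_THRESHOLD_DEG:
        ball.vx += VELOCITY_GAIN * tilt_deg  # roll toward the lower side
```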
6. Augmented Reality Function
Optionally, the camera module 222 may be configured to, under the control of the processing module 214 (e.g., the central processor), take an image of a front object when the handheld terminal is facing the object. The communication module 224 may be configured to, under the control of the processing module 214 (e.g., the central processor), transmit data which is related to the image of the object taken by the camera module 222 to a server having an augmented reality function, and to receive related information of the object from the server. The processing module 214 (e.g., the central processor) may be configured to cause the related information to be displayed on a corresponding transparent display unit.
As an example, the user may turn on the augmented reality mode by a physical switch installed on the handheld terminal, or may turn on the augmented reality mode through a setup menu of the operating system of the handheld terminal. The processing module 214 (e.g., the central processor) may determine whether the augmented reality mode is currently turned on by detecting an augmented reality flag stored in a memory of the handheld terminal. In the augmented reality mode, the processing module 214 (e.g., the central processor) may control the camera module 222 to track a target object by using a view finder of the camera module 222, and to take an image of the object. The image taken may be stored, for example, in a memory of the handheld terminal.
Then, the processing module 214 (e.g., the central processor) may control the communication module 224 to transmit data related to the image of the object to a server having an augmented reality function, and to receive related information of the object from the server. The data related to the image of the object may be, for example, the raw data of the image, or may be data compressed or preprocessed by the processing module 214 (e.g., the central processor). For example, the data may be obtained by performing a feature extraction algorithm. The server having an augmented reality function may be implemented by using any existing augmented reality technologies. For example, the server may be a database server on the cloud side, which matches the received image-related data of the object against the data in its database to obtain the related information of the object.
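By way of non-limiting illustration, the client-side exchange with such a server may be sketched as follows, assuming an HTTP endpoint that accepts an image upload and returns JSON; the URL, field names, and response format are hypothetical, since the disclosure does not specify a protocol:

```python
# Illustrative client-side sketch of the augmented reality exchange; the
# endpoint, upload field and JSON response shape are assumed for the sketch.
import requests

def query_ar_server(image_bytes: bytes, server_url: str) -> dict:
    """Send image-related data of the object and return its related
    information for display on the corresponding transparent display unit."""
    response = requests.post(
        server_url,
        files={"image": ("frame.jpg", image_bytes, "image/jpeg")},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"name": ..., "description": ...}
```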
Then, the processing module 214 (e.g., the central processor) may cause the related information to be displayed on a corresponding transparent display unit. In this way, the transparent portion of the handheld terminal can be directly used as a view finder to observe the front object, and the related information of the object can be directly superimposed on the corresponding transparent display unit, such that the handheld terminal has a novel augmented reality function.
In summary, the handheld terminal according to Embodiment 1 of the present disclosure has a novel double-side display function, a novel double-side manipulation function, a novel rotation function and a novel augmented reality function due to its novel construction, thereby greatly improving the usability of the handheld terminal.
Embodiment 2
1. Structure of Transparent Portion
The structure of the transparent portion of the handheld terminal according to Embodiment 2 of the present disclosure is substantially the same as that of Embodiment 1, except that a gesture detection unit is provided on only one of the front side and the back side.
2. System Composition of Handheld Terminal
Accordingly, the system composition of the handheld terminal according to Embodiment 2 of the present disclosure is substantially the same as that of the handheld terminal of Embodiment 1, except that the handheld terminal of Embodiment 2 is provided with only one gesture detection drive module. Specifically, since Embodiment 2 is provided with a gesture detection unit on only one of the front side and the back side, the drive module includes only the gesture detection drive module corresponding to that gesture detection unit.
3. Double-Side Display Function
Accordingly, the double-side display function of the handheld terminal according to Embodiment 2 of the present disclosure is substantially the same as that of the handheld terminal of Embodiment 1, except for the following difference in the double-side asynchronous display mode: a user located on the side provided with the gesture detection unit may manipulate the handheld terminal with fingers such that different users located on the two sides see different images, but two different users located respectively on the front side and the back side cannot simultaneously touch a touch panel on their respective sides to manipulate different apps installed on the handheld terminal.
4. Manipulation Function
Since the handheld terminal according to Embodiment 2 of the present disclosure is provided with a gesture detection unit only on one side, it has only the one-side manipulation function and does not have the double-side manipulation function of the handheld terminal of Embodiment 1.
5. Rotation Function and Augmented Reality Function
The handheld terminal of Embodiment 2 of the present disclosure may have the same rotation function and augmented reality function as those of the handheld terminal of Embodiment 1.
Embodiment 3
1. Structure of Transparent Portion
The structure of the transparent portion of the handheld terminal according to Embodiment 3 of the present disclosure is substantially the same as that of Embodiment 1, except that no gesture detection unit is provided.
2. System Composition of Handheld Terminal
Accordingly, the system composition of the handheld terminal according to Embodiment 3 of the present disclosure is substantially the same as that of the handheld terminal of Embodiment 1, except that the handheld terminal of Embodiment 3 does not have a gesture detection drive module.
3. Double-Side Display Function
Accordingly, the double-side display function of the handheld terminal according to Embodiment 3 of the present disclosure is substantially the same as that of the handheld terminal of Embodiment 1, except that in the double-side asynchronous display mode of the handheld terminal of Embodiment 3, two different users located respectively on the front side and the back side of the handheld terminal cannot simultaneously touch a touch panel on their respective sides to manipulate different apps installed on the handheld terminal, because no gesture detection unit is provided. Optionally, a user located on one side may manipulate the handheld terminal by using, for example, a physical keyboard or button, so as to enable different users located on the two sides to see different images.
4. Manipulation Function
Since the handheld terminal according to Embodiment 3 of the present disclosure does not have a gesture detection unit, it does not have the double-side manipulation function of the handheld terminal of Embodiment 1.
5. Rotation Function and Augmented Reality Function
The handheld terminal of Embodiment 3 of the present disclosure may have the same rotation function and augmented reality function as the handheld terminal of Embodiment 1.
Embodiment 4
1. Structure of Transparent Portion
The structure of the transparent portion of the handheld terminal according to Embodiment 4 of the present disclosure is substantially the same as that of Embodiment 1, except that a transparent display unit is provided on only one of the front side and the back side, while first and second gesture detection units are provided on the front side and the back side respectively.
2. System Composition of Handheld Terminal
Accordingly, the system composition of the handheld terminal according to Embodiment 4 of the present disclosure is substantially the same as that of the handheld terminal of Embodiment 1, except that the handheld terminal of Embodiment 4 is provided with only one display drive module. Specifically, since Embodiment 4 is provided with a transparent display unit on only one of the front side and the back side, the drive module includes only the display drive module corresponding to that transparent display unit.
3. Display Function
Accordingly, since the handheld terminal according to Embodiment 4 of the present disclosure is provided with only one transparent display unit, it does not have the double-side display function of Embodiment 1, and has only the one-side display function.
4. Double-Side Manipulation Function, Rotation Function and Augmented Reality Function
The handheld terminal of Embodiment 4 of the present disclosure may have the same double-side manipulation function, rotation function and augmented reality function as those of the handheld terminal of Embodiment 1.
The present disclosure has been described above by means of four embodiments.
It should be noted that the embodiments described above are merely exemplary embodiments of the present disclosure, but are not used to limit the protection scope of the present disclosure. The protection scope of the present disclosure should be defined by the appended claims.
This application is a National Stage Entry of PCT/CN2016/088812 filed Jul. 6, 2016, which claims the benefit and priority of Chinese Patent Application No. 201510680612.4, filed on Oct. 19, 2015, the disclosures of which are incorporated herein in their entirety as part of the present application.