This application relates to detection of screen orientation, and, in particular, detection of screen orientation for a mobile device by using one or more proximity sensors.
Mobile devices often include a display screen (i.e., an electronic display) to facilitate user interaction with the device. It is generally desirable that the content or information (e.g., text or graphics) displayed on the screen be oriented properly for the user. For example, if the screen is displaying text, the text should be oriented such that the user can easily read it from his/her vantage point. That is, the text should be oriented such that the top of the display screen corresponds to the top of the displayed text from that user's perspective. However, because the device—and therefore the screen—can be rotated and oriented in various ways, information is not always properly oriented for viewing by the user. Thus, it is desirable for the orientation of the displayed content to be able to change and adapt as the user moves the device.
Conventional mobile devices use an accelerometer such as a gravity sensor to control the orientation of the displayed content. That is, the accelerometer detects changes in the orientation of the mobile device, and the information displayed is rotated in response to the detected changes. For example, if the device is rotated clockwise by the user, the accelerometer detects the rotation and the information displayed is rotated counterclockwise to maintain the same orientation relative to the user. Data from the accelerometer thereby controls the display orientation for the device.
Accelerometers can, however, result in display rotation or orientation that is not proper for viewing by the user or that is unintended by the user.
Described below are techniques and tools for detecting screen orientation by using one or more proximity sensors that address some of the shortcomings of conventional devices. For example, using one or more proximity sensors to detect screen orientation can reduce unintended display rotation. One advantage is that the manner in which a user holds a device can be used to determine a display mode of the device.
In one embodiment, a mobile device comprises one or more proximity sensors configured to detect whether an object is proximate to the sensor. In some examples, the one or more proximity sensors are located on a side surface of the device, while in other examples the sensors are located near to a front surface of the device. The mobile device also comprises a display screen on its front surface, and the display mode of the screen is changed based on whether or not one or more of the proximity sensors detects an object. That is, the one or more proximity sensors are located on the device so as to detect screen orientation. For example, the proximity sensors can be located to detect common ways in which a user could hold or position the device, and the device can be configured to change the display mode such that the display is oriented as intended by the user.
This summary is provided to introduce a selection of concepts in a simplified form that is further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
The foregoing and additional features and advantages will become more apparent from the following detailed description, which proceeds with reference to the accompanying figures.
The illustrated mobile device 100 can include a controller or processor 110 (e.g., signal processor, microprocessor, ASIC, or other control and processing logic circuitry) for performing such tasks as signal coding, data processing, input/output processing, power control, and/or other functions. An operating system 112 can control the allocation and usage of the components 102 and support for one or more application programs 114. The application programs can include common mobile computing applications (e.g., email applications, calendars, contact managers, web browsers, messaging applications), or any other computing application.
The illustrated mobile device 100 can include memory 120. Memory 120 can include non-removable memory 122 and/or removable memory 124. The non-removable memory 122 can include RAM, ROM, flash memory, a hard disk, or other well-known memory storage technologies. The removable memory 124 can include flash memory or a Subscriber Identity Module (SIM) card, which is well known in GSM communication systems, or other well-known memory storage technologies, such as “smart cards.” The memory 120 can be used for storing data and/or code for running the operating system 112 and the applications 114. Example data can include web pages, text, images, sound files, video data, or other data sets to be sent to and/or received from one or more network servers or other devices via one or more wired or wireless networks. The memory 120 can be used to store a subscriber identifier, such as an International Mobile Subscriber Identity (IMSI), and an equipment identifier, such as an International Mobile Equipment Identifier (IMEI). Such identifiers can be transmitted to a network server to identify users and equipment.
The mobile device 100 can support one or more input devices 130, such as a touchscreen 132 (e.g., capable of capturing finger tap inputs, finger gesture inputs, or keystroke inputs for a virtual keyboard or keypad), microphone 134, camera 136, physical keyboard 138, and/or trackball 140, and one or more output devices 150, such as a speaker 152 and a display screen (i.e., electronic display) 154. Other possible output devices (not shown) can include piezoelectric or other haptic output devices. Some devices can serve more than one input/output function. For example, touchscreen 132 and screen 154 can be combined in a single input/output device.
A wireless modem 160 can be coupled to one or more antennas (not shown) and can support two-way communications between the processor 110 and external devices, as is well understood in the art. The modem 160 is shown generically and can include a cellular modem for communicating at long range with the mobile communication network 104, a Bluetooth-compatible modem 164, or a Wi-Fi-compatible modem 162 for communicating at short range with an external Bluetooth-equipped device or a local wireless data network or router. The wireless modem 160 is typically configured for communication with one or more cellular networks, such as a GSM network for data and voice communications within a single cellular network, between cellular networks, or between the mobile device and a public switched telephone network (PSTN).
The mobile device 100 supports at least one proximity sensor 188 for detecting the orientation of the display screen 154 using tools and techniques described herein. For example, the proximity sensor 188 can be configured to provide the operating system 112 with input regarding whether an object is proximate to the proximity sensor 188. In response, the operating system 112 can change the display mode of the screen 154. The mobile device 100 can include one or more proximity sensors in addition to proximity sensor 188 for use with other functions of the mobile device besides detecting screen orientation. The mobile device can support an optional accelerometer 186, such as a gravity sensor. The mobile device can be configured to detect orientation of the display screen 154 using the proximity sensor 188 in addition to or instead of the accelerometer 186. For example, the operating system 112 can change the display mode of the screen 154 based on information received from both the proximity sensor 188 and accelerometer 186.
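Purely as an illustrative sketch of the decision just described — not a disclosed implementation — the operating system's choice of display mode from proximity-sensor and accelerometer input might be modeled as follows; the function name, edge names, and mode names are hypothetical:

```python
def choose_display_mode(proximity_edge, accel_mode, current_mode):
    """Pick a display mode from proximity and accelerometer input.

    proximity_edge: edge reported as being held ('top', 'bottom',
        'left', 'right'), or None if no object is detected.
    accel_mode: display mode suggested by the accelerometer, or None.
    current_mode: the mode currently shown.
    """
    # Hypothetical mapping from the edge a user is holding to the
    # display mode whose "top" faces away from that edge.
    edge_to_mode = {
        'bottom': 'portrait',
        'top': 'portrait-flipped',
        'left': 'landscape',
        'right': 'landscape-flipped',
    }
    if proximity_edge in edge_to_mode:
        # Proximity input takes precedence over the accelerometer.
        return edge_to_mode[proximity_edge]
    if accel_mode is not None:
        return accel_mode
    return current_mode
```

The precedence rule shown (proximity over accelerometer) is one of the combinations described above; the reverse default is equally possible.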
The mobile device can further include at least one input/output port 180, a power supply 182, a satellite navigation system receiver 184, such as a Global Positioning System (GPS) receiver, and/or a physical connector 190, which can be a USB port, IEEE 1394 (FireWire) port, and/or RS-232 port. The illustrated components 102 are not all required or all-inclusive, as the components shown can be deleted and other components can be added.
The mobile device 100 can be part of an implementation environment in which various types of services (e.g., computing services) are provided by a computing “cloud” (see, for example,
As described herein, proximity sensors can be used to detect screen orientation of a mobile device. Such proximity sensors can be any proximity sensor known in the art. In general, a proximity sensor is capable of sensing the proximity of an object to the sensor. That is, such sensors can detect the presence of an object without physical contact. However, as used herein, proximity sensors can either alternatively or additionally sense the presence of an object based on physical contact. Example proximity sensors can be inductive, capacitive, optical, acoustic, photoelectric, or magnetic proximity sensors, or proximity sensors can use capacitive, resistive, or other touchscreen technologies. In one implementation, proximity sensors can be infrared (IR) sensors that emit IR light and detect reflected IR light in order to sense proximity of an object.
Proximity sensors can be capable of sensing various objects. For example, the object sensed can be a person or a part of a person (e.g., a hand) or the object can be non-human (e.g., a table or other object). The sensed object may or may not be in physical contact with the proximity sensor or with a surface of the mobile device associated with the sensor. Typically, the sensed object is within 2 inches of the mobile device surface or the proximity sensor. However, proximity sensors can be configured to detect objects within only 0.5 inches, within 1 inch, or within 1.5 inches.
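The detection ranges described above amount to a simple threshold test; a minimal sketch, with a hypothetical function name and the 2-inch default taken from the description:

```python
def object_detected(distance_inches, threshold_inches=2.0):
    """Report proximity when the measured distance is within the
    configured threshold (e.g., 0.5, 1, 1.5, or 2 inches)."""
    return distance_inches <= threshold_inches
```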
Proximity sensors described herein can be used in addition to or instead of an accelerometer to detect screen orientation and control display mode. If proximity sensors are used in addition to an accelerometer, either can be set as a default. For example, proximity sensors can override accelerometer signals in all or in certain circumstances, such as when in conflict. However, mobile devices can be configured to use only proximity sensors to detect screen orientation and to control display mode.
Referring to
The operating system 320 processes the input 310 and can determine the orientation of a display screen, or electronic display, associated with the system 300. Based on the received indication 310, the operating system 320 issues a command 330 to change the display mode of the associated display screen. Exemplary display modes include a portrait mode and a landscape mode as described herein. However, a display mode can be any other orientation of the displayed content. For example, multiple display modes can be defined based on incremental rotations from a reference mode, such as four display modes defined at 0°, 90°, 180°, and 270° of rotation, or six display modes defined at 0°, 60°, 120°, 180°, 240°, and 300° of rotation.
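The incremental display modes described above can be sketched as snapping a detected orientation angle to the nearest defined mode. This is a hypothetical illustration only; the four- and six-mode spacings come from the description:

```python
def snap_to_display_mode(angle_deg, num_modes=4):
    """Snap an orientation angle to the nearest of num_modes display
    modes spaced evenly around 360 degrees (e.g., 4 modes at 0, 90,
    180, 270 degrees, or 6 modes at 60-degree increments)."""
    step = 360 / num_modes
    return round((angle_deg % 360) / step) % num_modes * step
```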
In some embodiments described herein, the indication 310 can be from a first proximity sensor(s), and the operating system 320 can also receive an indication that an object is not proximate to a second proximity sensor(s). That is, lack of detection of an object by the second proximity sensor(s) (e.g., an object is not being detected proximate to that sensor) can also be considered input. For example, if a proximity sensor is not providing an indication that an object is proximate to the proximity sensor, this is an indication that an object is not proximate to that proximity sensor. In these embodiments, the command 330 to change the display mode of the display screen is based on both the indication from the first proximity sensor(s) and the indication from the second proximity sensor(s).
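The two-sensor logic above — a command issued only when one sensor reports an object and the other does not — can be sketched as follows; the mode names returned are placeholders for illustration:

```python
def command_from_sensor_pair(first_proximate, second_proximate):
    """Issue a display-mode command based on both indications: an
    object at the first sensor but not the second (or vice versa)."""
    if first_proximate and not second_proximate:
        return 'landscape'
    if second_proximate and not first_proximate:
        return 'portrait'
    return None  # ambiguous input: no command issued
```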
In practice, the system 300 can be more complicated, with additional inputs, outputs and the like.
The proximity sensors 420, 422, 424, 426 are shown to have a particular size for purposes of illustration; however, the sensors can be smaller or larger. In addition, each of the sensors can be divided into multiple regions, and positioned differently along the perimeter 414. For example, each of the sensors 420, 422, 424, 426 is shown to have a length that is approximately one half the length of each of the edges 440, 442, 444, 446, respectively. However, the sensors can be longer, having lengths approximately equal to the edge length, three fourths of the edge length, or other fraction of the edge length. Further, the sensors can be shorter, having lengths less than one half the edge length, such as approximately one third, one quarter, one eighth, one sixteenth, or less of the edge length. For example, the sensors can be small approximately circular sensors each having a diameter of less than approximately one twentieth of the edge length. Likewise, the sensors 420, 422, 424, 426 are shown centered on the respective edges (i.e., the center of the sensor is at the approximate center of the respective edge). However, sensors can be positioned closer to the corners of the device 400. Further, two or more proximity sensors can be positioned along a single edge. For example, one or more of the sensors 420, 422, 424, 426 can be a series of small approximately circular sensors spaced from each other along the respective edge.
Further, the mobile device 400 can have fewer or more proximity sensors, or it can have any combination of the proximity sensors 420, 422, 424, 426. For example, the mobile device 400 can have a proximity sensor 428 on a back surface in addition to or instead of other proximity sensors. Further, the mobile device 400 can have only a pair of sensors, such as sensor 420 and sensor 426, sensor 422 and sensor 424, or any other combination.
Although the sensors 420, 422, 424, 426 are shown to be in contact with the perimeter 414, such contact is not required. In general, the sensors 420, 422, 424, 426 are part of the mobile device 400 and situated such that objects proximate to the perimeter can be sensed by the proximity sensors. For example, sensor 420 can be positioned so as to detect objects proximate to the edge 440, sensor 422 can be positioned so as to detect objects proximate to the edge 442, sensor 424 can be positioned so as to detect objects proximate to the edge 444, and sensor 426 can be positioned so as to detect objects proximate to the edge 446. In addition, two or more proximity sensors can be positioned along a single edge, so as to detect objects proximate to that edge. In some embodiments, one or more of the sensors 420, 422, 424, 426 can be configured to discern the portion of the sensor the object is proximate to, or whether multiple objects are proximate to the sensor. For example, the sensors could discern whether one finger (e.g., a thumb) or several fingers (e.g., the index, middle, and ring fingers) are in contact with the sensor.
The mobile device 400 can be rotated or oriented in various ways. Thus, the display screen 410 can also be rotated and oriented in various ways. As described herein, the proximity sensors 420, 422, 424, 426 can be used to detect the orientation of the screen 410, and the device 400 can change the display mode of the device correspondingly so that the content being displayed on the screen 410 can be properly viewed by a user. The proper display mode depends on which edge of the screen is determined to be the top of the screen 410 for purposes of viewing. For example, the content should be displayed in portrait mode if edge 460 (or 466) is determined to be the top of the content displayed on the screen 410. In general, portrait mode is the display mode where the top and bottom of the displayed content correspond with the shorter edges (e.g., either 460 or 466) of the display screen. Likewise, the content should be displayed in landscape mode if edge 462 (or 464) is determined to be the top. In general, landscape mode is the display mode where the top and bottom of the displayed content corresponds with the longer edges (e.g., either 462 or 464) of the display screen.
Although the mobile device 400 is shown to have longer edges 462 and 464 and shorter edges 460 and 466, the display screen can be square in shape (i.e., all edges are approximately the same length), in which case the landscape and portrait modes are indistinguishable. In this case, the mobile device 400 can be configured to switch between four display modes corresponding to 0°, 90°, 180°, and 270° of rotation of the displayed content (e.g., the 0° display mode can indicate that the edge 460 corresponds to the top of the displayed content, the 90° display mode can indicate that the edge 462 corresponds to the top of the displayed content, the 180° display mode can indicate that the edge 466 corresponds to the top of the displayed content, and the 270° display mode can indicate that the edge 464 corresponds to the top of the displayed content). The display mode of the device is based on which of the four edges is determined to be the top edge of the screen. That is, the orientation of the display should be such that the top of the display corresponds to the edge that proximity sensor(s) detect to be the top edge of the device or screen.
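The edge-to-rotation correspondence for a square screen described above can be sketched directly; the edge numbers follow the description, and the lookup itself is hypothetical:

```python
# Rotation display mode (in degrees) selected when a given screen
# edge is determined to be the top of the displayed content.
EDGE_TO_ROTATION = {460: 0, 462: 90, 466: 180, 464: 270}

def display_mode_for_top_edge(edge):
    """Return the rotation mode for the edge detected as the top."""
    return EDGE_TO_ROTATION[edge]
```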
Although mobile devices are shown in the figures and described in this application as having a particular shape, this is merely for purposes of illustration. A person of ordinary skill in the art would understand that tools and techniques described herein can be applied to devices of any shape. For example, proximity sensors can be positioned on any shape device (e.g., circular, or any other geometric or polygon shape) so as to detect objects near to one or more of the edges or sides of the device. Further, the display screen may or may not correspond to the shape of the device. For example, the device can be rectangular with a square display, or the device can be circular with a circular or rectangular display.
In some implementations of the device 500A, the activation of sensors 520 and 526 can trigger an entertainment mode. Specifically, a function of one or more buttons near to the sensors 520 and 526, such as buttons 530 and 532, can be suppressed, disabled or otherwise changed. For example, any combination of the following can be part of the entertainment mode: the ringer can be disabled, a radio can be turned off, incoming phone calls or text messages can be disallowed, the volume can be turned up or otherwise changed or locked, back and/or search buttons can be disabled or suppressed, and Bluetooth can be disabled for audio and hands-free calls. Further, buttons, such as buttons 530 and 532, can be completely disabled, or the buttons' functions can be suppressed, such as by making it more difficult to trigger the function associated with the buttons. For example, a user may have to press button 532 more than once, press it harder, or press and hold it, in order to trigger its function when the mobile device is in entertainment mode. The user may need to trigger a change in the display mode in order to re-activate the functions suppressed or disabled by the entertainment mode.
Entertainment mode can be useful when a user is playing a game, viewing photos, watching a movie, or engaging in any other activity on his/her mobile device where minimal interruption is desired. Often such activities, like playing a game or watching a movie, occur when the device is held such that sensors 520 and 526 are activated by an individual's hands. It can therefore be desirable to suppress or change functioning of buttons, such as buttons 532 and 530, so that the user does not accidentally press these buttons and interrupt the game or movie.
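One way the button suppression described above could work — this is a sketch under assumed behavior, with a hypothetical class name and a 1-second press-and-hold threshold chosen for illustration — is to require a longer press while entertainment mode is active:

```python
class Button:
    """Button whose function is suppressed in entertainment mode:
    a press must be held for HOLD_SECONDS before its action fires."""

    HOLD_SECONDS = 1.0  # hypothetical hold threshold

    def __init__(self, action):
        self.action = action
        self.entertainment_mode = False
        self._pressed_at = None

    def press(self, t):
        """Record the time the button went down."""
        self._pressed_at = t

    def release(self, t):
        """Fire the action unless suppressed; return True if fired."""
        if self._pressed_at is None:
            return False
        held = t - self._pressed_at
        self._pressed_at = None
        if self.entertainment_mode and held < self.HOLD_SECONDS:
            return False  # short press suppressed in entertainment mode
        self.action()
        return True
```

Outright disabling a button, as also described, would simply have `release` return False whenever entertainment mode is active.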
Referring to
In
In some implementations of the device 500C, the device can include proximity sensors in addition to the back sensor 528, such as sensors 520, 522, 524, 526, or combinations thereof. If one or more of these sensors is activated at the same time as the back sensor 528, the device 500C can be configured to ignore the sensor 528. For example, this implementation presumes that activation of a sensor on one or more of the edges 540, 542, 544, 546 is more strongly indicative of the orientation of the screen 510 than activation of the back sensor 528. In another implementation of the device 500C, when the back sensor 528 and one or more of the additional proximity sensors 520, 522, 524, 526 are activated at the same time, the device 500C can ignore all proximity sensors. That is, the device can be configured to utilize data from an accelerometer to determine screen orientation and to control the display mode instead of data from proximity sensors.
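The two conflict-resolution policies just described — edge sensors overriding the back sensor, or simultaneous activation deferring to the accelerometer — can be sketched as a single selection function; the source names are illustrative:

```python
def orientation_source(edge_sensor_active, back_sensor_active,
                       prefer_edges_over_back=True):
    """Select which input controls the display mode. With
    prefer_edges_over_back=True, edge sensors win when both edge and
    back sensors are active; otherwise simultaneous activation ignores
    all proximity sensors and defers to the accelerometer."""
    if edge_sensor_active and back_sensor_active:
        return 'edge_sensors' if prefer_edges_over_back else 'accelerometer'
    if edge_sensor_active:
        return 'edge_sensors'
    if back_sensor_active:
        return 'back_sensor'
    return 'accelerometer'
```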
In
In
The device 700 is rectangular in shape with the side surface 750 having four portions: two long sides and two short sides. (However, the device 700 can also be square in shape, in which case all four sides would have approximately the same length.) The sensor 724 is positioned on one of the long-side portions 744 and can be configured to detect the proximity of objects to the long-side portion 744. The sensor 726 is positioned on one of the short-side portions 746 and can be configured to detect the proximity of objects to the short-side portion 746. Although only two proximity sensors 724, 726 are shown, the device 700 can include additional proximity sensors. For example, the device 700 can include an additional proximity sensor on the short-side portion 740 and the long-side portion 742. Further, the mobile device 700 can have fewer proximity sensors, or it can have any combination of the illustrated proximity sensors. For example, the mobile device 700 can have a proximity sensor on its back surface in addition to or instead of other proximity sensors. Further, the mobile device 700 can have only a pair of sensors, such as a pair of short-side portion sensors, a pair of long-side portion sensors, or any other combination.
With reference to
Using proximity sensors in addition to or instead of an accelerometer to detect screen orientation and to control display mode can, in some implementations, have advantages. For example, because proximity sensors can be used to detect how a user is holding the device with a display screen (e.g., by detecting what portions of the device are being held/contacted by the user), proximity sensors can be more accurate at determining the display orientation intended by the user. Users frequently hold a mobile device in the same manner when a particular display mode is desired. By detecting the method of holding the mobile device and by changing the display mode accordingly, unintended changes in display orientation can be avoided. For example, any subsequent device rotation or changes in orientation detected by the accelerometer can be ignored while the device is being held in the same manner. By contrast, in a conventional device using only an accelerometer, the display mode would change based on the subsequent device rotation despite the user maintaining the same manner of holding the device. Such a result would likely be undesired by the user.
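The suppression behavior above — ignoring accelerometer-detected rotation while the grip is unchanged — can be sketched as follows; the grip names and the `grip_to_mode` mapping are hypothetical:

```python
def next_display_mode(grip, prev_grip, accel_mode, current_mode,
                      grip_to_mode):
    """While the detected grip is unchanged, ignore accelerometer-
    reported rotation; a new grip selects the mode it implies, and
    with no grip detected the accelerometer controls."""
    if grip is not None and grip == prev_grip:
        return current_mode  # same hold: suppress rotation changes
    if grip in grip_to_mode:
        return grip_to_mode[grip]
    return accel_mode
```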
In example environment 1000, various types of services (e.g., computing services) are provided by a cloud 1010. For example, the cloud 1010 can comprise a collection of computing devices, which may be located centrally or distributed, that provide cloud-based services to various types of users and devices connected via a network such as the Internet. The implementation environment 1000 can be used in different ways to accomplish computing tasks. For example, some tasks (e.g., processing user input and presenting a user interface) can be performed on local computing devices (e.g., connected devices 1030, 1040, 1050) while other tasks (e.g., storage of data to be used in subsequent processing) can be performed in the cloud 1010.
In example environment 1000, the cloud 1010 provides services for connected devices 1030, 1040, 1050 with a variety of screen capabilities. One or more of the connected devices 1030, 1040, 1050 can be configured as described herein to detect screen orientation by using proximity sensors and to control a display mode based on input from the proximity sensors. Connected device 1030 represents a device with a computer screen 1035 (e.g., a mid-size screen). For example, connected device 1030 could be a personal computer such as desktop computer, laptop, notebook, netbook, or the like. Connected device 1040 represents a device with a mobile device screen 1045 (e.g., a small size screen). For example, connected device 1040 could be a mobile phone, smart phone, personal digital assistant, tablet computer, or the like. Connected device 1050 represents a device with a large screen 1055. For example, connected device 1050 could be a television screen (e.g., a smart television) or another device connected to a television (e.g., a set-top box or gaming console) or the like.
One or more of the connected devices 1030, 1040, 1050 can include touchscreen capabilities. Touchscreens can accept input in different ways. For example, capacitive touchscreens detect touch input when an object (e.g., a fingertip or stylus) distorts or interrupts an electrical current running across the surface. As another example, touchscreens can use optical sensors to detect touch input when beams from the optical sensors are interrupted. Physical contact with the surface of the screen is not necessary for input to be detected by some touchscreens. Devices without screen capabilities also can be used in example environment 1000. For example, the cloud 1010 can provide services for one or more computers (e.g., server computers) without displays.
Services can be provided by the cloud 1010 through service providers 1020, or through other providers of online services (not depicted). For example, cloud services can be customized to the screen size, display capability, and/or touchscreen capability of a particular connected device (e.g., connected devices 1030, 1040, 1050).
In example environment 1000, the cloud 1010 provides the technologies and solutions described herein to the various connected devices 1030, 1040, 1050 using, at least in part, the service providers 1020. For example, the service providers 1020 can provide a centralized solution for various cloud-based services. Further, screen orientation and display mode information based on proximity sensors described herein can be transferred via the cloud 1010 as part of various services. The service providers 1020 can manage service subscriptions for users and/or devices (e.g., for the connected devices 1030, 1040, 1050 and/or their respective users).
Although the operations of some of the disclosed methods are described in a particular, sequential order for convenient presentation, it should be understood that this manner of description encompasses rearrangement, unless a particular ordering is required by specific language set forth below. For example, operations described sequentially may in some cases be rearranged or performed concurrently. Moreover, for the sake of simplicity, the attached figures may not show the various ways in which the disclosed methods can be used in conjunction with other methods.
Any of the disclosed methods can be implemented as computer-executable instructions stored on one or more computer-readable storage media (e.g., non-transitory computer-readable media, such as one or more optical media discs, volatile memory components (such as DRAM or SRAM), or nonvolatile memory components (such as hard drives)) and executed on a computer (e.g., any commercially available computer, including smart phones or other mobile devices that include computing hardware). Any of the computer-executable instructions for implementing the disclosed techniques as well as any data created and used during implementation of the disclosed embodiments can be stored on one or more computer-readable media (e.g., non-transitory computer-readable media). The computer-executable instructions can be part of, for example, a dedicated software application or a software application that is accessed or downloaded via a web browser or other software application (such as a remote computing application). Such software can be executed, for example, on a single local computer (e.g., any suitable commercially available computer) or in a network environment (e.g., via the Internet, a wide-area network, a local-area network, a client-server network (such as a cloud computing network), or other such network) using one or more network computers.
For clarity, only certain selected aspects of the software-based implementations are described. Other details that are well known in the art are omitted. For example, it should be understood that the disclosed technology is not limited to any specific computer language or program. For instance, the disclosed technology can be implemented by software written in C++, Java, Perl, JavaScript, Adobe Flash, or any other suitable programming language. Likewise, the disclosed technology is not limited to any particular computer or type of hardware. Certain details of suitable computers and hardware are well known and need not be set forth in detail in this disclosure.
Furthermore, any of the software-based embodiments (comprising, for example, computer-executable instructions for causing a computer to perform any of the disclosed methods) can be uploaded, downloaded, or remotely accessed through a suitable communication means. Such suitable communication means include, for example, the Internet, the World Wide Web, an intranet, software applications, cable (including fiber optic cable), magnetic communications, electromagnetic communications (including RF, microwave, and infrared communications), electronic communications, or other such communication means.
The disclosed methods, apparatus, and systems should not be construed as limiting in any way. Instead, the present disclosure is directed toward all novel and nonobvious features and aspects of the various disclosed embodiments, alone and in various combinations and subcombinations with one another. The disclosed methods, apparatus, and systems are not limited to any specific aspect or feature or combination thereof, nor do the disclosed embodiments require that any one or more specific advantages be present or problems be solved.