Some displays enable a screen to be adjusted between, for example, a flat panel state and a curved or bent state. A user may wish to change the curvature of the display screen based on, for example, a type of content being presented on the screen.
In general, the same reference numbers will be used throughout the drawing(s) and accompanying written description to refer to the same or like parts. The figures are not necessarily to scale.
Some displays enable a screen to be adjusted between, for example, a flat panel state and a curved, bent, or flexed state. A user's viewing experience with respect to content presented on the display screen can change based on whether the screen is flat or curved and, when curved, the degree of curvature of the screen. A user may wish to change the degree of curvature of the display screen based on, for example, a type of content being presented on the screen and/or a distance of the user from the screen. For instance, a user may prefer the screen to have a high degree of curvature when the user is sitting close to the screen and substantially aligned with the center of the display screen while playing a video game. In particular, a curved screen can reduce distortion of content at the edges of the display screen as compared to a flat screen, thereby extending the user's peripheral vision, creating a larger perceived field of view for the user, and providing a more immersive viewing experience for the user while playing the video game. However, in other instances, the user may prefer less curvature of the screen, such as when the user is typing in a word processing document. Further, in other instances, the flat panel state of the display screen can provide for improved visibility (e.g., less distortion) over the curved state, such as when the user is watching a movie and sitting farther from the screen, when multiple people are viewing content on the screen at the same time and not everyone is aligned with a center of the screen, etc. Further, ambient lighting conditions can affect visibility when the display screen is curved, as curved screens may be more susceptible to reflections from light source(s) in the environment.
Some known flexible displays can be manually adjusted by the user to change the degree of curvature of the display screen. For instance, the user can adjust (e.g., bend or flatten) the screen by applying force via handles coupled to a housing of the display. Some known flexible displays provide for automated adjustment of the degree of curvature of the screen via, for instance, a remote control that controls an actuator (e.g., a motor) in response to a user input. However, adjusting the screen curvature manually or via a remote control input each time the type of content being presented changes, the ambient lighting changes, and/or, more generally, the user's viewing preference changes can be cumbersome for the user and/or place undue wear on the display.
Disclosed herein are example systems and methods that automatically and adaptively adjust curvature (e.g., a radius of an arc formed by the curved screen) of a display screen based on factors such as user position relative to the screen, content presented on the screen, ambient lighting, user preferences, etc. Some examples disclosed herein analyze outputs of sensor(s) (e.g., camera(s), microphone(s), light sensor(s), etc.) associated with the display and/or a compute device in communication with the display to determine a usage context for the display screen. The usage context can be indicative of, for example, a number of users viewing the screen, a distance of the user(s) from the screen, ambient lighting conditions, etc. Some examples disclosed herein analyze data indicative of, for example, application usage by the compute device, a type of content being presented, inputs received via peripheral input devices, etc. to determine a content context associated with the display screen. Examples disclosed herein determine a screen curvature state for the display screen, namely, whether the display screen should be in a flat state or a curved state to facilitate the viewing experience by the user(s) based on the parameter(s) indicative of the usage context and/or the content context. Examples disclosed herein can determine that the display screen can be placed in a flat state to reduce distortion of content when the user is watching a movie and is farther from the screen. Examples disclosed herein can determine that the display screen should move from the flat state to a curved state to provide an increased perceived field of view with respect to the user's peripheral vision and, thus, a more immersive experience when the user is playing a video game and sitting closer to the display screen. In examples in which the screen is to be placed in the curved state, examples disclosed herein determine a curvature value for the screen, where curvature is defined as a radius of an arc formed by the curved screen and, thus, is referred to herein as a curvature radius (e.g., 1800R, 2300R, 4000R, where “R” refers to radius measured in millimeters). In some examples, an orientation of the screen can be adjusted (e.g., the screen can be rotated about a vertical axis and/or a horizontal axis to adjust screen angle, and/or in some examples, the screen can be turned between a horizontal orientation and a vertical orientation). Examples disclosed herein can additionally or alternatively determine an orientation of the screen based on the data associated with the usage context and/or the content context.
Examples disclosed herein output instructions to cause the display to adjust (e.g., via one or more actuators of the display) the screen curvature state (e.g., a flat state or a curved state, and a curvature radius when in the curved state) and/or the screen orientation based on analysis of the usage and/or content parameters. Examples disclosed herein monitor the usage and/or content parameter(s) over time to detect changes in user screen interactions, content presentation, ambient lighting conditions, etc. Based on the changes, examples disclosed herein determine whether adjustment(s) to the screen curvature and/or screen orientation are warranted and cause the display to implement any such adjustments. Thus, examples disclosed herein dynamically respond to changes associated with display screen use and/or operation by determining the effects of such changes on user viewing experience and automatically adjusting the curvature and/or orientation of the display screen.
The example compute device 102 includes display control circuitry 106 to cause digital content (e.g., graphical user interface(s), videos, electronic games) to be presented via a display screen 108 of the display 104. The display 104 includes panel electronics 110, or a circuit board including hardware such as a timing controller to provide for output of graphics via the display screen 108. In some examples, the display screen 108 is a touch screen and a user provides inputs to the compute device 102 via the display screen 108 using his or her fingers.
In some examples, one or more peripheral input devices 112 are communicatively coupled to the compute device 102 and a user provides inputs to the compute device 102 via the peripheral input device(s) 112. The peripheral input device(s) 112 can include a keyboard, a mouse, a trackpad, etc. In some examples, the peripheral input device(s) 112 include microphone(s) 113, etc. In some examples, the peripheral input device(s) 112 include image sensor(s) (e.g., camera(s)) 114. In some examples, the compute device 102 and/or the display 104 includes the microphone(s) 113 and/or the image sensor(s) 114, as illustrated in
The compute device 102 includes processor circuitry 111. The processor circuitry 111 executes machine-readable instructions (e.g., software) including, for example, an operating system 115 and/or other user application(s) 116 installed on the compute device 102, to interpret and output response(s) based on the user input event(s) (e.g., mouse input(s), keyboard input(s), touch event(s), etc.) provided by a user. The operating system 115 and the user application(s) 116 are stored in one or more storage devices 118. The compute device 102 of
The example compute device 102 of
In some examples, the image sensor(s) 114 of the compute device 102 and/or the display 104 serve as the user presence detection sensor(s) 124. For example, the image sensor(s) 114 can include a front-facing camera having a field of view facing a user who is interacting with the display screen 108. In some examples, the user presence detection sensor(s) 124 are part of the display 104 (e.g., when the display is separate from a housing of the compute device 102).
In some examples, the compute device 102 and/or the display 104 includes ambient light sensor(s) 126. In some examples, image data generated by the image sensor(s) 114 can be analyzed by, for example, the processor circuitry 111 of the compute device 102 to determine lighting conditions in the environment in which the display 104 is located. The compute device 102 and/or the display 104 can include other types of sensors, such as inertial measurement unit (IMU) sensors 142.
In the example of
The example display 104 of
In the example of
The flexible screen control circuitry 144 executes screen curvature logic (e.g., one or more rule(s), algorithm(s), neural network model(s), etc.) based on the usage parameter(s) and/or the content parameter(s) to determine (a) whether the display screen 108 should be in a flat state or a curved state and (b) if the curved state, a curvature radius (e.g., 1800R, 3000R, 4000R) for the display screen 108. For example, the flexible screen control circuitry 144 can determine that the screen 108 should move from the flat state to a first curved state corresponding to a first curvature radius when the flexible screen control circuitry 144 detects one user who is within a threshold distance of the display screen 108. In some examples, the threshold distance corresponds to a minimum degree of curvature of the display screen 108 (e.g., in examples where the minimum degree of curvature of the display screen 108 corresponds to a curvature radius of 4000R, the threshold distance can correspond to 4000 mm).
In some examples, the flexible screen control circuitry 144 determines that the display screen 108 should move from the flat state or the first curved state to a second curved state having a second curvature radius that is smaller than the first curvature radius (i.e., such that the display screen 108 is more curved in the second curved state than the first curved state). For instance, when application usage data indicates that the user has switched from using a word processing document to playing a video game, the flexible screen control circuitry 144 can determine that the user's viewing experience may benefit from or be enhanced by increased screen curvature (e.g., to reduce distortion of content at the edges of the display screen 108, and increase a perceived field of view of the display screen 108 by the user by extending content captured within the user's peripheral vision to provide a more immersive viewing experience while playing the video game).
The flexible screen control circuitry 144 can determine that the display screen 108 should be in the flat state (i.e., unbent, not curved) when the user is at a distance that exceeds the threshold distance from the display screen 108. In such examples, the user will be able to view the content on the display screen 108 with less distortion when the display screen is in the flat state than if the display screen 108 were curved due to the distance of the user from the display screen 108 (e.g., because the curved portions may obstruct or distort the user's view when the user is outside the threshold distance from the display screen 108). In some examples, the flexible screen control circuitry 144 can determine that the screen 108 should be in the flat state whenever two or more users are detected, regardless of a distance of the users from the display screen 108, so that none of the users experiences distortion of the content on the display screen 108.
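For illustration only, the following is a minimal Python sketch of the flat-versus-curved logic described above. The function name, the 4000 mm threshold, and the 1800R-4000R range are hypothetical assumptions rather than values required by this description:

FLAT = None  # the flat state: no curvature radius applies

def select_curvature_state(num_users, distances_mm, threshold_mm=4000.0):
    """Return a curvature radius in millimeters, or FLAT for the flat state."""
    if num_users == 0:
        return FLAT                 # no viewer detected; leave the screen flat
    if num_users >= 2:
        return FLAT                 # multiple viewers: the flat state reduces distortion
    distance = distances_mm[0]
    if distance > threshold_mm:
        return FLAT                 # a single viewer beyond the threshold distance
    # A single viewer within the threshold: curve the screen, with a closer viewer
    # producing a smaller radius (more curvature), clamped to an assumed 1800R-4000R range.
    return max(1800.0, min(4000.0, distance))

print(select_curvature_state(1, [2300.0]))          # 2300.0 (curved, approximately 2300R)
print(select_curvature_state(2, [1500.0, 3000.0]))  # None (flat state)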
The example flexible screen control circuitry 144 of
In some examples, the flexible screen control circuitry 144 determines an orientation of the display screen 108 based on the usage and/or content parameter(s). For example, based on the screen curvature logic, the flexible screen control circuitry 144 can cause the display screen 108 to rotate about a vertical axis and/or a horizontal axis to adjust an angle of the display screen 108 to, for example, reduce glare, align a center of the display screen with the position of the user, etc. As another example, based on the screen curvature logic, the flexible screen control circuitry 144 can determine that the display screen 108 should be in a vertical orientation when the content parameter(s) (e.g., operating system and/or application event data) indicate that a spreadsheet is being presented on the display screen 108. The flexible screen control circuitry 144 can determine that the display screen 108 should be in a horizontal orientation when the content parameter(s) indicate that a video is being presented on the display screen 108. The flexible screen control circuitry 144 outputs instructions to the pivot actuator control circuitry 140 of the display 104 to cause the pivot actuator(s) 138 to rotate the display screen about the screen orientation pivot 136 to the orientation determined by the flexible screen control circuitry 144. The flexible screen control circuitry 144 can adjust the screen orientation when the display screen 108 is in the flat state or the curved state.
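A comparable sketch of the orientation logic is shown below; the content categories, the tilt value, and the returned command format are illustrative assumptions only:

def select_orientation(content_type, glare_detected):
    """Return an assumed orientation command for the pivot actuator(s)."""
    command = {"layout": "horizontal", "tilt_deg": 0.0}
    if content_type == "spreadsheet":
        command["layout"] = "vertical"    # tall, scrolling content suits a portrait layout
    elif content_type in ("video", "movie"):
        command["layout"] = "horizontal"  # wide content suits a landscape layout
    if glare_detected:
        command["tilt_deg"] = 5.0         # small assumed tilt to steer reflections away
    return command

print(select_orientation("spreadsheet", glare_detected=True))
# {'layout': 'vertical', 'tilt_deg': 5.0}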
The flexible screen control circuitry 144 monitors for changes with respect to the usage parameter(s) (e.g., user presence, user distance, ambient lighting) and/or content parameter(s) (e.g., application usage, user interaction(s)) over time to determine if changes to the screen curvature and/or orientation are warranted. If the flexible screen control circuitry 144 determines that such change(s) are warranted, the flexible screen control circuitry 144 outputs instructions to the arm actuator control circuitry 132 and/or the pivot actuator control circuitry 140 to cause corresponding adjustment(s) to the display screen 108.
Although in the example of
The example flexible screen control circuitry 144 of
The example user detection circuitry 200 analyzes outputs of the user presence detection sensor(s) 124 to determine whether one or more users are interacting with the compute device 102 and, thus, the display 104. In some examples, the user detection circuitry 200 analyzes image data generated by the image sensor(s) 114 of the compute device 102 and/or the display 104 to detect the presence and number of user(s) proximate to the display 104 based on, for example, facial recognition analysis. In some examples, the user detection circuitry 200 detects user presence based on changes in signals output by, for example, time-of-flight sensor(s). In some examples, the user detection circuitry 200 detects the presence of user(s) based on outputs of the microphone(s) 113 and audio (e.g., voice recognition) analysis.
In examples in which the user detection circuitry 200 detects the presence of one or more users relative to the compute device 102 and/or the display 104, the user detection circuitry 200 generates user positioning information including distance(s) of the user(s) from the display screen 108 and user position(s) relative to the display screen 108 (e.g., centered, more proximate to the right or left sides of the display screen, etc.). For example, the user detection circuitry 200 can use computer vision analysis to determine (e.g., estimate, predict) user distance from and/or user position relative to the display screen 108 based on the image data generated by the image sensor(s) 114. In some examples, the user detection circuitry 200 can estimate user distance from the display screen based on distance ranges associated with, for instance, time-of-flight sensors. The user detection circuitry 200 can compare the (e.g., predicted, estimated) distance(s) of the user(s) from the display screen 108 relative to a screen distance threshold value 211. The screen distance threshold value(s) 211 can be defined by user inputs and stored in a database 214. In some examples, the flexible screen control circuitry 144 includes the database 214. In some examples, the database 214 is located external to the flexible screen control circuitry 144 in a location accessible to the flexible screen control circuitry 144 as shown in
In some examples, the screen distance threshold value(s) 211 correspond to a minimum degree of curvature of the display screen 108 (e.g., the screen distance threshold value 211 can be 4000 mm when the minimum curvature of the display screen 108 is a curvature radius of 4000R). In some examples, the screen distance threshold value(s) 211 are defined based on other parameters, such as a largest distance of a user from the display screen 108 that does not affect or substantially affect a user's perceived field of view of the display screen 108 and/or result in perceived content distortion as determined by, for example, user testing. As disclosed herein, in some examples, the screen adjustment determination circuitry 210 determines to place the display screen 108 in the flat state or the curved state based on whether the distance of the user from the display screen 108 is within (i.e., satisfies) the screen distance threshold 211 or exceeds the screen distance threshold 211.
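One way (among many) to obtain the positioning information described above is a pinhole-camera estimate based on the width of a detected face. The sketch below assumes OpenCV's bundled Haar cascade face detector, an approximate focal length, and an average face width; in practice, these values would be calibrated per camera:

import cv2

FOCAL_LENGTH_PX = 900.0     # assumed focal length of the front-facing camera, in pixels
AVG_FACE_WIDTH_MM = 150.0   # assumed average face width

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def estimate_users(frame_bgr):
    """Return a list of (distance_mm, horizontal_offset) tuples, one per detected face."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    frame_center_x = frame_bgr.shape[1] / 2.0
    users = []
    for (x, y, w, h) in faces:
        distance_mm = FOCAL_LENGTH_PX * AVG_FACE_WIDTH_MM / w   # pinhole-camera model
        # Offset in [-1, 1]; 0 means the face is centered in the camera's field of view.
        offset = ((x + w / 2.0) - frame_center_x) / frame_center_x
        users.append((distance_mm, offset))
    return users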
In some examples, a user may create a user profile 212 that defines screen curvature state and/or screen orientation preferences for the user. The user profile 212 can include information such as an image of the user or a password that associates the user with the user profile 212. The user profile 212 can be stored in the database 214. The user identification circuitry 202 of the example flexible screen control circuitry 144 of
The screen reflection estimation circuitry 204 of the example flexible screen control circuitry 144 of
The example content type analysis circuitry 206 determines or recognizes a type of content being presented on the display screen 108 at a given time based on data and/or events associated with the operating system 115 and/or the application(s) 116, graphics data and/or events associated with the display control circuitry 106, etc. The content type analysis circuitry 206 can determine a type of application being executed by the compute device 102. For example, the content type analysis circuitry 206 identifies whether a video game application or a movie streaming service application is being executed by the compute device 102 based on application event data. The content type analysis circuitry 206 determines the type of content being presented based on the data indicative of the application type, application usage event(s), etc. In some examples, the content type analysis circuitry 206 can analyze screenshots or image data representing data presented on the display screen 108 at a given time and recognize the content type based on the image analysis.
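For illustration, a minimal sketch of one possible content-type mapping is shown below; the process names, the window-title heuristic, and the categories are hypothetical and are not tied to any particular operating system:

CONTENT_CATEGORIES = {
    "game_engine.exe": "game",
    "media_player.exe": "video",
    "word_processor.exe": "productivity",
    "spreadsheet.exe": "spreadsheet",
    "browser.exe": "web",
}

def classify_content(foreground_process, window_title=""):
    """Return a coarse content category for the application currently in focus."""
    category = CONTENT_CATEGORIES.get(foreground_process.lower(), "unknown")
    if category == "web" and "video" in window_title.lower():
        category = "video"    # e.g., a streaming site open in a browser tab
    return category

print(classify_content("Browser.exe", window_title="My Video - Streaming"))  # video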
The example user interaction analysis circuitry 208 determines an amount of user interaction with the compute device 102 based on inputs received via the peripheral input device(s) 112, event(s) associated with the operating system 115 and/or the application(s) 116, the IMU sensor(s) 142, etc. For example, the user interaction analysis circuitry 208 may determine that a user is regularly providing inputs to the compute device 102 via a keyboard or mouse, which can indicate that the user is typing in a word processing document or playing a video game and is likely positioned closer to the display screen 108. In some examples, the user interaction analysis circuitry 208 may determine that the user is only providing inputs at periodic intervals, which can indicate that the user is more likely to be watching a movie and is likely positioned farther from the display screen 108, etc. Thus, in some examples, the analysis by the user interaction analysis circuitry 208 can confirm, verify, or refine the analysis performed by the user detection circuitry 200 and/or the content type analysis circuitry 206.
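A short sketch of an interaction-level estimate is shown below; the 60-second window and the event-rate thresholds are assumptions chosen only to illustrate the idea:

def interaction_level(event_timestamps, window_s=60.0):
    """Classify recent peripheral-input activity as 'high', 'low', or 'idle'."""
    if not event_timestamps:
        return "idle"
    latest = max(event_timestamps)
    recent = [t for t in event_timestamps if latest - t <= window_s]
    rate_per_min = len(recent) * 60.0 / window_s
    if rate_per_min >= 30:    # assumed rate for sustained typing or game input
        return "high"
    if rate_per_min >= 1:     # occasional input, e.g., pausing or seeking a movie
        return "low"
    return "idle"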
The screen adjustment determination circuitry 210 of the example flexible screen control circuitry 144 of
In some examples, the screen adjustment determination circuitry 210 determines the screen curvature state (e.g., flat or curved, curvature radius when curved) and/or orientation based on (e.g., primarily based on) outputs of the user detection circuitry 200 with respect to whether user(s) are present relative to the display screen 108 and, if so, the distance and position of the user(s) relative to the display screen. Because causing the display screen 108 to move from the flat state to a curved state typically benefits (e.g., enhances) the user's viewing experience when the user is within a certain distance (e.g., the screen distance threshold value 211) of the display screen 108 and substantially aligned with a center of the display screen 108 (to reduce distortion effects), the screen adjustment determination circuitry 210 can use the estimated distance of the user(s) from the display screen and/or user position relative to a center of the display screen 108 as the primary factor(s) (and in some instances, the only factor(s)) for determining whether to cause the display screen 108 to bend and/or to adjust the screen orientation. Further, determining the screen curvature state and/or orientation primarily (or only) based on user presence and positioning information (e.g., distance, position relative to the display screen 108) can conserve processing resources. However, in some examples, the screen adjustment determination circuitry 210 uses the usage, content, and/or interaction analysis performed by the screen reflection estimation circuitry 204, the content type analysis circuitry 206, and the user interaction analysis circuitry 208 to refine the determination of the curvature radius and/or to increase the fidelity of the decision-making process performed by the screen adjustment determination circuitry 210 with respect to determining whether the curvature and/or orientation of the display screen 108 should be adjusted.
In some examples, the screen adjustment determination circuitry 210 applies one or more screen adjustment rule(s) 216 to determine the curvature state (e.g., flat state or curved state and, if the curved state, then a curvature radius R) and/or the screen orientation for different usage parameter(s) (e.g., user position, distance) and/or content parameter(s) (e.g., application type, user interaction). The screen adjustment rule(s) 216 can include predefined screen curvature radius values and/or screen orientation values (e.g., horizontal, vertical) for various usage context and/or content context scenarios, display screen sizes, supported maximum and minimum curvature radius values of the screen (e.g., 1500R-4000R), etc. For example, the screen adjustment determination circuitry 210 can select a rule 216 that applies (or most closely applies) based on a particular scenario defined by the number of user(s) present, the distance(s) of the user(s) relative to the display screen 108 and the screen distance threshold value 211, the position(s) of the user(s) relative to the display screen 108, the type of content presented on the display screen 108, etc. In examples in which the display screen 108 is to be placed in the curved state, the selected rule 216 can define the curvature radius R value and/or the screen orientation value (e.g., horizontal, vertical) for the combination of factor(s). The screen adjustment rule(s) 216 can be defined based on user input(s). After identifying a corresponding screen adjustment rule 216 based on the usage and/or content parameter(s), the screen adjustment determination circuitry 210 generates instructions including the curvature radius R value and/or the orientation value defined by the selected rule 216. The screen adjustment determination circuitry 210 outputs the instructions to the arm actuator control circuitry 132 and/or the pivot actuator control circuitry 140.
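For illustration, such rule(s) could be encoded as a lookup table keyed on a scenario; the keys, buckets, and radius values in the sketch below are placeholders, not rules defined by this description:

RULES = {
    # (user_count_bucket, within_threshold, content_category): (state, curvature radius or None)
    ("single", True,  "game"):         ("curved", 1800),
    ("single", True,  "productivity"): ("curved", 3000),
    ("single", False, "video"):        ("flat",   None),
    ("multi",  False, "video"):        ("flat",   None),
}
DEFAULT_RULE = ("flat", None)

def apply_rules(user_count, within_threshold, content_category):
    """Select the rule that applies (or the default) for the detected scenario."""
    bucket = "single" if user_count == 1 else "multi"
    return RULES.get((bucket, within_threshold, content_category), DEFAULT_RULE)

print(apply_rules(1, True, "game"))   # ('curved', 1800)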
In some examples, the screen adjustment determination circuitry 210 executes one or more screen adjustment algorithm(s) and/or model(s) 218 to determine the screen curvature state and/or orientation for particular usage and/or content contexts associated with the display screen 108. For example, the screen adjustment algorithm(s) 218 can define that screen curvature is a linear function of screen size and factors such as user distance, number of users present, application type, ambient lighting, etc. The screen adjustment algorithm(s) 218 can define coefficient(s) or weight(s) (e.g., where the coefficient(s) and/or weight(s) are determined a priori) for adjusting the screen curvature radius based on the usage and/or content factors. For example, the screen adjustment algorithm(s) 218 can define that the screen curvature radius value should increase (so that the screen is less curved) when the user is farther from the display screen 108 relative to the screen distance threshold 211, when there is more than one user present, etc. The screen curvature coefficient(s) can be customized based on features of the display screen 108 such as size, minimum and maximum curvature radius values for the display screen 108, etc. The screen adjustment determination circuitry 210 can generate instructions defining the screen curvature state (flat or curved), the curvature radius R for the curved state, and/or the orientation based on the output(s) of the screen adjustment algorithm(s) 218 for given usage and/or content scenarios.
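A minimal sketch of such a weighted linear function is shown below; the coefficients and the 1500R-4000R clamp are illustrative assumptions that would, in practice, be determined a priori (e.g., from user testing) for a particular screen:

MIN_RADIUS_R, MAX_RADIUS_R = 1500, 4000   # assumed supported curvature radius range

COEFFS = {
    "base_R": 1800.0,          # starting radius for a single, close, centered viewer
    "per_mm_beyond_1m": 0.8,   # radius grows (less curvature) as the viewer moves back
    "per_extra_user": 1200.0,  # radius grows for each additional viewer
    "glare_penalty": 600.0,    # radius grows when reflections are likely
}

def curvature_radius(distance_mm, num_users, glare):
    """Compute a curvature radius as a weighted linear function of usage factors."""
    r = (COEFFS["base_R"]
         + COEFFS["per_mm_beyond_1m"] * max(0.0, distance_mm - 1000.0)
         + COEFFS["per_extra_user"] * max(0, num_users - 1)
         + (COEFFS["glare_penalty"] if glare else 0.0))
    return min(MAX_RADIUS_R, max(MIN_RADIUS_R, r))

print(curvature_radius(distance_mm=1800.0, num_users=1, glare=False))  # 2440.0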
In some examples, the screen adjustment determination circuitry 210 executes one or more machine learning-based screen adjustment model(s) 218 to determine screen curvature state and/or orientation in view of the usage and/or content factor(s). For example, regression model(s) may be used to determine the screen curvature radius and/or orientation based on the parameter(s) identified by the user detection circuitry 200, the screen reflection estimation circuitry 204, the content type analysis circuitry 206, and/or the user interaction analysis circuitry 208. In some examples, the regression model(s) can include linear functions as discussed in connection with the screen adjustment algorithm(s) 218, where coefficient(s) for adjusting the screen curvature radius can be determined from training and/or calibration and updated over time based on machine learning feedback. For instance, if a user requests (e.g., via a user input) that the curvature radius of the display screen 108 be changed after the curvature radius determined by the screen adjustment determination circuitry 210 is implemented, then the user-preferred screen curvature can be used as feedback to train the machine learning model(s) 218. In some examples, the coefficients can be used in connection with a federated model that is updated by, for example, the display screen manufacturer and/or operating system.
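The feedback loop described above could look something like the sketch below, which refits a simple regression whenever the user overrides the chosen radius. The training rows, the features, and the use of scikit-learn are assumptions for illustration only:

import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: [distance_mm, num_users, glare (0/1)]; target: preferred curvature radius (R).
X = np.array([[1500, 1, 0], [2500, 1, 0], [3500, 2, 1]], dtype=float)
y = np.array([1800, 2600, 4000], dtype=float)
model = LinearRegression().fit(X, y)

def record_user_override(features, preferred_radius):
    """Refit the model after the user manually changes the curvature radius."""
    global X, y, model
    X = np.vstack([X, features])
    y = np.append(y, preferred_radius)
    model = LinearRegression().fit(X, y)

predicted = model.predict([[2000, 1, 0]])[0]   # predicted radius for a new scenario
record_user_override([2000, 1, 0], 2200)       # the user preferred 2200R; the model is refit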
In other examples, other types of neural network models, such as convolutional neural network (CNN) models, may be used by the screen adjustment determination circuitry 210 to determine screen curvature state and/or orientation. For example, a CNN model can be trained based on a matrix that includes screen curvature radius values and/or orientation values corresponding to various combinations of number of users, user distance, application type, light reflectance likelihood, etc. The screen adjustment determination circuitry 210 generates instructions including the screen curvature radius values and/or orientation values to drive the respective actuator(s) 128, 138 to control the screen adjustment arm(s) 130 and/or the screen orientation pivot 136.
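Purely as a sketch of the CNN-based alternative, the following PyTorch model maps a small 2-D grid of encoded usage/content features to a curvature radius and an orientation value; the grid shape, layer sizes, and two-output head are illustrative assumptions:

import torch
import torch.nn as nn

class ScreenAdjustmentCNN(nn.Module):
    """Tiny CNN mapping an encoded scenario grid to [curvature radius, orientation logit]."""
    def __init__(self, grid=(4, 4)):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
        )
        self.head = nn.Linear(16 * grid[0] * grid[1], 2)

    def forward(self, x):              # x shape: (batch, 1, 4, 4)
        return self.head(self.conv(x).flatten(1))

features = torch.zeros(1, 1, 4, 4)     # e.g., encodings of user count, distance, app type, glare
radius, orientation_logit = ScreenAdjustmentCNN()(features)[0]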
In some examples, based on the screen adjustment rule(s) 216 and/or the output(s) of the screen adjustment algorithm(s) and/or model(s) 218, the screen adjustment determination circuitry 210 may determine that the display screen 108 should be in the flat state in view of one or more usage and/or content factor(s) such as the user distance from the display screen 108, application type, ambient lighting, a number of users present, etc. In such examples, the screen adjustment determination circuitry 210 determines that causing the display screen 108 to, for example, move from the flat state to the curved state may not result in improved (e.g., optimal) visibility for the user(s) under the current parameters. Rather, placing the display screen 108 in the curved state could negatively impact the viewing experience by, for example, creating a risk of distortion of the content being presented (e.g., due to glare from ambient lighting). In such examples, the screen adjustment determination circuitry 210 outputs instructions to cause the actuator(s) 128 to move the display screen 108 to the flat state (or refrains from outputting instructions so that the display screen 108 remains in the flat state if already in the flat state).
As disclosed herein, in some examples, the display screen 108 can be rotated about a vertical axis and/or horizontal axis via the screen orientation pivot 136. In such examples, the screen adjustment determination circuitry 210 can evaluate whether the ability to rotate the display screen 108 affects the decision as to whether to place the display screen 108 in a curved state or a flat state. For example, due to the effects of light reflection, the screen adjustment determination circuitry 210 may determine that the display screen 108 should move from a curved state to the flat state. However, because the display screen 108 can rotate about the screen orientation pivot 136, the screen adjustment determination circuitry 210 can revise the decision and instruct the display screen 108 to remain in the curved state but be rotated about the screen orientation pivot 136 to reduce the light reflections. Thus, in some examples, the ability of the display screen 108 to rotate can enable, for example, the display screen 108 to be in a curved state while avoiding the effects of light reflection rather than moving to the flat state.
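A small sketch of that revision step is shown below; the 5-degree corrective tilt and the command format are assumptions used only to make the logic concrete:

def revise_for_reflection(state, radius, glare, has_pivot):
    """Return (state, radius, tilt_deg) after accounting for light reflection."""
    if not glare:
        return state, radius, 0.0
    if state == "curved" and has_pivot:
        return "curved", radius, 5.0   # keep the curvature; apply an assumed corrective tilt
    return "flat", None, 0.0           # no pivot available: fall back to the flat state

print(revise_for_reflection("curved", 1800, glare=True, has_pivot=True))
# ('curved', 1800, 5.0)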
As disclosed herein, in some examples, after the user detection circuitry 200 detects the presence of a user, the user identification circuitry 202 determines that the user is associated with a user profile 212 that defines user preference(s) for screen curvature state and/or screen orientation. In some such examples, rather than determining the screen curvature and/or orientation based on the usage context (e.g., user position, distance) and/or content context (e.g., application type, user interaction), the screen adjustment determination circuitry 210 generates instructions to cause the arm actuator(s) 128 (via the arm actuator control circuitry 132) to adjust the curvature radius and/or the pivot actuator(s) 138 (via the pivot actuator control circuitry 140) to adjust the orientation of the display screen 108 based on the user setting(s) in the user profile 212. Thus, in some examples, the setting(s) in the user profile(s) 212 override the screen adjustment rule(s) 216 and/or the output(s) that would be obtained via execution of the screen adjustment algorithm(s)/model(s) 218. In other examples, the user profile(s) 212 can be used as another factor in determining, for example, screen curvature coefficient(s) and/or weight(s) defined by the screen adjustment algorithm(s) 218, can be used to train the neural network model(s) 218, etc.
In the example of
Similarly, in
Table 1, below, sets forth example scenarios and combinations of numbers of users, application/content type and user interaction levels, likelihood of reflection from ambient light sources, whether or not a profile 212 exists for the user, and the recommended screen curvature state (e.g., flat state or curved state and, if the curved state, then a recommended curvature radius output) for the display screen 108 for a given scenario. In Table 1, below, curvature is defined as the radius of the arc formed by the curved screen. Also, the variable R represents a screen distance threshold value 211. In some examples, R corresponds to the curvature radius associated with the minimum curvature of the display screen 108 (e.g., 4000R). In some examples, R corresponds to a maximum distance threshold of a user from the display screen 108 that provides a field of view of the screen that has, for example, minimal distortion of content at the edges of the display screen as determined by, for instance, user testing. In Table 1, below, the identifier “All” in connection with application/content type and/or interactivity generally refers to applications such as audio player(s) and/or video player(s) for multimedia content; productivity/interactive applications such as a word processing application, a spreadsheet application, or a web browser; interactive content such as text and web content; and content that is consumed, such as videos and movies. The values K, J, Xp1, Xpn, Xv, and Xg in Table 1 represent different curvature radius values.
The examples in Table 1 can define the screen adjustment rule(s) 216 and/or be used to implement the screen adjustment algorithm(s) and/or model(s) 218 (e.g., for training purposes, to determine weights or coefficients for adjusting curvature, etc.). In some examples, the screen adjustment determination circuitry 210 further adjusts or redefines the curvature output(s) defined in Table 1 based on factors such as display size, amount of reflection, position(s) of user(s) with respect to the center of the screen (e.g., closer to one of the edges of the display screen 108, substantially aligned with the center of the display screen 108, etc.), user preferences defined in the user profile(s) 212, etc. Also, although screen orientation is not defined in the examples of Table 1, in some examples, the screen adjustment determination circuitry 210 can determine an orientation for the display screen 108 for the respective scenarios or combinations of usage and/or content parameters.
As shown in Table 1, the screen adjustment determination circuitry 210 determines whether the display screen 108 should be in the flat state or a curved state and, if the curved state, the amount of curvature (i.e., curvature radius) based on parameters or inputs identified by one or more of the user detection circuitry 200, the user identification circuitry 202, the screen reflection estimation circuitry 204, the content type analysis circuitry 206, or the user interaction analysis circuitry 208. As shown in Table 1, when a user is within the distance threshold R of the display screen 108, the user may benefit in certain scenarios from the display screen 108 being in a curved state (e.g., Examples 1, 2, 5-10). Further, within those examples, the screen adjustment determination circuitry 210 may determine that different curvature radius values will provide an enhanced viewing experience based on, for example, whether or not there is light reflection from an ambient light source, the type of content being viewed (e.g., video versus game, use of multiple applications versus a single application, etc.). In some examples, the curvature radius value is based on the distance (e.g., estimated distance) of the user from the display screen 108 (e.g., Example 1 in Table 1). In some examples, the screen adjustment determination circuitry 210 may select the curvature radius value based on user distance and other factors, such as application type (e.g., Examples 7-10 in Table 1). In some examples (e.g., Examples 1-3), the screen adjustment determination circuitry 210 applies the curvature radius value regardless of application or content type. In examples in which more than one viewer is viewing the display screen from a distance that exceeds the distance threshold R, the screen adjustment determination circuitry 210 may determine that the display screen 108 should be in the flat position because the flat position will provide a better viewing experience as compared to a curved screen (e.g., because bending the screen may appear to distort the content for the users outside the distance threshold R).
Although examples disclosed herein are primarily discussed in connection with, for instance, outputs of sensors from the compute device 102 and/or the display 104 in connection with determining screen curvature radius and/or screen orientation, in some examples, other sources can be used by the screen adjustment determination circuitry 210 to control the flexible display screen 108 in connection with, for example, an edge network. For example, the display 104 can be located in a conference room in which multiple users are viewing the display screen 108 of the display 104. In some such examples, the flexible screen control circuitry 144 can be implemented by, for instance, a server or edge device (instead of, for example, by the processor circuitry 111 of the compute device 102). In some such examples, the flexible screen control circuitry 144 can access outputs of, for example, ambient light sensors located in the conference room, and the screen reflection estimation circuitry 204 can determine effects of lighting on the display screen 108 based on the sensor(s) in the room. In some such examples, the flexible screen control circuitry 144 can access data associated with other edge device(s), such as compute devices (e.g., tablets, laptops, smartphones) carried by users in the room, and the content type analysis circuitry 206 can determine the type of content being presented on the display screen 108 based on the data from the other edge device(s). Thus, in some examples, the screen adjustment determination circuitry 210 can use data from multiple sources, including the compute device 102 and other devices, with respect to deciding the screen curvature state (i.e., flat or curved) and/or screen orientation.
In some examples, the flexible screen control circuitry 144 includes means for detecting a user. For example, the means for detecting may be implemented by the user detection circuitry 200. In some examples, the user detection circuitry 200 may be instantiated by programmable circuitry such as the example programmable circuitry 1112 of
In some examples, the flexible screen control circuitry 144 includes means for identifying a user. For example, the means for identifying may be implemented by the user identification circuitry 202. In some examples, the user identification circuitry 202 may be instantiated by programmable circuitry such as the example programmable circuitry 1112 of
In some examples, the flexible screen control circuitry 144 includes means for estimating screen reflection. For example, the means for estimating may be implemented by the screen reflection estimation circuitry 204. In some examples, the screen reflection estimation circuitry 204 may be instantiated by programmable circuitry such as the example programmable circuitry 1112 of
In some examples, the flexible screen control circuitry 144 includes means for analyzing content type. For example, the means for content type analyzing may be implemented by the content type analysis circuitry 206. In some examples, the content type analysis circuitry 206 may be instantiated by programmable circuitry such as the example programmable circuitry 1112 of
In some examples, the flexible screen control circuitry 144 includes means for analyzing user interaction. For example, the means for user interaction analyzing may be implemented by the user interaction analysis circuitry 208. In some examples, the user interaction analysis circuitry 208 may be instantiated by programmable circuitry such as the example programmable circuitry 1112 of
In some examples, the flexible screen control circuitry 144 includes means for determining screen adjustment. For example, the means for determining may be implemented by the screen adjustment determination circuitry 210. In some examples, the screen adjustment determination circuitry 210 may be instantiated by programmable circuitry such as the example programmable circuitry 1112 of
While an example manner of implementing the flexible screen control circuitry 144 of
A flowchart representative of example machine-readable instructions, which may be executed by programmable circuitry to implement and/or instantiate the flexible screen control circuitry 144 of
The program may be embodied in instructions (e.g., software and/or firmware) stored on one or more non-transitory computer-readable and/or machine-readable storage medium such as cache memory, a magnetic-storage device or disk (e.g., a floppy disk, a Hard Disk Drive (HDD), etc.), an optical-storage device or disk (e.g., a Blu-ray disk, a Compact Disk (CD), a Digital Versatile Disk (DVD), etc.), a Redundant Array of Independent Disks (RAID), a register, ROM, a solid-state drive (SSD), SSD memory, non-volatile memory (e.g., electrically erasable programmable read-only memory (EEPROM), flash memory, etc.), volatile memory (e.g., Random Access Memory (RAM) of any type, etc.), and/or any other storage device or storage disk. The instructions of the non-transitory computer-readable and/or machine-readable medium may program and/or be executed by programmable circuitry located in one or more hardware devices, but the entire program and/or parts thereof could alternatively be executed and/or instantiated by one or more hardware devices other than the programmable circuitry and/or embodied in dedicated hardware. The machine-readable instructions may be distributed across multiple hardware devices and/or executed by two or more hardware devices (e.g., a server and a client hardware device). For example, the client hardware device may be implemented by an endpoint client hardware device (e.g., a hardware device associated with a human and/or machine user) or an intermediate client hardware device gateway (e.g., a radio access network (RAN)) that may facilitate communication between a server and an endpoint client hardware device. Similarly, the non-transitory computer-readable storage medium may include one or more mediums. Further, although the example program is described with reference to the flowchart illustrated in
The machine-readable instructions described herein may be stored in one or more of a compressed format, an encrypted format, a fragmented format, a compiled format, an executable format, a packaged format, etc. Machine-readable instructions as described herein may be stored as data (e.g., computer-readable data, machine-readable data, one or more bits (e.g., one or more computer-readable bits, one or more machine-readable bits, etc.), a bitstream (e.g., a computer-readable bitstream, a machine-readable bitstream, etc.), etc.) or a data structure (e.g., as portion(s) of instructions, code, representations of code, etc.) that may be utilized to create, manufacture, and/or produce machine executable instructions. For example, the machine-readable instructions may be fragmented and stored on one or more storage devices, disks and/or computing devices (e.g., servers) located at the same or different locations of a network or collection of networks (e.g., in the cloud, in edge devices, etc.). The machine-readable instructions may require one or more of installation, modification, adaptation, updating, combining, supplementing, configuring, decryption, decompression, unpacking, distribution, reassignment, compilation, etc., in order to make them directly readable, interpretable, and/or executable by a computing device and/or other machine. For example, the machine-readable instructions may be stored in multiple parts, which are individually compressed, encrypted, and/or stored on separate computing devices, wherein the parts when decrypted, decompressed, and/or combined form a set of computer-executable and/or machine executable instructions that implement one or more functions and/or operations that may together form a program such as that described herein.
In another example, the machine-readable instructions may be stored in a state in which they may be read by programmable circuitry, but require addition of a library (e.g., a dynamic link library (DLL)), a software development kit (SDK), an application programming interface (API), etc., in order to execute the machine-readable instructions on a particular computing device or other device. In another example, the machine-readable instructions may need to be configured (e.g., settings stored, data input, network addresses recorded, etc.) before the machine-readable instructions and/or the corresponding program(s) can be executed in whole or in part. Thus, machine-readable, computer-readable and/or machine-readable media, as used herein, may include instructions and/or program(s) regardless of the particular format or state of the machine-readable instructions and/or program(s).
The machine-readable instructions described herein can be represented by any past, present, or future instruction language, scripting language, programming language, etc. For example, the machine-readable instructions may be represented using any of the following languages: C, C++, Java, C#, Perl, Python, JavaScript, HyperText Markup Language (HTML), Structured Query Language (SQL), Swift, etc.
As mentioned above, the example operations of
In some examples, the user identification circuitry 202 can attempt to identify the user(s) based on, for example, facial recognition analysis using the image data generated by the image sensor(s) 114 (blocks 1008, 1010). In such examples, the user identification circuitry 202 can determine whether the user is associated with one of the user profile(s) 212 saved in the database 214, where the user profile(s) 212 can define user preferences with respect to screen curvature and/or orientation. If the user identification circuitry 202 determines that the user is associated with a user profile 212, the screen adjustment determination circuitry 210 generates and outputs screen adjustment instruction(s) based on the setting(s) defined in the user profile 212 for the recognized user (block 1022). In some examples, at block 1022, the screen adjustment determination circuitry 210 uses the user setting(s) in the user profile 212 to refine or adjust the instructions generated based on other factors such as a type of content being presented on the display screen 108.
In examples in which user identification is not performed or the user is not identified as having a user profile (blocks 1006, 1010), then at block 1012, the screen adjustment determination circuitry 210 determines if the screen curvature state (e.g., flat state or curved state) of the display screen 108 should be determined based on other factor(s) or parameter(s) such as ambient lighting, application usage, user activity, etc. In some examples, the screen adjustment determination circuitry 210 determines whether the display screen 108 should be in the flat state or the curved state and the degree of curvature in the curved state based on the user positioning information (e.g., distance from the display screen 108 and position relative to the display screen 108) alone, without consideration of other factors such as ambient lighting or application event data (e.g., to conserve processing resources) (block 1014). In this example, the screen adjustment determination circuitry 210 can determine whether the display screen 108 should be in the flat state or the curved state and, if in the curved state, the curvature radius by implementing the screen adjustment rule(s) 216 and/or the screen adjustment algorithm(s) and/or model(s) 218 for the number of users present, the distance(s) of the user(s) from the display screen 108 relative to the screen distance threshold value 211, and/or the position(s) of the user(s) relative to the display screen 108.
In some examples, the screen adjustment determination circuitry 210 determines whether the display screen 108 should be in the flat state or the curved state and the degree of curvature in the curved state based on outputs from one or more of the screen reflection estimation circuitry 204, the content type analysis circuitry 206, and the user interaction analysis circuitry 208 (block 1016). For example, the screen adjustment determination circuitry 210 can determine whether the display screen 108 should be in the flat state or the curved state and, if in the curved state, the degree of curvature by implementing the screen adjustment rule(s) 216 and/or the screen adjustment algorithm(s) and/or model(s) 218 based on factors such as number of users, user distance from the display screen 108, effects of light reflection, and type of content being presented on the display screen 108.
In examples in which the orientation of the display screen 108 is adjustable (e.g., via the screen orientation pivot 136), the screen adjustment determination circuitry 210 determines if the orientation of the display screen 108 should be adjusted (e.g., rotated about a horizontal axis and/or vertical axis extending through the screen orientation pivot 136, turned to a vertical or a horizontal orientation) (blocks 1018, 1020). In some examples, the screen adjustment determination circuitry 210 determines that the orientation of the display screen 108 should be adjusted to mitigate the effects of light reflections when the display screen 108 is in a curved state instead of causing the display screen 108 to move to the flat state to reduce light reflections, as disclosed in connection with
At block 1022, the screen adjustment determination circuitry 210 generates and outputs instruction(s) regarding a screen curvature state of the display screen 108 (e.g., flat, curved), a degree of curvature of the display screen 108 in the curved state, and, in some examples, an orientation of the display screen 108. The screen adjustment determination circuitry 210 transmits the instructions to the arm actuator control circuitry 132 of the display 104 to cause the arm actuator(s) 128 to move the display screen 108 to the determined state and, if applicable, curvature radius, and/or to the pivot actuator control circuitry 140 to control the screen orientation based on the screen adjustment instruction(s).
The screen adjustment determination circuitry 210 monitors for changes in one or more screen adjustment parameter(s) (e.g., user presence, user distance, ambient light sources, content type) based on the outputs of one or more of the user detection circuitry 200, the user identification circuitry 202, the screen reflection estimation circuitry 204, the content type analysis circuitry 206, and/or the user interaction analysis circuitry 208 (block 1024). In examples in which change(s) are detected, the example instructions 1000 return to evaluating the screen curvature state and/or screen orientation decisions. The example instructions 1000 end at block 1026 with continued monitoring of the usage and/or content parameter(s) associated with the display screen 108.
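Tying the flow together, the sketch below mirrors the sense-decide-actuate-monitor loop described above. The sensor and actuator interfaces and the decide() helper are placeholders standing in for the circuitry described in this section, not an actual implementation:

import time

def decide(users, context):
    """Placeholder for the rule-, algorithm-, and/or model-based decision described above."""
    if len(users) == 1 and users[0]["distance_mm"] <= 4000:
        return {"state": "curved", "radius_R": 1800, "layout": "horizontal"}
    return {"state": "flat", "radius_R": None, "layout": "horizontal"}

def control_loop(sensors, actuators, poll_s=1.0):
    """Continuously monitor usage/content parameters and actuate only on changes."""
    last_command = None
    while True:
        users = sensors.detect_users()                   # e.g., [{'distance_mm': 2300, 'offset': 0.1}]
        profile_command = sensors.lookup_profile(users)  # preferred command from a user profile, or None
        command = profile_command or decide(users, sensors.content_context())
        if command != last_command:                      # a change in parameters warrants an adjustment
            actuators.apply(command)                     # drive the arm and/or pivot actuator(s)
            last_command = command
        time.sleep(poll_s)                               # then continue monitoring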
The programmable circuitry platform 1100 of the illustrated example includes programmable circuitry 1112. The programmable circuitry 1112 of the illustrated example is hardware. For example, the programmable circuitry 1112 can be implemented by one or more integrated circuits, logic circuits, FPGAs, microprocessors, CPUs, GPUs, DSPs, and/or microcontrollers from any desired family or manufacturer. The programmable circuitry 1112 may be implemented by one or more semiconductor based (e.g., silicon based) devices. In this example, the programmable circuitry 1112 implements the user detection circuitry 200, the user identification circuitry 202, the screen reflection estimation circuitry 204, the content type analysis circuitry 206, the user interaction analysis circuitry 208, and the screen adjustment determination circuitry 210.
The programmable circuitry 1112 of the illustrated example includes a local memory 1113 (e.g., a cache, registers, etc.). The programmable circuitry 1112 of the illustrated example is in communication with main memory 1114, 1116, which includes a volatile memory 1114 and a non-volatile memory 1116, by a bus 1118. The volatile memory 1114 may be implemented by Synchronous Dynamic Random Access Memory (SDRAM), Dynamic Random Access Memory (DRAM), RAMBUS® Dynamic Random Access Memory (RDRAM®), and/or any other type of RAM device. The non-volatile memory 1116 may be implemented by flash memory and/or any other desired type of memory device. Access to the main memory 1114, 1116 of the illustrated example is controlled by a memory controller 1117. In some examples, the memory controller 1117 may be implemented by one or more integrated circuits, logic circuits, microcontrollers from any desired family or manufacturer, or any other type of circuitry to manage the flow of data going to and from the main memory 1114, 1116.
The programmable circuitry platform 1100 of the illustrated example also includes interface circuitry 1120. The interface circuitry 1120 may be implemented by hardware in accordance with any type of interface standard, such as an Ethernet interface, a universal serial bus (USB) interface, a Bluetooth® interface, a near field communication (NFC) interface, a Peripheral Component Interconnect (PCI) interface, and/or a Peripheral Component Interconnect Express (PCIe) interface.
In the illustrated example, one or more input devices 1122 are connected to the interface circuitry 1120. The input device(s) 1122 permit(s) a user (e.g., a human user, a machine user, etc.) to enter data and/or commands into the programmable circuitry 1112. The input device(s) 1122 can be implemented by, for example, an audio sensor, a microphone, a camera (still or video), a keyboard, a button, a mouse, a touchscreen, a trackpad, a trackball, an isopoint device, and/or a voice recognition system.
One or more output devices 1124 are also connected to the interface circuitry 1120 of the illustrated example. The output device(s) 1124 can be implemented, for example, by display devices (e.g., a light emitting diode (LED), an organic light emitting diode (OLED), a liquid crystal display (LCD), a cathode ray tube (CRT) display, an in-place switching (IPS) display, a touchscreen, etc.), a tactile output device, a printer, and/or speaker. The interface circuitry 1120 of the illustrated example, thus, typically includes a graphics driver card, a graphics driver chip, and/or graphics processor circuitry such as a GPU.
The interface circuitry 1120 of the illustrated example also includes a communication device such as a transmitter, a receiver, a transceiver, a modem, a residential gateway, a wireless access point, and/or a network interface to facilitate exchange of data with external machines (e.g., computing devices of any kind) by a network 1126. The communication can be by, for example, an Ethernet connection, a digital subscriber line (DSL) connection, a telephone line connection, a coaxial cable system, a satellite system, a beyond-line-of-sight wireless system, a line-of-sight wireless system, a cellular telephone system, an optical connection, etc.
The programmable circuitry platform 1100 of the illustrated example also includes one or more mass storage discs or devices 1128 to store firmware, software, and/or data. Examples of such mass storage discs or devices 1128 include magnetic storage devices (e.g., floppy disk drives, HDDs, etc.), optical storage devices (e.g., Blu-ray disks, CDs, DVDs, etc.), RAID systems, and/or solid-state storage discs or devices such as flash memory devices and/or SSDs.
The machine-readable instructions 1132, which may be implemented by the machine-readable instructions of
The cores 1202 may communicate by a first example bus 1204. In some examples, the first bus 1204 may be implemented by a communication bus to effectuate communication associated with one(s) of the cores 1202. For example, the first bus 1204 may be implemented by at least one of an Inter-Integrated Circuit (I2C) bus, a Serial Peripheral Interface (SPI) bus, a PCI bus, or a PCIe bus. Additionally or alternatively, the first bus 1204 may be implemented by any other type of computing or electrical bus. The cores 1202 may obtain data, instructions, and/or signals from one or more external devices by example interface circuitry 1206. The cores 1202 may output data, instructions, and/or signals to the one or more external devices by the interface circuitry 1206. Although the cores 1202 of this example include example local memory 1220 (e.g., Level 1 (L1) cache that may be split into an L1 data cache and an L1 instruction cache), the microprocessor 1200 also includes example shared memory 1210 that may be shared by the cores (e.g., Level 2 (L2) cache) for high-speed access to data and/or instructions. Data and/or instructions may be transferred (e.g., shared) by writing to and/or reading from the shared memory 1210. The local memory 1220 of each of the cores 1202 and the shared memory 1210 may be part of a hierarchy of storage devices including multiple levels of cache memory and the main memory (e.g., the main memory 1114, 1116 of
Each core 1202 may be referred to as a CPU, DSP, GPU, etc., or any other type of hardware circuitry. Each core 1202 includes control unit circuitry 1214, arithmetic and logic (AL) circuitry (sometimes referred to as an ALU) 1216, a plurality of registers 1218, the local memory 1220, and a second example bus 1222. Other structures may be present. For example, each core 1202 may include vector unit circuitry, single instruction multiple data (SIMD) unit circuitry, load/store unit (LSU) circuitry, branch/jump unit circuitry, floating-point unit (FPU) circuitry, etc. The control unit circuitry 1214 includes semiconductor-based circuits structured to control (e.g., coordinate) data movement within the corresponding core 1202. The AL circuitry 1216 includes semiconductor-based circuits structured to perform one or more mathematic and/or logic operations on the data within the corresponding core 1202. The AL circuitry 1216 of some examples performs integer-based operations. In other examples, the AL circuitry 1216 also performs floating-point operations. In yet other examples, the AL circuitry 1216 may include first AL circuitry that performs integer-based operations and second AL circuitry that performs floating-point operations. In some examples, the AL circuitry 1216 may be referred to as an Arithmetic Logic Unit (ALU).
The registers 1218 are semiconductor-based structures to store data and/or instructions such as results of one or more of the operations performed by the AL circuitry 1216 of the corresponding core 1202. For example, the registers 1218 may include vector register(s), SIMD register(s), general-purpose register(s), flag register(s), segment register(s), machine-specific register(s), instruction pointer register(s), control register(s), debug register(s), memory management register(s), machine check register(s), etc. The registers 1218 may be arranged in a bank as shown in
Each core 1202 and/or, more generally, the microprocessor 1200 may include additional and/or alternate structures to those shown and described above. For example, one or more clock circuits, one or more power supplies, one or more power gates, one or more cache home agents (CHAs), one or more converged/common mesh stops (CMSs), one or more shifters (e.g., barrel shifter(s)) and/or other circuitry may be present. The microprocessor 1200 is a semiconductor device fabricated to include many transistors interconnected to implement the structures described above in one or more integrated circuits (ICs) contained in one or more packages.
The microprocessor 1200 may include and/or cooperate with one or more accelerators (e.g., acceleration circuitry, hardware accelerators, etc.). In some examples, accelerators are implemented by logic circuitry to perform certain tasks more quickly and/or efficiently than can be done by a general-purpose processor. Examples of accelerators include ASICs and FPGAs such as those discussed herein. A GPU, DSP and/or other programmable device can also be an accelerator. Accelerators may be on-board the microprocessor 1200, in the same chip package as the microprocessor 1200 and/or in one or more separate packages from the microprocessor 1200.
More specifically, in contrast to the microprocessor 1200 of
In the example of
In some examples, the binary file is compiled, generated, transformed, and/or otherwise output from a uniform software platform utilized to program FPGAs. For example, the uniform software platform may translate first instructions (e.g., code or a program) that correspond to one or more operations/functions in a high-level language (e.g., C, C++, Python, etc.) into second instructions that correspond to the one or more operations/functions in an HDL. In some such examples, the binary file is compiled, generated, and/or otherwise output from the uniform software platform based on the second instructions. In some examples, the FPGA circuitry 1300 of
The FPGA circuitry 1300 of
The FPGA circuitry 1300 also includes an array of example logic gate circuitry 1308, a plurality of example configurable interconnections 1310, and example storage circuitry 1312. The logic gate circuitry 1308 and the configurable interconnections 1310 are configurable to instantiate one or more operations/functions that may correspond to at least some of the machine-readable instructions of
The configurable interconnections 1310 of the illustrated example are conductive pathways, traces, vias, or the like that may include electrically controllable switches (e.g., transistors) whose state can be changed by programming (e.g., using an HDL instruction language) to activate or deactivate one or more connections between one or more of the logic gate circuitry 1308 to program desired logic circuits.
The storage circuitry 1312 of the illustrated example is structured to store result(s) of the one or more of the operations performed by corresponding logic gates. The storage circuitry 1312 may be implemented by registers or the like. In the illustrated example, the storage circuitry 1312 is distributed amongst the logic gate circuitry 1308 to facilitate access and increase execution speed.
The example FPGA circuitry 1300 of
Although
It should be understood that some or all of the circuitry of
In some examples, some or all of the circuitry of
In some examples, the programmable circuitry 1112 of
A block diagram illustrating an example software distribution platform 1405 to distribute software such as the example machine-readable instructions 1132 of
“Including” and “comprising” (and all forms and tenses thereof) are used herein to be open ended terms. Thus, whenever a claim employs any form of “include” or “comprise” (e.g., comprises, includes, comprising, including, having, etc.) as a preamble or within a claim recitation of any kind, it is to be understood that additional elements, terms, etc., may be present without falling outside the scope of the corresponding claim or recitation. As used herein, when the phrase “at least” is used as the transition term in, for example, a preamble of a claim, it is open-ended in the same manner as the terms “comprising” and “including” are open ended. The term “and/or” when used, for example, in a form such as A, B, and/or C refers to any combination or subset of A, B, C such as (1) A alone, (2) B alone, (3) C alone, (4) A with B, (5) A with C, (6) B with C, or (7) A with B and with C. As used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, or (3) at least one A and at least one B. Similarly, as used herein in the context of describing structures, components, items, objects and/or things, the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, or (3) at least one A and at least one B. As used herein in the context of describing the performance or execution of processes, instructions, actions, activities, etc., the phrase “at least one of A and B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, or (3) at least one A and at least one B. Similarly, as used herein in the context of describing the performance or execution of processes, instructions, actions, activities, etc., the phrase “at least one of A or B” is intended to refer to implementations including any of (1) at least one A, (2) at least one B, or (3) at least one A and at least one B.
As used herein, singular references (e.g., “a,” “an,” “first,” “second,” etc.) do not exclude a plurality. The term “a” or “an” object, as used herein, refers to one or more of that object. The terms “a” (or “an”), “one or more,” and “at least one” are used interchangeably herein. Furthermore, although individually listed, a plurality of means, elements, or actions may be implemented by, e.g., the same entity or object. Additionally, although individual features may be included in different examples or claims, these may possibly be combined, and the inclusion in different examples or claims does not imply that a combination of features is not feasible and/or advantageous.
As used herein, unless otherwise stated, the term “above” describes the relationship of two parts relative to Earth. A first part is above a second part if the second part has at least one part between Earth and the first part. Likewise, as used herein, a first part is “below” a second part when the first part is closer to the Earth than the second part. As noted above, a first part can be above or below a second part with one or more of: other parts therebetween, without other parts therebetween, with the first and second parts touching, or without the first and second parts being in direct contact with one another.
As used in this patent, stating that any part (e.g., a layer, film, area, region, or plate) is in any way on (e.g., positioned on, located on, disposed on, or formed on, etc.) another part, indicates that the referenced part is either in contact with the other part, or that the referenced part is above the other part with one or more intermediate part(s) located therebetween.
As used herein, connection references (e.g., attached, coupled, connected, and joined) may include intermediate members between the elements referenced by the connection reference and/or relative movement between those elements unless otherwise indicated. As such, connection references do not necessarily imply that two elements are directly connected and/or in fixed relation to each other. As used herein, stating that any part is in “contact” with another part is defined to mean that there is no intermediate part between the two parts.
Unless specifically stated otherwise, descriptors such as “first,” “second,” “third,” etc., are used herein without imputing or otherwise indicating any meaning of priority, physical order, arrangement in a list, and/or ordering in any way, but are merely used as labels and/or arbitrary names to distinguish elements for ease of understanding the disclosed examples. In some examples, the descriptor “first” may be used to refer to an element in the detailed description, while the same element may be referred to in a claim with a different descriptor such as “second” or “third.” In such instances, it should be understood that such descriptors are used merely for identifying those elements distinctly within the context of the discussion (e.g., within a claim) in which the elements might, for example, otherwise share a same name.
As used herein, the phrase “in communication,” including variations thereof, encompasses direct communication and/or indirect communication through one or more intermediary components, and does not require direct physical (e.g., wired) communication and/or constant communication, but rather additionally includes selective communication at periodic intervals, scheduled intervals, aperiodic intervals, and/or one-time events.
As used herein, “programmable circuitry” is defined to include (i) one or more special purpose electrical circuits (e.g., an application specific circuit (ASIC)) structured to perform specific operation(s) and including one or more semiconductor-based logic devices (e.g., electrical hardware implemented by one or more transistors), and/or (ii) one or more general purpose semiconductor-based electrical circuits programmable with instructions to perform specific function(s) and/or operation(s) and including one or more semiconductor-based logic devices (e.g., electrical hardware implemented by one or more transistors). Examples of programmable circuitry include programmable microprocessors such as Central Processor Units (CPUs) that may execute first instructions to perform one or more operations and/or functions, Field Programmable Gate Arrays (FPGAs) that may be programmed with second instructions to cause configuration and/or structuring of the FPGAs to instantiate one or more operations and/or functions corresponding to the first instructions, Graphics Processor Units (GPUs) that may execute first instructions to perform one or more operations and/or functions, Digital Signal Processors (DSPs) that may execute first instructions to perform one or more operations and/or functions, XPUs, Network Processing Units (NPUs), one or more microcontrollers that may execute first instructions to perform one or more operations and/or functions, and/or integrated circuits such as Application Specific Integrated Circuits (ASICs). For example, an XPU may be implemented by a heterogeneous computing system including multiple types of programmable circuitry (e.g., one or more FPGAs, one or more CPUs, one or more GPUs, one or more NPUs, one or more DSPs, etc., and/or any combination(s) thereof), and orchestration technology (e.g., application programming interface(s) (API(s)) that may assign computing task(s) to whichever one(s) of the multiple types of programmable circuitry is/are suited and available to perform the computing task(s)).
As used herein, integrated circuit/circuitry is defined as one or more semiconductor packages containing one or more circuit elements such as transistors, capacitors, inductors, resistors, current paths, diodes, etc. For example, an integrated circuit may be implemented as one or more of an ASIC, an FPGA, a chip, a microchip, programmable circuitry, a semiconductor substrate coupling multiple circuit elements, a system on chip (SoC), etc.
From the foregoing, it will be appreciated that example systems, apparatus, articles of manufacture, and methods have been disclosed that provide for intelligent and adaptive control of flexible displays with respect to screen curvature and/or orientation. Examples disclosed herein determine whether a screen should be in a flat state or a curved state based on various factors such as a distance of a user from the display screen, a number of users viewing the display screen, ambient lighting conditions, a type of content being presented on the display screen, etc. Further, examples disclosed herein weigh the factor(s) or combination of factors to determine the degree of curvature or curvature radius when the screen is to be placed in a curved state. Examples disclosed herein provide for dynamic adjustments to screen curvature and/or orientation based on one or more parameters indicative of usage and/or content context in connection with the display screen.
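By way of a non-limiting illustration only, the following sketch shows one simple way the factors summarized above (a distance of the user from the display screen, a number of viewers, ambient lighting conditions, and a type of content) could be weighed to select between the flat state and the curved state and to select a curvature radius. The function and parameter names (e.g., select_curvature_state, distance_threshold_mm), the thresholds, and the content-to-radius mapping are hypothetical placeholders and are not defined by this disclosure; the 1800R, 2300R, and 4000R values are used only because they are referenced above as example curvature radii.

```python
# Illustrative sketch only. Names, thresholds, and the content-to-radius
# mapping are hypothetical placeholders, not values defined by this disclosure.
from dataclasses import dataclass
from typing import Optional, Tuple

FLAT = "flat"
CURVED = "curved"

@dataclass
class UsageContext:
    user_distance_mm: float   # distance of the (nearest) user from the screen
    user_count: int           # number of viewers detected by the sensor(s)
    ambient_lux: float        # ambient light level reported by a light sensor
    content_type: str         # e.g., "game", "movie", "word_processing"

def select_curvature_state(ctx: UsageContext,
                           distance_threshold_mm: float = 1000.0
                           ) -> Tuple[str, Optional[int]]:
    """Return (screen state, curvature radius in mm); radius is None when flat."""
    # Multiple viewers (who cannot all be centered) or a distant viewer favor
    # the flat state, which reduces off-axis distortion.
    if ctx.user_count > 1 or ctx.user_distance_mm > distance_threshold_mm:
        return FLAT, None

    # A single nearby viewer favors the curved state; choose a radius by
    # content type (a smaller radius means more curvature, e.g., for gaming).
    radius = {"game": 1800, "movie": 2300}.get(ctx.content_type, 4000)

    # Bright ambient light increases the chance of reflections on a curved
    # screen, so relax the curvature (use a larger radius) under strong light.
    if ctx.ambient_lux > 500:
        radius = max(radius, 2300)

    return CURVED, radius

# Example usage: a single user sitting ~0.6 m away while gaming in dim light.
state, radius = select_curvature_state(
    UsageContext(user_distance_mm=600, user_count=1,
                 ambient_lux=120, content_type="game"))
# state == "curved", radius == 1800
```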
Example systems and methods for controlling flexible displays are disclosed herein. Further examples and combinations thereof include the following:
Example 2 includes the apparatus of example 1, wherein the outputs of the sensor correspond to image data and wherein one or more of the at least one processor circuit is to determine the distance of the user from the display screen based on the image data.
Example 3 includes the apparatus of examples 1 or 2, wherein the sensor is a first sensor and wherein one or more of the at least one processor circuit is to determine an ambient lighting condition in an environment including the display screen based on the outputs of the first sensor or outputs of a second sensor, the second sensor in communication with one or more of the at least one processor circuit; and determine the curvature radius based on the user distance and the ambient lighting condition.
Example 4 includes the apparatus of any of examples 1-3, wherein the actuator is a first actuator and wherein one or more of the at least one processor circuit is to cause a second actuator to adjust an orientation of the display screen based on the ambient lighting condition.
Example 5 includes the apparatus of any of examples 1-4, wherein the user is a first user and wherein one or more of the at least one processor circuit is to detect a presence of a second user relative to the display screen, the second user with the first user; determine a first position of the first user relative to the display screen; determine a second position of the second user relative to the display screen; and cause the display screen to move from a first state associated with the curvature radius to a second state based on the first position of the first user and the second position of the second user, the second state corresponding to a flat state of the display screen.
Example 6 includes the apparatus of any of examples 1-5, wherein the outputs of the sensor are associated with a first time and wherein one or more of the at least one processor circuit is to identify one or more of an application type or an application usage event associated with execution of an application at the first time, the application installed on the apparatus; and determine the curvature radius based on the user distance and the one or more of the application type or the application usage event.
Example 7 includes the apparatus of any of examples 1-6, wherein the user is a first user, the outputs of the sensor are associated with a first time, and wherein one or more of the at least one processor circuit is to determine a position of a second user relative to the display screen at a second time after the first time; identify the second user as associated with a user profile, the user profile defining a first curvature radius for the display screen; and cause the actuator to adjust the curvature of the display screen based on the first curvature radius in the user profile.
Example 8 includes an electronic device comprising a display screen movable between a flat state and a curved state; a sensor; machine-readable instructions; and at least one processor circuit to at least one of instantiate or execute the machine-readable instructions to determine a distance of a user relative to the display screen based on outputs of the sensor; perform a comparison of the user distance to a screen distance threshold; when the user distance satisfies the screen distance threshold, cause the display screen to be in the curved state; and when the user distance exceeds the screen distance threshold, cause the display screen to be in the flat state.
Example 9 includes the electronic device of example 8, wherein the sensor is an image sensor.
Example 10 includes the electronic device of examples 8 or 9, wherein when the user distance satisfies the screen distance threshold, one or more of the at least one processor circuit is to determine a curvature radius of the display screen in the curved state; and cause the display screen to be in the curved state having the determined curvature radius.
Example 11 includes the electronic device of any of examples 8-10, wherein the outputs of the sensor are associated with a first time and one or more of the at least one processor circuit is to identify a type of content presented on the display screen at the first time; and determine the curvature radius based on the type of content.
Example 12 includes the electronic device of any of examples 8-11, wherein the curvature radius is a first curvature radius and wherein one or more of the at least one processor circuit is to identify a position of the user relative to the display screen; identify a light source in an environment including the electronic device; determine a path of light emitted by the light source and reflected by the display screen relative to the position of the user when the display screen has the first curvature radius; and cause the display screen to adjust the first curvature radius to a second curvature radius based on the path of the light.
Example 13 includes the electronic device of any of examples 8-12, wherein one or more of the at least one processor circuit is to cause an orientation of the display screen to be adjusted when the display screen is in the curved state or the flat state.
Example 14 includes the electronic device of any of examples 8-13, wherein one or more of the at least one processor circuit is to identify the user as associated with a user profile, the user profile defining a first curvature radius for the curved state of the display screen; and cause the display screen to have the first curvature radius when the display screen is in the curved state.
Example 15 includes the electronic device of any of examples 8-14, wherein the user is a first user, the distance satisfies the screen distance threshold, the display screen is in the curved state at a first time, and wherein one or more of the at least one processor circuit is to detect a presence of a second user relative to the display screen with the first user at a second time after the first time; and responsive to the detection of the second user, cause the display screen to move from the curved state to the flat state.
Example 16 includes the electronic device of any of examples 8-15, wherein the screen distance threshold corresponds to a curvature radius associated with a minimum curvature of the display screen.
Example 17 includes a non-transitory machine-readable storage medium comprising instructions to cause at least one processor circuit to at least detect a presence of a user relative to a display screen; determine a distance of the user from the display screen; and cause the display screen to move (a) from a flat state to a curved state or (b) from the curved state to the flat state based on the distance of the user from the display screen.
Example 18 includes the non-transitory machine-readable storage medium of example 17, wherein the machine-readable instructions are to cause one or more of the at least one processor circuit to cause the display screen to move from the flat state to the curved state when the distance of the user from the display screen is within a threshold distance.
Example 19 includes the non-transitory machine-readable storage medium of examples 17 or 18, wherein the machine-readable instructions are to cause one or more of the at least one processor circuit to recognize content presented on the display screen; and cause the display screen to move from the flat state to the curved state based on the threshold distance and the content.
Example 20 includes the non-transitory machine-readable storage medium of any of examples 17-19, wherein the user is a first user, the display screen is in the curved state at a first time, and wherein the machine-readable instructions are to cause one or more of the at least one processor circuit to detect a presence of a second user relative to the display screen with the first user at a second time after the first time; and responsive to the detection of the second user, cause the display screen to move from the curved state to the flat state.
Example 21 includes the non-transitory machine-readable storage medium of any of examples 17-20, wherein the machine-readable instructions are to cause one or more of the at least one processor circuit to determine the distance of the user from the display screen based on image data.
Example 22 includes an electronic device comprising a display screen having a flat state and a curved state; an actuator to cause the display screen to move between the flat state and the curved state; a sensor; means for detecting a presence of a user, the means for detecting to determine a distance of the user relative to the display screen based on outputs of the sensor; and means for determining a screen adjustment, the means for determining to determine a curvature radius for the display screen in the curved state based on the user distance and a type of content presented by the display screen; and cause the actuator to move the display screen to the curved state having the curvature radius.
Example 23 includes the electronic device of example 22, further including means for analyzing content type, the means for analyzing content type to determine the type of content based on application event data for an application installed on the electronic device.
Example 24 includes the electronic device of examples 22 or 23, further including means for analyzing user interaction, the means for analyzing user interaction to detect a user input at the electronic device and verify the type of content based on the user input.
Example 25 includes the electronic device of any of examples 22-24, further including means for estimating screen reflectance, the means for estimating screen reflectance to determine a likelihood of reflectance of light by the display screen based on a location of a light source in an environment including the electronic device; and adjust the curvature radius based on the likelihood of the reflectance of the light to generate an adjusted curvature radius, the means for determining to cause the actuator to move the display screen to the curved state having the adjusted curvature radius.
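Examples 12 and 25 above describe determining a path of light emitted by a light source and reflected by the display screen relative to the position of the user, and adjusting the curvature radius based on that determination. The following sketch illustrates one simplified, two-dimensional way such a reflection check and radius adjustment could be modeled; the arc model, the sampling approach, the function names (e.g., reflection_reaches_user, adjust_radius_for_glare), and all tolerances and step sizes are hypothetical and are not part of this disclosure.

```python
# Illustrative geometry sketch only; a real implementation would depend on the
# sensor data and screen model used.
import math

def _reflect(d, n):
    """Reflect direction vector d about unit normal n (both 2D tuples)."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2.0 * dot * n[0], d[1] - 2.0 * dot * n[1])

def reflection_reaches_user(radius_mm, screen_width_mm, light_pos, user_pos,
                            tolerance_mm=150.0, samples=50):
    """Return True if light from light_pos, specularly reflected off the
    curved screen, passes within tolerance_mm of user_pos.

    Top view: the screen is modeled as a circular arc of the given radius,
    concave toward the user, with its midpoint at the origin and the viewer
    side along +y. All coordinates are in millimeters.
    """
    half_angle = (screen_width_mm / 2.0) / radius_mm  # arc half-angle (radians)
    for i in range(samples):
        theta = -half_angle + 2.0 * half_angle * i / (samples - 1)
        # Point on the screen arc and its user-facing unit normal.
        p = (radius_mm * math.sin(theta), radius_mm * (1.0 - math.cos(theta)))
        n = (-math.sin(theta), math.cos(theta))
        # Unit direction of the incident light at this point.
        dx, dy = p[0] - light_pos[0], p[1] - light_pos[1]
        length = math.hypot(dx, dy)
        if length == 0.0:
            continue
        d = (dx / length, dy / length)
        r = _reflect(d, n)
        # Closest approach of the reflected ray to the user's position.
        ux, uy = user_pos[0] - p[0], user_pos[1] - p[1]
        t = ux * r[0] + uy * r[1]
        if t < 0.0:  # reflected ray travels away from the user
            continue
        closest = (p[0] + t * r[0], p[1] + t * r[1])
        if math.hypot(user_pos[0] - closest[0],
                      user_pos[1] - closest[1]) <= tolerance_mm:
            return True
    return False

def adjust_radius_for_glare(radius_mm, screen_width_mm, light_pos, user_pos,
                            step_mm=500, max_radius_mm=4000):
    """Flatten the screen (increase the curvature radius) in steps until the
    estimated specular reflection no longer reaches the user."""
    r = radius_mm
    while r < max_radius_mm and reflection_reaches_user(
            r, screen_width_mm, light_pos, user_pos):
        r += step_mm
    return min(r, max_radius_mm)

# Example usage (hypothetical geometry): a 600 mm wide screen curved at 1800R,
# a lamp behind and to the left of the user, and the user 0.8 m from the screen.
adjusted = adjust_radius_for_glare(1800, 600, light_pos=(-800, 2300),
                                   user_pos=(0, 800))
```

In this simplified model, increasing the radius flattens the screen, which in many configurations steers the specular reflection away from a centered viewer; the search simply stops at a maximum radius if no glare-free curvature is found.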
The following claims are hereby incorporated into this Detailed Description by this reference. Although certain example systems, apparatus, articles of manufacture, and methods have been disclosed herein, the scope of coverage of this patent is not limited thereto. On the contrary, this patent covers all systems, apparatus, articles of manufacture, and methods fairly falling within the scope of the claims of this patent.