The present disclosure relates generally to camera devices, and specifically relates to power saving mechanisms for camera devices.
In the most typical use case of a camera device, the camera device faces forward with its lens in a horizontal posture, in optical series with an image sensor. Due to gravity, the lens (as part of a lens-shift design) or the image sensor (as part of a sensor-shift design) would sag, and the lens would therefore not be in the correct position relative to the image sensor. Additional power needs to be consumed to bring the lens or the image sensor into the correct position relative to the other, which increases the overall power consumption of the camera device.
Optical image stabilization applied at the camera device requires a tradeoff between power consumption and performance of the camera device. More camera stroke results in better performance for the camera device. However, power usage increases, especially when compensating for gravity sag. Letting the lens of the camera device fully sag can save power but may not leave sufficient camera stroke, which negatively affects performance of the camera device, especially for longer exposures.
Embodiments of the present disclosure relate to a power saving mechanism for a camera device (e.g., wearable camera device) by having an image sensor of the camera device biased to one side relative to a lens assembly of the camera device. The lens assembly is in an optical series with the image sensor. At a first orientation of the camera device (e.g., upward or vertical posture of the camera device), there is an offset between a center axis of the image sensor and an optical axis of the lens assembly. At a second orientation of the camera device (e.g., forward or horizontal posture of the camera device), at least one of the image sensor and the lens assembly sags due to gravity such that the center axis and the optical axis substantially overlap while the camera device is in a neutral state.
Embodiments of the present disclosure further relate to a power saving mechanism for a camera device (e.g., wearable camera device) based on a dynamic sag compensation. The camera device includes an image sensor and a lens assembly in an optical series with the image sensor. The lens assembly and the image sensor are configured to allow a dynamic amount of sag relative to one another.
The camera device presented herein may be part of a wristband system, e.g., a smartwatch or some other electronic wearable device. Additionally or alternatively, the camera device may be part of a handheld electronic device (e.g., smartphone) or some other portable electronic device.
The figures depict various embodiments for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Embodiments of the present disclosure relate to a camera device (e.g., wearable camera device) with an optical image stabilization (OIS) assembly and an autofocus assembly. While the camera device is oriented vertically (i.e., an optical axis is perpendicular to the ground), a lens assembly of the camera device is in an offset position relative to an image sensor of the camera device. While the camera device is tilted sideways (i.e., the optical axis is parallel to the ground), the lens assembly and/or the OIS assembly sag (e.g., due to gravity) such that the lens assembly is correctly positioned relative to the image sensor. In some embodiments, the camera device accounts for sag of the lens assembly relative to the image sensor such that an amount of allowed sag is dynamically controlled based in part on motion or predicted motion of the camera device, an exposure time of the camera device, or some combination thereof.
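For illustration only, the following sketch sizes the sensor (or lens) bias from a simple spring-suspension model, in which the static sag under gravity equals m·g/k. The model and all numeric values are assumptions made for the sake of the example, not parameters of the disclosed camera device.

```python
# Illustrative sketch (not from the disclosure): sizing the sensor/lens bias so
# that gravity sag re-centers the optics in the forward (horizontal) posture.
# A simple spring-suspension model is assumed; all numeric values are hypothetical.

GRAVITY = 9.81  # m/s^2

def gravity_sag_m(moving_mass_kg: float, suspension_stiffness_n_per_m: float) -> float:
    """Static sag of a spring-suspended element under 1 g: x = m * g / k."""
    return moving_mass_kg * GRAVITY / suspension_stiffness_n_per_m

# Hypothetical lens-barrel suspension: 100 mg moving mass, 20 N/m stiffness.
sag_m = gravity_sag_m(moving_mass_kg=100e-6, suspension_stiffness_n_per_m=20.0)

# Bias the image sensor (or lens assembly) by the expected sag so that the
# optical axis and the sensor center axis substantially overlap in the
# forward posture with no holding current.
sensor_bias_m = sag_m
print(f"expected sag / required bias: {sag_m * 1e6:.1f} um")
```

Under these assumed values, the expected sag (and hence the required pre-bias) is roughly 50 micrometers, on the order of a typical OIS stroke.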
The camera device may be incorporated into a small form factor electronic device, such as an electronic wearable device. Examples of electronic wearable devices include a smartwatch or a head-mounted display (HMD). The electronic device can include other components (e.g., haptic devices, speakers, etc.), and the small form factor of the electronic device provides limited space between the other components and the camera device. In some embodiments, the electronic device may have a limited power supply (e.g., due to being dependent on a rechargeable battery).
In some embodiments, the electronic wearable device may operate in an artificial reality environment (e.g., a virtual reality environment). The camera device of the electronic wearable device may be used to enhance an artificial reality application running on an artificial reality system (e.g., running on an HMD device worn by the user). The camera device may be disposed on multiple surfaces of the electronic wearable device such that data from a local area, e.g., surrounding a wrist of the user, may be captured in multiple directions. For example, one or more images describing the local area may be captured, and the images may be sent to and processed by the HMD device prior to being presented to the user.
Embodiments of the present disclosure may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., a virtual reality (VR), an augmented reality (AR), a mixed reality (MR), a hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect for the viewer). Additionally, in some embodiments, artificial reality may also be associated with applications, products, accessories, services, or some combination thereof, that are used to create content in an artificial reality and/or are otherwise used in an artificial reality. The artificial reality system that provides the artificial reality content may be implemented on various platforms, including an electronic wearable device (e.g., headset) connected to a host computer system, a standalone electronic wearable device (e.g., headset, smartwatch, bracelet, etc.), a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
In some examples, the wristband system 100 may include multiple electronic devices (not shown) including, without limitation, a smartphone, a server, a head-mounted display (HMD), a laptop computer, a desktop computer, a gaming system, Internet of things devices, etc. Such electronic devices may communicate with the wristband system 100 (e.g., via a personal area network). The wristband system 100 may have sufficient processing capabilities (e.g., CPU, memory, bandwidth, battery power, etc.) to offload computing tasks from each of the multiple electronic devices to the wristband system 100. Additionally, or alternatively, each of the multiple electronic devices may have sufficient processing capabilities (e.g., CPU, memory, bandwidth, battery power, etc.) to offload computing tasks from the wristband system 100 to the electronic device(s).
The wristband system 100 includes a watch body 104 coupled to a watch band 112 via one or more coupling mechanisms 106, 110. The watch body 104 may include, among other components, one or more coupling mechanisms 106, one or more camera devices 115 (e.g., camera devices 115A and 115B), the display screen 102, a button 108, a connector 118, a speaker 117, and a microphone 121. The watch band 112 may include, among other components, one or more coupling mechanisms 110, a retaining mechanism 113, one or more sensors 114, the haptic device 116, and a connector 120.
The watch body 104 and the watch band 112 may have any size and/or shape that is configured to allow a user to wear the wristband system 100 on a body part (e.g., a wrist). The wristband system 100 may include the retaining mechanism 113 (e.g., a buckle) for securing the watch band 112 to the wrist of the user. The coupling mechanism 106 of the watch body 104 and the coupling mechanism 110 of the watch band 112 may attach the watch body 104 to the watch band 112. For example, the coupling mechanism 106 may couple with the coupling mechanism 110 by sticking to, attaching to, fastening to, affixing to, some other suitable means for coupling to, or some combination thereof.
The wristband system 100 may perform various functions associated with the user. The functions may be executed independently in the watch body 104, independently in the watch band 112, and/or in communication between the watch body 104 and the watch band 112. In some embodiments, a user may select a function by interacting with the button 108 (e.g., by pushing, turning, etc.). In some embodiments, a user may select a function by interacting with the display screen 102. For example, the display screen 102 is a touchscreen and the user may select a particular function by touching the display screen 102. The functions executed by the wristband system 100 may include, without limitation, displaying visual content to the user (e.g., displaying visual content on the display screen 102), presenting audio content to the user (e.g., presenting audio content via the speaker 117), sensing user input (e.g., sensing a touch of the button 108, sensing biometric data with the one or more sensors 114, sensing neuromuscular signals with the one or more sensors 114, etc.), capturing audio content (e.g., capturing audio with the microphone 121), capturing data describing a local area (e.g., with a front-facing camera device 115A and/or a rear-facing camera device 115B), communicating wirelessly (e.g., via cellular, near field, Wi-Fi, personal area network, etc.), communicating via wire (e.g., via a port), determining location (e.g., sensing position data with a sensor 114), determining a change in position (e.g., sensing change(s) in position with an IMU), determining an orientation and/or acceleration (e.g., sensing orientation and/or acceleration data with an IMU), providing haptic feedback (e.g., with the haptic device 116), etc.
The display screen 102 may display visual content to the user. The displayed visual content may be oriented to the eye gaze of the user such that the content is easily viewed by the user. Traditional displays on wristband systems may orient the visual content in a static manner such that when a user moves or rotates the wristband system, the content remains in the same position relative to the wristband system, causing difficulty for the user to view the content. The displayed visual content may instead be oriented (e.g., rotated, flipped, stretched, etc.) such that the displayed content remains in substantially the same orientation relative to the eye gaze of the user (e.g., the direction in which the user is looking). The displayed visual content may also be modified based on the eye gaze of the user. For example, in order to reduce the power consumption of the wristband system 100, the display screen 102 may dim the brightness of the displayed visual content, pause the displaying of visual content, or power down the display screen 102 when it is determined that the user is not looking at the display screen 102. In some examples, one or more sensors 114 of the wristband system 100 may determine an orientation of the display screen 102 relative to an eye gaze direction of the user.
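As an illustrative sketch of the gaze-based power policy described above, the following logic maps the time since the user last looked at the screen to a display power state. The states, time thresholds, and names are hypothetical assumptions, not values from the disclosure.

```python
# Hypothetical gaze-based display power policy; states and thresholds are
# illustrative assumptions only.
from enum import Enum

class DisplayState(Enum):
    FULL = "full_brightness"
    DIM = "dimmed"
    PAUSED = "content_paused"
    OFF = "powered_down"

def display_state(seconds_since_gaze: float) -> DisplayState:
    """Map the time since the user last looked at the screen to a power state."""
    if seconds_since_gaze < 2.0:
        return DisplayState.FULL
    if seconds_since_gaze < 5.0:
        return DisplayState.DIM
    if seconds_since_gaze < 15.0:
        return DisplayState.PAUSED
    return DisplayState.OFF
```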
The position, orientation, and/or motion of eyes of the user may be measured in a variety of ways, including through the use of optical-based eye-tracking techniques, infrared-based eye-tracking techniques, etc. For example, the front-facing camera device 115A and/or rear-facing camera device 115B may capture data (e.g., visible light, infrared light, etc.) of the local area surrounding the wristband system 100 including the eyes of the user. The captured data may be processed by a controller (not shown) internal to the wristband system 100, a controller external to and in communication with the wristband system 100 (e.g., a controller of an HMD), or a combination thereof to determine the eye gaze direction of the user. The display screen 102 may receive the determined eye gaze direction and orient the displayed content based on the eye gaze direction of the user.
In some embodiments, the watch body 104 may be communicatively coupled to an HMD. The front-facing camera device 115A and/or the rear-facing camera device 115B may capture data describing the local area, such as one or more wide-angle images of the local area surrounding the front-facing camera device 115A and/or the rear-facing camera device 115B. The wide-angle images may include hemispherical images (e.g., at least hemispherical, substantially spherical, etc.), 180-degree images, 360-degree images, panoramic images, ultra-wide area images, or a combination thereof. In some examples, the front-facing camera device 115A and/or the rear-facing camera device 115B may be configured to capture images having a field of view between 45 degrees and 360 degrees. The captured data may be communicated to the HMD and displayed to the user on a display screen of the HMD worn by the user. In some examples, the captured data may be displayed to the user in conjunction with an artificial reality application. In some embodiments, images captured by the front-facing camera device 115A and/or the rear-facing camera device 115B may be processed before being displayed on the HMD. For example, certain features and/or objects (e.g., people, faces, devices, backgrounds, etc.) of the captured data may be subtracted, added, and/or enhanced before displaying on the HMD.
Components of the front-facing camera device 115A and the rear-facing camera device 115B may be capable of capturing images describing the local area. A lens of the front-facing camera device 115A and/or a lens of the rear-facing camera device 115B can be automatically positioned at their target positions. A target position in a forward (or horizontal) posture of the front-facing camera device 115A may correspond to a position at which the lens of the front-facing camera device 115A is focused at its preferred focal distance (e.g., a distance on the order of several decimeters). A target position in a forward (or horizontal) posture of the rear-facing camera device 115B may correspond to a position at which the lens of the rear-facing camera device 115B is focused at its hyperfocal distance in the local area (e.g., a distance of approximately 1.7 meters).
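For context, the hyperfocal distance follows the standard thin-lens relation H = f²/(N·c) + f. The formula is conventional optics; the focal length, f-number, and circle of confusion in the sketch below are illustrative assumptions chosen to land near the approximately 1.7 meter figure mentioned above, not disclosed module parameters.

```python
# Standard hyperfocal distance, H = f^2 / (N * c) + f.
# Parameter values below are hypothetical compact-module assumptions.

def hyperfocal_distance_m(focal_length_mm: float, f_number: float,
                          circle_of_confusion_mm: float) -> float:
    """Return the hyperfocal distance in meters."""
    h_mm = focal_length_mm ** 2 / (f_number * circle_of_confusion_mm) + focal_length_mm
    return h_mm / 1000.0

# Assumed parameters: f = 2.6 mm, f/2.0, c = 0.002 mm.
print(f"H = {hyperfocal_distance_m(2.6, 2.0, 0.002):.2f} m")  # ~1.69 m
```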
While the front-facing camera device 115A and/or the rear-facing camera device 115B is oriented vertically (i.e., an optical axis is perpendicular to the ground), its lens assembly may be in an offset position relative to an image sensor of the respective camera device. Because of this offset position, while the front-facing camera device 115A and/or the rear-facing camera device 115B is tilted sideways (i.e., the optical axis is parallel to the ground), its lens assembly and/or OIS assembly sags (e.g., due to gravity) such that the lens assembly is correctly positioned relative to the image sensor. Details about this mechanism are provided below in relation to
In some embodiments, components of the camera device 215 are positioned within the camera device 215 such that when the camera device 215 is at the upward (vertical) posture, there is an offset between an optical axis of a lens and a center axis of a sensor of the camera device 215. And when the camera device 215 is at the forward (horizontal) posture to take a picture capturing data describing a local area, the lens and the sensor sag due to gravity such that the optical axis of the lens and the center axis of the sensor substantially overlap while the camera device 215 is in a neutral state. In this manner, power is saved that would otherwise be consumed to bring the lens and sensor into correct positions relative to one another. Details about this power saving mechanism are provided below in relation to
The camera device 305 may capture data (e.g., one or more images) of a local area surrounding the electronic wearable device 300. The camera device 305 may be an embodiment of the camera devices 115, 215. Details about a structure and operation of the camera device 305 are provided below in relation to
The display device 310 may display visual content to the user on a display screen of the display device 310. Additionally, the display device 310 may present audio content to the user, sense user input, capture audio content, capture data describing a local area (e.g., with the camera device 305), communicate wirelessly, communicate via wire, determine location, determine a change in position, determine an orientation and/or acceleration, provide haptic feedback, and/or provide some other function. The display screen of the display device 310 may be an embodiment of the display screen 102 or the display screen 202.
The controller 315 may control operations of the camera device 305, the display device 310, and/or some other component(s) of the electronic wearable device 300. The controller 315 may control OIS, autofocusing, actuation, some other operation applied at the camera device 305, or some combination thereof. The controller 315 may also process data captured by the camera device 305. Furthermore, the controller 315 may control any aforementioned functions of the display device 310. In some embodiments, the controller 315 is part of the camera device 305.
The PCB 320 is a stationary component of the electronic wearable device 300 and provides mechanical support (e.g., by acting as a base) for the electronic wearable device 300. The PCB 320 may provide electrical connections for the camera device 305, the display device 310 and the controller 315. The PCB 320 may also electrically connect the controller 315 to the camera device 305 and the display device 310.
The camera device 305 is configured to have both a focusing assembly and a stabilization assembly. The focusing assembly is configured to cause a translation of the lens barrel 405 in a direction parallel to an optical axis 402 of the lens assembly 410. The focusing assembly provides an auto focus functionality for the camera device 305. The focusing assembly includes the one or more restoring auto focusing springs 420, the one or more OIS suspension wires 423, and a plurality of magnets included in the magnetic assembly 440. The stabilization assembly is configured to cause a translation of the lens barrel 405 (and, in some embodiments, the magnetic assembly 440) in one or more directions perpendicular to the optical axis 402. The stabilization assembly provides an OIS functionality for the camera device 305 by stabilizing an image projected through the lens barrel 405 to the image sensor 455. The stabilization assembly includes the lens barrel 405, the shield case 415, and the magnetic assembly 440.
The lens barrel 405 is a mechanical structure or housing for carrying one or more lenses of the lens assembly 410. The lens barrel 405 is a hollow structure with an opening on each of its opposite ends. The openings may provide a path for light (e.g., visible light, infrared light, etc.) to transmit between a local area and the image sensor 455. Inside the lens barrel 405, one or more lenses of the lens assembly 410 are positioned between the two openings. The lens barrel 405 may be manufactured from a wide variety of materials ranging from plastics to metals. In some embodiments, one or more exterior surfaces of the lens barrel 405 are coated with a polymer (e.g., a sub-micron thick polymer). The lens barrel 405 may be rotationally symmetric about the optical axis 402 of the one or more lenses of the lens assembly 410.
The lens barrel 405 may be coupled to the magnetic assembly 440 by the one or more restoring auto focusing springs 420. For example, the one or more restoring auto focusing springs 420 are coupled to the lens barrel 405 and the magnetic assembly 440. In some embodiments, the magnetic assembly 440 is coupled to the shield case 415. In another example (not illustrated), the one or more restoring auto focusing springs 420 are coupled directly to the shield case 415 and the lens barrel 405. The one or more restoring auto focusing springs 420 are configured to control a positioning of the lens barrel 405 along the optical axis 402. For example, the one or more restoring auto focusing springs 420 may control the positioning of the lens barrel 405 such that, when current is not supplied to the one or more auto focusing coils 435, the lens barrel 405 is in a neutral position. In some embodiments, the one or more restoring auto focusing springs 420 may be shape-memory alloy (SMA) wires. The neutral position of the lens barrel 405 is a positioning of the lens barrel 405 when the camera device 305 is not undergoing focusing (via the focusing assembly) or stabilization (via the stabilization assembly). The one or more restoring auto focusing springs 420 can ensure the lens barrel 405 does not fall out or come into contact with the image sensor 455. In some embodiments, the one or more restoring auto focusing springs 420 are conductors and may be coupled to the one or more auto focusing coils 435. In these embodiments, the one or more restoring auto focusing springs 420 may be used to provide current to the one or more auto focusing coils 435. The one or more restoring auto focusing springs 420 may be coupled to the one or more OIS suspension wires 423 that provide current to the one or more restoring auto focusing springs 420 so that the one or more restoring auto focusing springs 420 can facilitate auto focusing of the lens assembly 410. The one or more OIS suspension wires 423 may be positioned symmetrically about the optical axis 402.
The shield case 415 may enclose some of the components of the camera device 305 as illustrated in
The carrier 425 is directly coupled to the lens barrel 405. For example, the carrier 425 comprises a first side in direct contact with a surface of the lens barrel 405 and a second side opposite the first side. In some embodiments, the carrier 425 is coupled to the lens barrel 405 by an adhesive. The one or more auto focusing coils 435 may be affixed to the second side of the carrier 425. The carrier 425 has a curvature that conforms to the curvature of the lens barrel 405. In some embodiments, more than one carrier 425 may be directly coupled to the lens barrel 405. In these embodiments, the number of carriers 425 may match the number of auto focusing coils 435, and the carriers 425 may be positioned at unique locations around the lens barrel 405 such that a carrier 425 is positioned between a corresponding auto focusing coil 435 and the lens barrel 405. In some embodiments, the restoring auto focusing springs 420 may be coupled to the carrier 425.
The one or more auto focusing coils 435 are configured to conduct electricity by being supplied with a current. The one or more auto focusing coils 435 may be positioned symmetrically about the optical axis 402. For example, the one or more auto focusing coils 435 may consist of two individual coils positioned symmetrically about the optical axis 402, as illustrated in
The one or more actuators 430 are configured to provide auto focusing to the one or more lenses of the lens assembly 410. The one or more actuators 430 consume an auto focusing actuation power while providing auto focusing to the one or more lenses of the lens assembly 410. To reduce (and in some cases minimize) a level of the auto focusing actuation power consumption (e.g., to achieve a zero level of auto focusing actuation power), relative positions of the lens assembly 410, the carrier 425, and the one or more actuators 430 along the optical axis 402 may be controlled during assembly of the camera device 305.
The magnetic assembly 440 includes a magnet holder for holding a plurality of magnets. The magnet holder may provide a rigid structure to support the plurality of magnets. In some embodiments, the magnet holder may enclose all sides of the magnets. In other embodiments, the magnet holder may enclose all sides of the magnets except for a side facing the one or more auto focusing coils 435. In some embodiments, one or more exterior surfaces of the magnetic assembly 440 are coated with a polymer, similar to the lens barrel 405 described above.
The plurality of magnets of the magnetic assembly 440 generate magnetic fields that can be used for translating the lens barrel 405 along the optical axis 402 (e.g., focusing the camera device 305) and/or perpendicular to the optical axis 402 (e.g., providing OIS for the camera device 305). The magnetic fields used for focusing the camera device 305 can be applied in the forward (horizontal) posture of the camera device 305, e.g., to focus the lens assembly 410 at the hyperfocal distance.
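As an illustrative aside, the sketch below estimates the coil power that would be needed to hold a spring-centered lens against gravity in the horizontal posture, i.e., the power the biased design described above avoids. The voice-coil force relation F = B·L·i is standard; the moving mass, motor constant, and coil resistance are hypothetical values, not parameters of the magnetic assembly 440.

```python
# Illustrative estimate (assumed values throughout) of the coil power needed to
# hold a spring-centered lens against gravity in the horizontal posture.

GRAVITY = 9.81  # m/s^2

def holding_power_w(moving_mass_kg: float, motor_constant_n_per_a: float,
                    coil_resistance_ohm: float) -> float:
    """Steady-state coil power to support the moving mass against gravity."""
    current_a = moving_mass_kg * GRAVITY / motor_constant_n_per_a  # from F = B*L*i = m*g
    return current_a ** 2 * coil_resistance_ohm                    # P = i^2 * R

# Hypothetical module: 100 mg moving mass, 0.2 N/A motor constant, 15 ohm coil.
print(f"holding power: {holding_power_w(100e-6, 0.2, 15.0) * 1e3:.3f} mW")  # ~0.361 mW
```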
The magnets of the plurality of magnets may be the same size or different sizes. In some embodiments, each magnet is curved about the optical axis 402, conforming to the curvature of the one or more auto focusing coils 435 and the lens barrel 405. In some embodiments, each magnet is straight. For example, at least two opposing sides of each magnet are parallel to a plane that is parallel to the optical axis 402. Each magnet of the plurality of magnets may have a rectangular cross section with one axis of the cross section being parallel to the optical axis 402 and another axis of the cross section being perpendicular to the optical axis 402. In some embodiments, each magnet may have another cross-sectional shape, such as a square or any other shape that includes at least one straight-edged side facing the one or more auto focusing coils 435. Each magnet is a permanent magnet that is radially magnetized with respect to the optical axis 402. The magnets may be positioned symmetrically about the optical axis 402.
The image sensor 455 captures data (e.g., one or more images) describing a local area. The image sensor 455 may include one or more individual sensors, e.g., a photodetector, a CMOS sensor, a CCD sensor, some other device for detecting light, or some combination thereof. The individual sensors may be arranged in an array. For a camera device 305 integrated into an electronic device, the local area is an area surrounding the electronic device. The image sensor 455 captures light from the local area. The image sensor 455 may capture visible light and/or infrared light from the local area surrounding the electronic device. The visible and/or infrared light is focused from the local area onto the image sensor 455 via the lens barrel 405. The image sensor 455 may include various filters, such as the IRCF 445. The IRCF 445 is a filter configured to block the infrared light from the local area and pass the visible light to the image sensor 455. The IRCF 445 may be placed within the IRCF holder 450.
At the upward (vertical) posture of the camera device 305 shown in
The PCB 460 is positioned below the image sensor 455 along the optical axis 402. The PCB 460 is a stationary component of the camera device 305 and provides mechanical support (e.g., by acting as a base) for the camera device 305. The PCB 460 may provide electrical connections for one or more components of the camera device 305. In some embodiments, a controller may be located on the PCB 460 and the PCB 460 electrically connects the controller to various components (e.g., the one or more auto focusing coils 435) of the camera device 305. In other embodiments (as shown in
While the camera device 305 is at the forward posture (as shown in
Embodiments of the present disclosure further relate to a power saving approach for the camera device 305 based on a dynamic sag compensation. The lens assembly 410 and the image sensor 455 may allow a dynamic amount of sag relative to one another. The dynamic amount of sag may be based on information from, e.g., an OIS assembly of the camera device 305. The dynamic amount of sag may be a function of at least one of an exposure duration of the camera device 305 and a change in position of the camera device 305 in one or more spatial directions. In one embodiment, the dynamic amount of sag decreases when the exposure duration of the camera device 305 is longer than a threshold duration. In another embodiment, the dynamic amount of sag decreases when the change in position of the camera device 305 is greater than a threshold change along at least one spatial direction. In yet another embodiment, the dynamic amount of sag increases when the exposure duration of the camera device 305 is shorter than a threshold duration. In yet another embodiment, the dynamic amount of sag increases when the change in position of the camera device is smaller than a threshold change along at least one spatial direction.
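A minimal sketch of the dynamic sag rule just described follows. The threshold values, the returned sag fractions, and the function name are illustrative assumptions rather than disclosed parameters; only the decision logic mirrors the embodiments above.

```python
# Minimal sketch of the dynamic sag rule; thresholds and sag fractions are
# hypothetical assumptions, not disclosed values.

def allowed_sag_fraction(exposure_s: float, position_change_m: float,
                         exposure_threshold_s: float = 1 / 30,
                         motion_threshold_m: float = 50e-6) -> float:
    """Return the fraction of full gravity sag to allow (0.0 to 1.0).

    Long exposures or large changes in position imply more OIS stroke may be
    needed, so less sag is allowed; short exposures with little motion allow
    more sag, saving actuation power.
    """
    if exposure_s > exposure_threshold_s or position_change_m > motion_threshold_m:
        return 0.25  # reserve most of the stroke for stabilization
    return 1.0       # let the optics sag fully and save power
```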
An IMU of the wearable device 505 may detect translational and rotational motion 510 of the wearable device 505. The detected motion 510 may be utilized at the IMU data processing 515 to determine a movement 520 of the camera device 305 (e.g., an angular movement of the lens assembly 410 and/or the image sensor 455). The determined information about the movement 520 may then be processed through one or more motion processing filters 525 (e.g., implemented at the controller 315) to determine a target (or predicted) position 530 along the x axis and/or y axis for the lens assembly and/or the image sensor. An actuator control 535 (e.g., applied via the actuator 430 and/or the controller 315) may utilize information about the target position 530 to determine an actuation position 540 associated with final position(s) of the lens assembly and/or the image sensor. The actuator control 535 may have a high bandwidth and a small phase delay (i.e., fast settling) to provide fast stabilization of motion for the wearable device 505. The actuator control 535 may also feature a proper handshake with electronic image stabilization (EIS) of the wearable device 505. The final positions of the lens assembly and/or the image sensor (e.g., within the camera device 305) may result in a stabilized image 545.
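The following schematic sketch traces one iteration of the stabilization pipeline just described (IMU motion, filtering, target position, actuator command). The leaky-integrator filter, the gain values, and the function names are illustrative assumptions, not the disclosed motion processing filters 525 or actuator control 535.

```python
# Schematic OIS update sketch; filter choice, gains, and names are assumptions.
import numpy as np

def stabilization_step(gyro_rad_s: np.ndarray, dt_s: float, angle_rad: np.ndarray,
                       focal_length_m: float, alpha: float = 0.98) -> tuple:
    """One OIS update: integrate the gyro rate, leak the angle so slow drift and
    intentional panning are ignored, and convert the residual angle to a lateral
    lens/sensor shift (small-angle approximation: shift = f * theta)."""
    angle_rad = alpha * (angle_rad + gyro_rad_s[:2] * dt_s)  # leaky integrator (x, y)
    target_shift_m = focal_length_m * angle_rad             # x/y compensation command
    return angle_rad, target_shift_m

# Example: 1 kHz control loop with a hypothetical 2.6 mm focal length.
angle = np.zeros(2)
angle, shift = stabilization_step(np.array([0.05, -0.02, 0.0]), 1e-3, angle, 2.6e-3)
print(f"target shift (um): {shift * 1e6}")
```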
It should be noted that there is a power vs. performance tradeoff for the stabilization assembly that provides the OIS functionality for the wearable device 505 (and the camera device 305). More stroke associated with the camera device 305 results in better performance. However, power usage of the camera device 305 increases, especially when compensating for gravity sag. Letting the lens assembly 410 fully sag saves power but might not leave sufficient stroke in one direction, which would impact performance, especially for a longer exposure (integration) of the camera device 305. The solution presented herein is to allow a dynamic sag at the camera device 305 as a function of integration time and/or movement of the camera device 305. If the OIS is required (e.g., with longer integration times or in a high-motion environment), there is a higher probability of running out of stroke. In such a case, less sag is allowed for the lens assembly 410 and/or the image sensor 455. On the other hand, for shorter exposures and/or less movement of the camera device 305, more sag is allowed for the lens assembly 410 and/or the image sensor 455.
In some embodiments, the image sensor 455 is fixed and the stabilization assembly of the camera device 305 that provides the OIS functionality shown in
The OIS illustrated in
At 705, the lens assembly is positioned within the camera device in an optical series with an image sensor of the camera device. At a first orientation of the camera device (e.g., at the upward posture of the camera device), there is an offset between a center axis of the image sensor and an optical axis of the lens assembly. At a second orientation of the camera device (e.g., at the forward posture of the camera device), at least one of the image sensor and the lens assembly sags due to gravity such that the center axis and the optical axis substantially overlap (e.g., the center axis and the optical axis are within a threshold offset of one another) while the camera device is in a neutral state. The image sensor may be fixed within the camera device, and the lens assembly may sag due to gravity while the camera device is at the second orientation. Alternatively, the lens assembly may be fixed within the camera device, and the image sensor may sag due to gravity while the camera device is at the second orientation.
At 710, a dynamic amount of sag is allowed for the lens assembly and the image sensor relative to each other. The dynamic amount of sag may be based on information from an OIS assembly of the camera device. The dynamic amount of sag may be a function of at least one of an exposure duration of the camera device and a change in position of the camera device in one or more spatial directions. The dynamic amount of sag may decrease when the exposure duration is longer than a threshold duration and/or the change in position is greater than a threshold change. The dynamic amount of sag may increase when the exposure duration is shorter than a threshold duration and/or the change in position is smaller than a threshold change.
The foregoing description of the embodiments has been presented for illustration; it is not intended to be exhaustive or to limit the patent rights to the precise forms disclosed. Persons skilled in the relevant art can appreciate that many modifications and variations are possible in light of the above disclosure.
Some portions of this description describe the embodiments in terms of algorithms and symbolic representations of operations on information. These algorithmic descriptions and representations are commonly used by those skilled in the data processing arts to convey the substance of their work effectively to others skilled in the art. These operations, while described functionally, computationally, or logically, are understood to be implemented by computer programs or equivalent electrical circuits, microcode, or the like. Furthermore, it has also proven convenient at times to refer to these arrangements of operations as modules, without loss of generality. The described operations and their associated modules may be embodied in software, firmware, hardware, or any combinations thereof.
Any of the steps, operations, or processes described herein may be performed or implemented with one or more hardware or software modules, alone or in combination with other devices. In one embodiment, a software module is implemented with a computer program product comprising a computer-readable medium containing computer program code, which can be executed by a computer processor for performing any or all of the steps, operations, or processes described.
Embodiments may also relate to an apparatus for performing the operations herein. This apparatus may be specially constructed for the required purposes, and/or it may comprise a general-purpose computing device selectively activated or reconfigured by a computer program stored in the computer. Such a computer program may be stored in a non-transitory, tangible computer readable storage medium, or any type of media suitable for storing electronic instructions, which may be coupled to a computer system bus. Furthermore, any computing systems referred to in the specification may include a single processor or may be architectures employing multiple processor designs for increased computing capability.
Embodiments may also relate to a product that is produced by a computing process described herein. Such a product may comprise information resulting from a computing process, where the information is stored on a non-transitory, tangible computer readable storage medium and may include any embodiment of a computer program product or other data combination described herein.
Finally, the language used in the specification has been principally selected for readability and instructional purposes, and it may not have been selected to delineate or circumscribe the patent rights. It is therefore intended that the scope of the patent rights be limited not by this detailed description, but rather by any claims that issue on an application based hereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the patent rights, which is set forth in the following claims.
This application claims priority to and the benefit of U.S. Provisional Patent Application Ser. No. 63/308,429, filed Feb. 9, 2022, which is hereby incorporated by reference in its entirety.