This disclosure relates generally to position control and more specifically to position management with optical image stabilization in autofocus camera components.
The advent of small, mobile multipurpose devices such as smartphones and tablet or pad devices has resulted in a need for high-resolution, small form factor cameras for integration in the devices. Some small form factor cameras may incorporate optical image stabilization (OIS) mechanisms that may sense and react to external excitation/disturbance by adjusting location of the optical lens on the X and/or Y axis in an attempt to compensate for unwanted motion of the lens. Some small form factor cameras may incorporate an autofocus (AF) mechanism whereby the object focal distance can be adjusted to focus an object plane in front of the camera at an image plane to be captured by the image sensor. In some such autofocus mechanisms, the optical lens is moved as a single rigid body along the optical axis (referred to as the Z axis) of the camera to refocus the camera.
In addition, high image quality is easier to achieve in small form factor cameras if lens motion along the optical axis is accompanied by minimal parasitic motion in the other degrees of freedom, for example on the X and Y axes orthogonal to the optical (Z) axis of the camera. Thus, some small form factor cameras that include autofocus mechanisms may also incorporate optical image stabilization (OIS) mechanisms that may sense and react to external excitation/disturbance by adjusting location of the optical lens on the X and/or Y axis in an attempt to compensate for unwanted motion of the lens.
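As a rough, illustrative relation (the numbers below are assumed for the example and are not part of this disclosure), the image-plane displacement caused by a small camera rotation can be estimated from the effective focal length of the lens:

$$ d \approx f \tan\theta \approx f\,\theta \quad (\text{for small } \theta). $$

For instance, a handshake rotation of approximately 0.3 degrees (about 5.2 mrad) with an effective focal length of f = 4 mm corresponds to roughly 21 µm of image motion, which an OIS mechanism attempts to cancel by shifting the lens (or, in sensor-shift designs, the image sensor) by a comparable amount along the X and/or Y axis.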
In some embodiments, a camera actuator includes an actuator base, an autofocus voice coil motor, and an optical image stabilization voice coil motor. In some embodiments, the autofocus voice coil motor includes a lens carrier mounting attachment moveably mounted to the actuator base, a plurality of shared magnets mounted to the base, and an autofocus coil fixedly mounted to the lens carrier mounting attachment for producing forces for moving a lens carrier in a direction of an optical axis of one or more lenses of the lens carrier. In some embodiments, the optical image stabilization voice coil motor includes an image sensor carrier moveably mounted to the actuator base, and a plurality of optical image stabilization coils moveably mounted to the image sensor carrier within the magnetic fields of the shared magnets, for producing forces for moving the image sensor carrier in a plurality of directions orthogonal to the optical axis.
Some embodiments may include a flexure module that may be used in an optical image stabilization VCM actuator (e.g., an optical image stabilization actuator) of a camera. The flexure module may include a dynamic platform and a static platform. In various examples, the flexure module may include one or more flexures. The flexures may be configured to mechanically connect the dynamic platform to the static platform. Furthermore, the flexures may be configured to provide stiffness (e.g., in-plane flexure stiffness) to the VCM actuator while allowing the dynamic platform to move along a plane that is orthogonal to an optical axis defined by one or more lenses of the camera. In various embodiments, the flexure module may include one or more flexure stabilizer members. The flexure stabilizer members may be configured to mechanically connect flexures to each other such that the flexure stabilizer members prevent interference between the flexures that are connected by the flexure stabilizer members. In some examples, the flexure module may include electrical traces configured to convey signals from the dynamic platform to the static platform. The electrical traces may be routed from the dynamic platform to the static platform via flexures, flexure stabilizer members, and/or flex circuits.
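As a minimal sketch of the stiffness trade-off described above (the beam model and the symbols E, t, w, L, and n are assumptions for illustration and are not defined in this disclosure), each flexure can be approximated as a fixed-guided blade of length L, in-plane width w, and out-of-plane thickness t made from a material with Young's modulus E, so that a group of n such flexures acting in parallel contributes an in-plane stiffness of approximately

$$ k_{\text{in-plane}} \approx n \cdot \frac{12\,E\,I}{L^{3}}, \qquad I = \frac{t\,w^{3}}{12}, \qquad\text{i.e.,}\qquad k_{\text{in-plane}} \approx \frac{n\,E\,t\,w^{3}}{L^{3}}. $$

The cubic dependence on w and L suggests why the count, width, length, and interconnection (via flexure stabilizer members) of the flexures can be tuned to meet a stiffness target while still permitting the dynamic platform to translate in the plane orthogonal to the optical axis.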
This specification includes references to “one embodiment” or “an embodiment.” The appearances of the phrases “in one embodiment” or “in an embodiment” do not necessarily refer to the same embodiment. Particular features, structures, or characteristics may be combined in any suitable manner consistent with this disclosure.
“Comprising.” This term is open-ended. As used in the appended claims, this term does not foreclose additional structure or steps. Consider a claim that recites: “An apparatus comprising one or more processor units . . . .” Such a claim does not foreclose the apparatus from including additional components (e.g., a network interface unit, graphics circuitry, etc.).
“Configured To.” Various units, circuits, or other components may be described or claimed as “configured to” perform a task or tasks. In such contexts, “configured to” is used to connote structure by indicating that the units/circuits/components include structure (e.g., circuitry) that performs the task or tasks during operation. As such, the unit/circuit/component can be said to be configured to perform the task even when the specified unit/circuit/component is not currently operational (e.g., is not on). The units/circuits/components used with the “configured to” language include hardware—for example, circuits, memory storing program instructions executable to implement the operation, etc. Reciting that a unit/circuit/component is “configured to” perform one or more tasks is expressly intended not to invoke 35 U.S.C. § 112, sixth paragraph, for that unit/circuit/component. Additionally, “configured to” can include generic structure (e.g., generic circuitry) that is manipulated by software and/or firmware (e.g., an FPGA or a general-purpose processor executing software) to operate in a manner that is capable of performing the task(s) at issue. “Configured to” may also include adapting a manufacturing process (e.g., a semiconductor fabrication facility) to fabricate devices (e.g., integrated circuits) that are adapted to implement or perform one or more tasks.
“First,” “Second,” etc. As used herein, these terms are used as labels for nouns that they precede, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.). For example, a buffer circuit may be described herein as performing write operations for “first” and “second” values. The terms “first” and “second” do not necessarily imply that the first value must be written before the second value.
“Based On.” As used herein, this term is used to describe one or more factors that affect a determination. This term does not foreclose additional factors that may affect a determination. That is, a determination may be solely based on those factors or based, at least in part, on those factors. Consider the phrase “determine A based on B.” While in this case, B is a factor that affects the determination of A, such a phrase does not foreclose the determination of A from also being based on C. In other instances, A may be determined based solely on B.
Introduction to Magnetic Sensing for Autofocus Position Detection
Some embodiments include camera equipment outfitted with controls, magnets, and voice coil motors to improve the effectiveness of a miniature actuation mechanism for a compact camera module. More specifically, in some embodiments, compact camera modules include actuators to deliver functions such as autofocus (AF) and optical image stabilization (OIS). One approach to delivering a very compact actuator for OIS is to use a Voice Coil Motor (VCM) arrangement.
In some embodiments, the optical image stabilization actuator is designed such that the imaging sensor is mounted on an OIS frame which translates in X and Y (as opposed to an autofocus actuator that translates in Z, where Z is the optical axis of the camera). An electro-mechanical component for moving the image sensor is composed of a static and a dynamic platform. Mounting of an imaging sensor (wire bonding, flip-chip, BGA) on the dynamic platform, with electrical signal traces run out from the dynamic platform to the static platform, provides for connection to the image sensor. In-plane flexures connect the dynamic platform to the static platform and support the electrical signal traces. OIS coils are mounted on the dynamic platform. In some embodiments, OIS permanent magnets are mounted on the static platform to provide additional Lorentz force (e.g., in the case of high in-plane flexure stiffness).
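For context, the in-plane force available from such a moving-coil arrangement follows the standard voice-coil (Lorentz force) relation; this is a general relation offered for illustration rather than a formula recited in this disclosure:

$$ F \approx N\,B\,L_{\text{eff}}\,i, $$

where N is the number of coil turns, B the magnetic flux density at the coil, L_eff the effective conductor length per turn within the field, and i the drive current. Mounting OIS permanent magnets on the static platform increases B at the coils carried by the dynamic platform, raising the force available to overcome a relatively stiff in-plane flexure suspension.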
Some embodiments include a camera. The camera may include a lens, an image sensor, and a voice coil motor (VCM) actuator. The lens may include one or more lens elements that define an optical axis. The image sensor may be configured to capture light passing through the lens. Furthermore, the image sensor may be configured to convert the captured light into image signals.
In some embodiments, a camera actuator includes an actuator base, an autofocus voice coil motor, and an optical image stabilization voice coil motor. In some embodiments, the autofocus voice coil motor includes a lens carrier mounting attachment moveably mounted to the actuator base, a plurality of shared magnets mounted to the base, and an autofocus coil fixedly mounted to the lens carrier mounting attachment for producing forces for moving a lens carrier in a direction of an optical axis of one or more lenses of the lens carrier. In some embodiments, the optical image stabilization voice coil motor includes an image sensor carrier moveably mounted to the actuator base, and a plurality of optical image stabilization coils moveably mounted to the image sensor carrier within the magnetic fields of the shared magnets, for producing forces for moving the image sensor carrier in a plurality of directions orthogonal to the optical axis.
Some embodiments provide an actuator system using a first AF VCM (voice coil motor), and a second OIS VCM to separately accomplish sensor shift. In some embodiments, the AF VCM actuator allows translation of the optics along the optical axis. In some embodiments, the OIS VCM actuator allows an image sensor to translate in a plane perpendicular to the optical axis. In some embodiments, the sensor is mounted on a flat flexure where the electrical traces connecting the image sensor and I/O terminals are formed using an additive metal deposition process (e.g., a high precision additive copper deposition process) directly on the flexure and where the in-plane translation force is a result of a VCM designed around a moving coil architecture.
In some embodiments, eliminating the OIS “optics shift” design that relies on vertical beams (suspension wires) reduces reliability challenges: the OIS sensor shift and the design of the flat flexure provide a lower required yield strength and a larger cross-section, both of which improve reliability.
In some embodiments, shifting the sensor allows reduction of the moving mass, and therefore there is a clear benefit in power consumption in comparison to OIS “optics shift” designs. In some embodiments, manufacturing is accomplished with the electrical traces directly deposited on the OIS flexure (e.g., using an additive metal deposition process), which enables a smaller package size while satisfying the I/O requirements.
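A simple worked comparison (with assumed, illustrative quantities rather than values from this disclosure) makes the power benefit concrete. For a stabilization acceleration a applied to a moving mass m by a voice coil with turn count N, flux density B, effective conductor length L_eff, and resistance R:

$$ F = m\,a = N\,B\,L_{\text{eff}}\,i \;\Rightarrow\; i = \frac{m\,a}{N\,B\,L_{\text{eff}}}, \qquad P = i^{2} R. $$

For the same disturbance and the same coil, halving the moving mass halves the required current and reduces the resistive power dissipation to one quarter, which is the advantage of shifting only the image sensor rather than the heavier lens assembly.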
In some embodiments, the image sensor carrier further includes one or more flexible members for mechanically connecting an image sensor, which is fixed relative to the image sensor carrier, to a frame of the optical image stabilization voice coil motor.
In some embodiments, the image sensor carrier further includes one or more flexible members for mechanically and electrically connecting an image sensor, which is fixed relative to the image sensor carrier, to a frame of the optical image stabilization voice coil motor, and the flexures include electrical signal traces.
In some embodiments, the image sensor carrier further includes one or more flexible members for mechanically and electrically connecting an image sensor, which is fixed relative to the image sensor carrier, to a frame of the optical image stabilization voice coil motor, and the flexures include metal flexure bodies carrying electrical signal traces electrically isolated from the metal flexure bodies via an insulator (e.g., one or more polyimide insulator layers).
In some embodiments, the image sensor carrier further includes one or more flexible members for mechanically and electrically connecting an image sensor, which is fixed relative to the image sensor carrier, to a frame of the optical image stabilization voice coil motor, and the flexures include metal flexure bodies carrying multiple layers of electrical signal traces electrically isolated from the metal flexure bodies and from one another via an insulator.
In some embodiments, the optical image stabilization coils are mounted on a flexible printed circuit carrying power to the coils for operation of the optical image stabilization voice coil motor.
In some embodiments, the optical image stabilization coils are corner-mounted on a flexible printed circuit mechanically connected to the actuator base and mechanically isolated from the autofocus voice coil motor.
In some embodiments, a bearing surface end stop is mounted to the base for restricting motion of the optical image stabilization voice coil motor.
In some embodiments, a camera includes a lens in a lens carrier, an image sensor for capturing a digital representation of light transiting the lens, an axial motion voice coil motor for focusing light from the lens on the image sensor by moving a lens assembly containing the lens along an optical axis of the lens, and a transverse motion voice coil motor.
In some embodiments, the axial motion voice coil motor includes a suspension assembly for moveably mounting the lens carrier to an actuator base, a plurality of shared magnets mounted to the actuator base, and a focusing coil fixedly mounted to the lens carrier and mounted to the actuator base through the suspension assembly.
In some embodiments, the transverse motion voice coil motor includes an image sensor frame member, one or more flexible members for mechanically connecting the image sensor frame member to a frame of the transverse motion voice coil motor, and a plurality of transverse motion coils moveably mounted to the image sensor frame member within the magnetic fields of the shared magnets, for producing forces for moving the image sensor frame member in a plurality of directions orthogonal to the optical axis.
In some embodiments, the image sensor carrier further includes one or more flexible members for mechanically and electrically connecting an image sensor, which is fixed relative to the image sensor carrier, to a frame of the optical image stabilization voice coil motor, and the flexures include electrical signal traces.
In some embodiments, the image sensor carrier further includes one or more flexible members for mechanically and electrically connecting an image sensor, which is fixed relative to the image sensor carrier, to a frame of the optical image stabilization voice coil motor, and the flexures include metal flexure bodies carrying electrical signal traces electrically isolated from the metal flexure bodies via an insulator.
In some embodiments, the optical image stabilization coils are mounted on a flexible printed circuit carrying power to the coils for operation of the optical image stabilization voice coil motor.
In some embodiments, the optical image stabilization coils are corner-mounted on a flexible printed circuit mechanically connected to the actuator base and mechanically isolated from the autofocus voice coil motor.
In some embodiments, a bearing surface end stop is mounted to the base for restricting motion of the optical image stabilization voice coil motor.
In some embodiments, a bearing surface end stop is mounted to the actuator base for restricting motion of the image sensor along the optical axis.
Reference will now be made in detail to embodiments, examples of which are illustrated in the accompanying drawings. In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the present disclosure. However, it will be apparent to one of ordinary skill in the art that some embodiments may be practiced without these specific details. In other instances, well-known methods, procedures, components, circuits, and networks have not been described in detail so as not to unnecessarily obscure aspects of the embodiments.
It will also be understood that, although the terms first, second, etc. may be used herein to describe various elements, these elements should not be limited by these terms. These terms are only used to distinguish one element from another. For example, a first contact could be termed a second contact, and, similarly, a second contact could be termed a first contact, without departing from the intended scope. The first contact and the second contact are both contacts, but they are not the same contact.
The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used in the description and the appended claims, the singular forms “a”, “an” and “the” are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will also be understood that the term “and/or” as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items. It will be further understood that the terms “includes,” “including,” “comprises,” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used herein, the term “if” may be construed to mean “when” or “upon” or “in response to determining” or “in response to detecting,” depending on the context. Similarly, the phrase “if it is determined” or “if [a stated condition or event] is detected” may be construed to mean “upon determining” or “in response to determining” or “upon detecting [the stated condition or event]” or “in response to detecting [the stated condition or event],” depending on the context.
In various embodiments, camera 100 includes a transverse motion (optical image stabilization (OIS)) voice coil motor 120. The transverse motion voice coil motor 120 may include an image sensor frame member 122, one or more flexible members 124 (also referred to herein as “flexures”, “flexure arms”, or “spring arms”) for mechanically connecting the image sensor frame member 122 (also referred to herein as the “dynamic platform” or “inner frame”) to a frame of the transverse motion voice coil motor 126 (also referred to herein as the “static platform” or “outer frame”), and a plurality of OIS coils 132. As indicated in
In some embodiments, the dynamic platform 122, the flexures 124 for mechanically connecting the dynamic platform 122 to the static platform 126, and the static platform 126 are a single metal part or other flexible part. In some embodiments, the flexures 124 mechanically and/or electrically connect an image sensor 108, which is fixed relative to the dynamic platform 122, to the static platform 126, and the flexures 124 include electrical signal traces 130. In some embodiments, the flexures 124 include metal flexure bodies carrying electrical signal traces 130 electrically isolated from the metal flexure bodies via an insulator.
In some examples, the OIS coils 132 are mounted on a flexible printed circuit (FPC) 134 carrying power to the OIS coils 132 for operation of the transverse motion (OIS) voice coil motor 120. The flexible printed circuit 134, the dynamic platform 122, the flexures 124, and/or the static platform 126 may be connected to a top surface of the actuator base 114 in some embodiments.
In some embodiments, a bearing surface end stop 136 is mounted to the actuator base 114 for restricting motion of the image sensor 108 along the optical axis.
For OIS coil control, in some embodiments control circuitry (not shown) may be positioned on the dynamic platform 122 and/or the FPC 134, and power may be routed to the control circuitry via one or more of the flexures 124. In other instances, the control circuitry may be positioned off of the dynamic platform 122, and stimulation signals may be carried to the OIS coils 132 from the control circuitry via one or more of the flexures 124.
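The following is a minimal control-loop sketch, in Python for illustration only, of how such control circuitry might convert a sensed disturbance into OIS coil drive commands. The class, signal names, gains, and interfaces below are hypothetical; this disclosure does not specify the control scheme, and the sketch simply pairs the small-angle image-shift relation with a per-axis PID position loop closed on the position sensing described herein.

```python
# Illustrative OIS control sketch: gyro rate -> target in-plane sensor shift -> coil drive command.
# Names, gains, and interfaces are hypothetical; the actual control circuitry is implementation-specific.

from dataclasses import dataclass


@dataclass
class AxisController:
    kp: float = 2.0
    ki: float = 0.1
    kd: float = 0.01
    target_mm: float = 0.0      # desired sensor shift on this axis
    integral: float = 0.0
    prev_error: float = 0.0

    def step(self, gyro_rate_rad_s: float, measured_pos_mm: float,
             focal_length_mm: float, dt_s: float) -> float:
        """Return one coil drive command for this axis."""
        # Integrate the angular rate and use the small-angle relation d ~ f * theta to
        # track how far the image has moved; the sensor shift should cancel that motion.
        self.target_mm += focal_length_mm * gyro_rate_rad_s * dt_s
        error = self.target_mm - measured_pos_mm        # error vs. position (e.g., Hall) sensing
        self.integral += error * dt_s
        derivative = (error - self.prev_error) / dt_s if dt_s > 0 else 0.0
        self.prev_error = error
        # A driver stage would scale this command into a current through the OIS coils.
        return self.kp * error + self.ki * self.integral + self.kd * derivative


if __name__ == "__main__":
    x_axis, y_axis = AxisController(), AxisController()
    # One 1 kHz control tick with a small handshake rate on each axis.
    cmd_x = x_axis.step(gyro_rate_rad_s=0.005, measured_pos_mm=0.0, focal_length_mm=4.0, dt_s=0.001)
    cmd_y = y_axis.step(gyro_rate_rad_s=-0.002, measured_pos_mm=0.0, focal_length_mm=4.0, dt_s=0.001)
    print(cmd_x, cmd_y)
```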
In various examples, the shield can 304 may be mechanically attached to the base 314. The camera 300 may include an axial motion (AF) voice coil motor (VCM) (e.g., axial motion VCM 110 discussed above with reference to
In some embodiments, the OIS FPC 318 and/or the OIS frame 322 may be connected to a bottom surface of the base 314. In some examples, the base 314 may define one or more recesses and/or openings having multiple different cross-sections. For instance, a lower portion of the base 314 may define a recess and/or an opening with a cross-section sized to receive the OIS frame 322. An upper portion of the base 314 may define a recess and/or an opening with a cross-section sized to receive the OIS FPC 318. The upper portion may have an inner profile corresponding to the outer profile of the OIS FPC 318. This may help to maximize the amount of material included in the base 314 (e.g., for providing structural rigidity to the base 314) while still providing at least a minimum spacing between the OIS FPC 318 and the base 314.
In some non-limiting examples, the OIS FPC 318 and the image sensor 320 may be separately attached to the OIS frame 322. For instance, a first set of one or more electrical traces may be routed between the OIS FPC 318 and the OIS frame 322. A second, different set of one or more electrical traces may be routed between the image sensor 320 and the OIS frame 322. In other embodiments, the image sensor 320 may be attached to or otherwise integrated into the OIS FPC 318, such that the image sensor 320 is connected to the OIS frame 322 via the OIS FPC 318, e.g., as discussed below with reference to
In some embodiments, the image sensor 604 may be attached to or otherwise integrated into the OIS FPC 606 such that the image sensor 604 is connected to the OIS frame 602 via the OIS FPC 606, e.g., as depicted in
In some examples, the OIS FPC 606 may extend from the dynamic platform 610 such that a portion of the OIS FPC 606 is positioned over the flexures 614 (e.g., in a plane above the flexures 614). In some examples, at least a portion of each of the OIS coils 608 may be positioned above the flexures 614. Such an arrangement may facilitate miniaturization of the transverse motion VCM 600 and/or the camera, as the dynamic platform 610 need not be sized to accommodate both the image sensor 604 and the OIS coils 608.
In some embodiments, the OIS frame may include multiple flexure groups. Each flexure group may include multiple flexures 708 that are connected to a common side of the dynamic platform 704 and a common side of the static platform 706. In some examples, at least one flexure group is configured such that the flexures 708 connect to a side of the dynamic platform 704 and a non-corresponding side of the static platform 706. That is, each side of the dynamic platform 704 may face a corresponding side of the static platform 706, and the flexures 708 may include at least one bend or curve to traverse a corner to reach a non-corresponding side of the static platform 706, e.g., as depicted in
In some cases, one or more sides of the dynamic platform 704 and/or the static platform 706 may include multiple flexure groups. For instance, as shown in
In some embodiments, the flexure module 800 may be used in a transverse motion (optical image stabilization) voice coil motor of a camera (e.g., the cameras described above with reference to
In various examples, the flexure module 800 may include one or more flexures 806. The flexures 806 may be configured to mechanically connect the dynamic platform 802 to the static platform 804. The flexures 806 may be configured to provide stiffness (e.g., in-plane flexure stiffness) to the VCM actuator while allowing the dynamic platform 802 (and an image sensor fixed relative to the dynamic platform 802) to move along a plane that is orthogonal to an optical axis defined by one or more lenses of a camera. In this manner, the image sensor may be shifted along the plane that is orthogonal to the optical axis to provide optical image stabilization functionality. Furthermore, as described herein with reference to
In various embodiments, the flexure module 800 may include one or more flexure stabilizer members 808. The flexure stabilizer members 808 may be configured to mechanically connect flexures 806 to each other such that the flexure stabilizer members 808 prevent interference between the flexures 806 that are connected by the flexure stabilizer members 808. For instance, the flexure stabilizer members 808 may be configured to prevent the flexures 806 from colliding and/or entangling with one another, e.g., in drop events, vibration events, etc. Additionally, or alternatively, the flexure stabilizer members 808 may be configured to limit motion of, and/or stabilize relative motion between, the flexures 806 that are connected by the flexure stabilizer members 808. Furthermore, the flexure stabilizer members 808 may be arranged along various portions of the flexures 806 to provide in-plane stiffness as needed in the flexure module 800, e.g., to satisfy optical image stabilization design requirements. Some non-limiting examples of flexure stabilizer member configurations are described below with reference to
In some embodiments, the flexures 806 may be arranged in one or more flexure groups 810, or arrays, that individually include multiple flexures 806. For instance, as depicted in
In some examples, the dynamic platform 802 and/or the static platform 804 may include one or more offsets 812 (e.g., a recess, an extension, etc.). In some cases, one or more flexures 806 may connect to the dynamic platform 802 and/or the static platform 804 at an offset 812. For instance, as illustrated in
The example flexure module configurations of
In some embodiments, an extension and/or a recess of a dynamic platform and/or a static platform may change the direction in which the flexures attach to the dynamic platform and/or the static platform. For example, in
In some examples, the second extension of the static platform 904c may be used to provide additional space for routing of traces (e.g., for grounding of guard traces, as discussed below with reference to
In some examples, certain electrical traces (and the signals they carry) may be susceptible to physical deformations. In instances where different electrical traces have different thicknesses and/or strengths, the electrical traces may be routed along different flexures 906d and/or different types of flexures 906d, e.g., based at least in part on sensitivity of the electrical traces. For example, in
With respect to flexures, some of the example flexure module configurations of
(1a) The number of flexures may vary. For instance, a flexure module may include one or multiple flexures. In a particular example, a flexure module may include three to six flexures in a flexure group. As a non-limiting example, the flexure group shown in
(2a) The flexures may be parallel to each other. For instance, the flexure groups shown in
(3a) The flexures may be parallel to a frame edge (e.g., an edge of a dynamic platform and/or a static platform of a flexure module). For instance, each of
(4a) The flexures may be evenly spaced apart from each other. As a non-limiting example, the flexure groups shown in
(5a) A width of a flexure may vary along the flexure and/or among the flexures. For instance, the flexure group shown in
(6a) The flexures may include features (e.g., a recess, an extension, an aperture, etc.). For instance, the flexure group shown in
(7a) A cross-section of the flexures may be rectangular, concave, and/or convex in shape, e.g., as discussed below with reference to
(8a) The flexures may be a solid material, clad, or switched beam, e.g., as discussed below with reference to
With respect to bends of the flexures (or flexure groups), some of the example flexure module configurations of
(1b) The flexures may include one or more bends. For example, the flexures shown in
(2b) A turning angle of the bends may vary. In some examples, the turning angle may be 90 degrees. For instance, the flexure group shown in
(3b) The turning radii of the bends may vary. For example, the flexure groups shown in
With respect to flexure stabilizer members, some of the example flexure module configurations of
(1c) One or more flexure stabilizer members may connect the flexures. For example, in
(2c) A flexure stabilizer member may connect some or all of the flexures. For instance, in
(3c) The locations of the flexure stabilizer members may be anywhere on the flexures. In some examples, the locations of the flexure stabilizer members may be different among the flexures. For instance, in
(4c) An angle between the flexure stabilizer members and the flexures may vary. In some examples, the angle between the flexure stabilizer member and the flexures may be 90 degrees, e.g., as shown in
With respect to offsets of the dynamic platform and/or the static platform, some of the example flexure module configurations of
(1d) An offset may exist at a flexure root where flexures connect to the dynamic platform and/or the static platform, e.g., as shown in
(2d) The offset may be, for example, a recess, an extrusion, etc. For instance,
With respect to flexure connecting angles to the dynamic platform and/or the static platform, some of the example flexure module configurations of
(1e) The flexure connecting angles may vary. In some examples, a flexure connecting angle may be 90 degrees, e.g., as shown in
(2e) Different flexures may have different flexure connecting angles, e.g., as shown in
(3e) For dynamic platforms and/or static platforms with an offset, the flexures may be connected to any available edge of the offset. In some cases, the flexures may be connected to an edge of the offset that is parallel to the side of the dynamic platform or the static platform that defines the offset, e.g., as shown in
With respect to flexure patterns (which, in some cases, may include a pattern formed by the flexures and the flexure stabilizer members), some of the example flexure module configurations of
(1f) The flexure pattern may be symmetric. For instance, the flexure pattern may be symmetric along at least two axes (e.g., the x and y axes) that are orthogonal to the optical axis. For example, the flexure patterns shown in
(2f) The flexure pattern may be asymmetric. For instance, the flexure pattern may be asymmetric along at least one axis (e.g., the x axis or the y axis) that is orthogonal to the optical axis. For instance, a flexure pattern may include multiple different ones of the flexure module configurations shown in
In some embodiments, an OIS frame may be formed from a conductive material (e.g., a copper alloy, stainless steel, or the like), such that the flexures themselves may act as a ground path between the static platform and the dynamic platform. Additionally, grounding traces may be added to shield high-frequency lines (e.g., dual pair lines that carry image signals from an image sensor to an image signal processor). For example, each of the first flexure 1002i and the second flexure 1004i may include signal traces 1010i (e.g., two signal traces, as shown in
In some examples, it may be desirable to selectively choose which traces are placed on different flexures. For example, in instances where a trace on one flexure carries a power signal to the dynamic platform (e.g., to the image sensor and/or OIS control circuitry) and traces on another flexure carry image signals from the image sensor, it may be desirable to position a ground trace on a flexure between the power-carrying flexure and the signal-carrying flexure. For instance, the third flexure 1006i may include a ground trace 1016i, and the fourth flexure 1008i may include a power trace 1018i. The third flexure 1006i (which includes the ground trace 1016i) may be positioned between the second flexure 1004i (which may carry image signals via the signal traces 1010i) and the fourth flexure 1008i (which may carry power via the power trace 1018i). In some cases, the ground trace 1016i may be a reference different from the grounding of the OIS frame itself. Additionally, in some instances it may be desirable to route one or more power traces along the shortest flexure on the OIS frame. Similarly, image-carrying traces may also be prioritized for shorter flexures, while other traces (e.g., carrying information between the OIS frame and the axial motion (autofocus) voice coil motor (VCM) actuator) may have longer trace lengths.
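The routing preferences described above can be summarized by a small placement heuristic. The sketch below is illustrative only; the flexure names, lengths, and trace categories are hypothetical, and this disclosure does not prescribe any particular assignment algorithm.

```python
# Illustrative sketch of the trace-placement heuristic: power on the shortest flexure,
# image signals on the next-shortest, and a grounded flexure between them when one exists.
# All names, lengths, and roles are hypothetical.

def assign_traces(flexures):
    """flexures: list of (name, length_mm) in physical order along the frame.
    Returns a {name: role} mapping."""
    by_length = sorted(flexures, key=lambda f: f[1])
    order = [name for name, _ in flexures]               # physical adjacency order
    roles = {name: "spare" for name in order}

    power = by_length[0][0]                               # shortest flexure carries power
    signal = next(name for name, _ in by_length if name != power)
    roles[power] = "power"
    roles[signal] = "image signal"

    # Ground any flexure lying physically between the power- and signal-carrying flexures
    # so the ground trace shields the image signals from the power line.
    lo, hi = sorted((order.index(power), order.index(signal)))
    for name in order[lo + 1:hi]:
        roles[name] = "ground"
    return roles


if __name__ == "__main__":
    # Four flexures along one side, loosely analogous to flexures 1002i-1008i in the text.
    print(assign_traces([("flexure_a", 5.0), ("flexure_b", 6.5), ("flexure_c", 6.0), ("flexure_d", 5.5)]))
```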
In various embodiments, one or more of the flexure stabilizer members described herein (e.g., with reference to
The electrical connection elements 1110 may be disposed along one or more portions (or sides) of the dynamic platform 1102. For instance, the electrical connection elements 1110 may be disposed along one or more flexure roots at which the flexures 1106 connect to the dynamic platform 1102. Likewise, the electrical connection elements 1112 may be disposed along one or more portions (or sides) of the static platform 1104. For instance, the electrical connection elements 1112 may be disposed along one or more flexure roots at which the flexures 1106 connect to the static platform 1104. In some examples, the electrical connection elements 1110 may be configured to electrically couple with an image sensor and/or another component (e.g., a flip chip, a substrate, etc.) that is coupled to the image sensor. Accordingly, the dynamic platform 1102 may be configured to receive signals (e.g., image signals) from the image sensor via the electrical connection elements 1110, and the signals may be conveyed from the electrical connection elements 1110 of the dynamic platform 1102 to the electrical connection elements 1112 of the static platform 1104 via one or more electrical traces routed along the flexures 1106 and/or the flexure stabilizer members 1108.
In
In some embodiments, when there are electrical traces on both sides of a flexure 1106 (e.g., as indicated in
In some examples, the electrical traces 1204 may be routed, via one or more flex circuits 1202, from one or more electrical connection elements 1210 disposed on the dynamic platform 1206 to one or more electrical connection elements 1212 disposed on the static platform 1208.
The electrical connection elements 1210 may be disposed along one or more portions (or sides) of the dynamic platform 1206. Likewise, the electrical connection elements 1212 may be disposed along one or more portions (or sides) of the static platform 1208. In some examples, the electrical connection elements 1210 may be configured to electrically couple with an image sensor and/or another component (e.g., a flip chip, a substrate, etc.) that is coupled to the image sensor. Accordingly, the dynamic platform 1206 may be configured to receive signals (e.g., image signals) from the image sensor via the electrical connection elements 1210, and the signals may be conveyed from the electrical connection elements 1210 of the dynamic platform 1206 to the electrical connection elements 1212 of the static platform 1208 via one or more electrical traces 1204 routed along one or more flex circuits 1202.
In some embodiments, a flex circuit 1202 may include a first end that is fixed (e.g., via an adhesive) to the dynamic platform 1206, a second end that is fixed (e.g., via an adhesive) to the static platform 1208, and a middle portion between the first end and the second end. The second end of the flex circuit 1202 may be opposite the first end of the flex circuit 1202. Furthermore, in some embodiments, the middle portion of the flex circuit 1202 may include an amount of slack that facilitates relative movement between the first and second ends of the flex circuit 1202. The amount of slack may be determined based at least in part on a stiffness of the flexure module 1200. Moreover, in various embodiments, the flex circuits 1202 may include a flexible material. In some embodiments, multiple flex circuits 1202 may be disposed in proximity with one another to form a flex circuit array.
In some examples, in addition to routing electrical traces via one or more flex circuits 1202, the flexure module 1200 may route electrical traces via the flexures 1214 and/or the flexure stabilizer members 1216, e.g., as described above with reference to
Camera Module Reduction with Smaller Flexure Platform and Reconfigured VCM
Camera module designs may occupy a significant footprint and/or volume on and/or within an electronic device. Thus, reducing a size of a camera module may provide additional space within an electronic device without increasing a size of the electronic device. In some aspects, x-y dimensions of a flexure platform may be reduced to reduce the size of the camera module. The flexure platform dimensions may be reduced in a variety of ways including arm count reduction and material additive processes. However, reconfiguration of the VCM architecture within the camera module may be needed to accommodate the reduced flexure platform size and reduced camera module size without changing a size of the image sensor and/or the optical assembly, so as to maintain camera performance.
As described herein, a size of a flexure platform may be reduced using arm count reduction. For reference, turning back to
Additionally, or alternatively, a size of a flexure platform may be reduced using one or more material additive processes. Material additive processes (e.g., electroforming, electroplating) may be used to reduce the arm pitch of the flexure arms (e.g., reduce a distance between flexure arms). By reducing the pitch of the flexure arms, the size of the flexure platform in the x-y directions may be reduced. Details related to material additive processes may be found at least in U.S. patent application Ser. No. 17/399,917, which is herein incorporated by reference in its entirety. It should be understood that while arm count reduction and material additive processes may be implemented to reduce a width of the flexure platform, one or more other flexure platform reduction techniques may be implemented, alone or in combination with arm count reduction and/or material additive processes, to reduce a size of the flexure platform.
By reducing the flexure platform to the flexure platform 1450 illustrated in
Six OIS coils 1606 may each be vertically aligned with and positioned below a respective one of the six magnets 1604. The OIS coils 1606 may also each have a generally rectangular-like shape (e.g., a bar-like shape, bar-shaped) occupying less space compared to, for example, trapezoidal OIS coils to accommodate a reduced size camera module and a reduced size flexure platform as described herein. Spaces formed between the two OIS coils 1606 on the left and right sides adjacent the AF coil 1602 are configured to receive the protrusions 1608 so that the OIS coils 1606 avoid obstructing movement of an optics assembly as described herein. In some aspects, the OIS coils 1606 may be coupled to position sensors (e.g., Hall sensors) for detecting a position and/or movement of the image sensor 1704 in the x-y directions. In some aspects, spaces formed between the two OIS coils 1606 on the left and right sides adjacent the AF coil 1602 may be used for additional circuit boards and/or position sensors. In some aspects, the OIS coils 1606 may have a single OIS coil layer or only two OIS coil layers. However, to accommodate the reduced size camera module and/or the reduced size flexure platform while maintaining a size of the optical sensor and/or the optics package, the OIS coils 1606 may have three or more OIS coil layers. For example, in some aspects, one or more of the OIS coils 1606 may have three OIS coil layers, four OIS coil layers, or more OIS coil layers. Three or more OIS coil layers may allow for a smaller OIS coil footprint while minimizing any decrease in, maintaining, or in some aspects even increasing, the effectiveness of the OIS coils 1606. Additionally, or alternatively, three or more OIS coil layers may allow for the same number of OIS coil turns with a smaller OIS coil footprint or volume to minimize a decrease in (e.g., reduce a decrease in, maintain, or increase) the effectiveness of the OIS coils 1606.
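The layer-count trade-off can be framed with the usual voice-coil scaling; the relation below is a general illustration with assumed proportionalities (fixed wire gauge and magnet gap), not a formula recited in this disclosure:

$$ F = N\,B\,L_{\text{eff}}\,i, \qquad R \propto N, \qquad K_{m} \equiv \frac{F}{\sqrt{P}} = \frac{N\,B\,L_{\text{eff}}\,i}{\sqrt{i^{2}R}} \propto B\,L_{\text{eff}}\sqrt{N}. $$

Under these assumptions, going from one coil layer to three in the same footprint roughly triples the turn count N and improves the motor constant by about √3; equivalently, the footprint can shrink while the added layers keep N (and thus force per unit current) roughly constant. This is the trade-off that allows bar-shaped, multi-layer OIS coils to occupy less area without a proportional loss of effectiveness.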
In addition, the camera module 1800 may include a damping pin assembly 1811 and associated damping structures 1812, an autofocus (AF) coil 1814, and a plurality of magnets 1502. Due to the position of the magnets 1502 in the corners of the camera module 1800, the damping pin assembly 1811 may extend across a portion of the camera module perimeter 1802 from a side of the camera module 1800, between two corners, and, thus, between two magnets 1502 for pins of the damping pin assembly 1811 to engage with the damping structures 1812. The damping structures 1812 may include a gel-like material that is engaged with an optics assembly when an optics assembly is located in the optics assembly space 1808. The damping pin assembly 1811 and associated damping structures 1812 may provide damping for AF movement of the optics assembly moving along an optical axis of the optics assembly (e.g., the z-direction). The AF coil 1814 may be positioned around the optics assembly space 1808 and form an octagonal shape. The four magnets 1502 may each have a trapezoidal shape and may be positioned outside the AF coil 1814 and in a respective corner of the four corners of the camera module 1800. The AF coil 1814 and the magnets 1502 may together be configured to drive an optical assembly in the z-axis to control autofocus movement as described herein. The camera module 1800 may also include one or more PCBs 1508 and one or more position sensors 1510 (e.g., AF position sensors). It should be understood that while the AF coil 1814 illustrated in
The suspension assembly 1856 of
The camera module 1850 may include a damping pin assembly 1861 and associated damping structures 1862, the AF coil 1602, and the magnets 1604. The damping pin assembly 1861 may extend across a portion of the camera module 1850 for pins of the damping pin assembly 1861 to engage with the damping structures 1862. The damping pin assembly 1861 may be sized and configured to accommodate the spatial constraints of the VCM architecture of the camera module 1850. For example, the damping pin assembly 1861 may include a static portion 1861a extending along a side of the camera module 1850 proximate a first side of one of the stationary magnets 1604. A first damping arm 1861b may extend from the static portion 1861a to a first damping structure 1862 at the lens carrier. A second damping arm 1861c may extend from the static portion 1861a to a second damping structure 1862 at the lens carrier. The first damping arm 1861b may extend proximate a second side of the one of the stationary magnets 1604, and the second damping arm 1861c may extend proximate a third side of the one of the stationary magnets 1604 opposite the second side of the one of the stationary magnets 1604. The damping structures 1862 may include a gel-like material that engages an optics assembly (e.g., a lens carrier) when an optics assembly is located in the optics assembly space 1808. The damping pin assembly 1861 and associated damping structures 1862 may provide damping for AF movement of the optics assembly moving along an optical axis of the optics assembly (e.g., the z-direction). The camera module 1850 may also include end stops 1864 to dampen and limit the movement of the image sensor in the x-y directions and to dampen and limit the movement of the optics assembly in the z-direction. In other words, each of the end stops 1864 may provide a limit to movement of the AF carrier 1906 (illustrated in
As described herein, the AF coil 1602 may be positioned around the optics assembly space 1808 and form a shape (e.g., a rectangular-like shape, a square-like shape) around the perimeter of the optics assembly space 1808. In some aspects, the AF coil 1602 may also include protrusions 1608 extending from the generally rectangular shape of the AF coil 1602 on one or more sides. The protrusions 1608 may enable the optics assembly space 1808 to fit completely within the AF coil 1602 without the AF coil 1602 obstructing the optics assembly space 1808. The six magnets 1604 may each have a rectangular-like shape (e.g., a bar-like shape, bar-shaped) and may be positioned outside the AF coil 1602 adjacent the sides of the AF coil 1602. The rectangular-like shaped magnets 1604 may occupy less space compared, for example, to trapezoidal magnets to accommodate the reduced size camera module 1850. The magnets 1604 may be orthogonally positioned around the AF coil 1602. For example, as shown in
In various examples, the shield can 1904 may be mechanically attached to the base 1930. The camera 1900 may include an axial motion (AF) voice coil motor (VCM) (e.g., axial motion VCM discussed herein with reference to
In some embodiments, the OIS base 1928 may be connected to a bottom surface of the enclosure 1930. In some examples, the enclosure 1930 may define one or more recesses and/or openings having multiple different cross-sections. For instance, a lower portion of the enclosure 1930 may define a recess and/or an opening with a cross-section sized to receive an OIS frame. An upper portion of the enclosure 1930 may define a recess and/or an opening with a cross-section sized to receive the flexure platform 1500. The upper portion may have an inner profile corresponding to the outer profile of the flexure platform 1500. This may help to maximize the amount of material included in the enclosure 1930 (e.g., for providing structural rigidity to the enclosure 1930) while still providing at least a minimum spacing between the flexure platform 1500 and the enclosure 1930.
In some non-limiting examples, the flexure platform 1500 and the image sensor 1504 may be separately attached to the OIS frame. For instance, a first set of one or more electrical traces may be routed between the flexure platform 1500 and the OIS frame. A second, different set of one or more electrical traces may be routed between the image sensor 1504 and the OIS frame. In other embodiments, the image sensor 1504 may be attached to or otherwise integrated into the flexure platform 1500, such that the image sensor 1504 is connected to the OIS frame via the flexure platform.
Embodiments of electronic devices, user interfaces for such devices, and associated processes for using such devices are described. In some embodiments, the device is a portable communications device, such as a mobile telephone, that also contains other functions, such as PDA and/or music player functions. Other portable electronic devices, such as laptops, cameras, cell phones, or tablet computers, may also be used. It should also be understood that, in some embodiments, the device is not a portable communications device, but is a desktop computer with a camera. In some embodiments, the device is a gaming computer with orientation sensors (e.g., orientation sensors in a gaming controller). In other embodiments, the device is not a portable communications device, but is a camera.
In the discussion that follows, an electronic device that includes a display and a touch-sensitive surface is described. It should be understood, however, that the electronic device may include one or more other physical user-interface devices, such as a physical keyboard, a mouse and/or a joystick.
The device typically supports a variety of applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disk authoring application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an e-mail application, an instant messaging application, a workout support application, a photo management application, a digital camera application, a digital video camera application, a web browsing application, a digital music player application, and/or a digital video player application.
The various applications that may be executed on the device may use at least one common physical user-interface device, such as the touch-sensitive surface. One or more functions of the touch-sensitive surface as well as corresponding information displayed on the device may be adjusted and/or varied from one application to the next and/or within a respective application. In this way, a common physical architecture (such as the touch-sensitive surface) of the device may support the variety of applications with user interfaces that are intuitive and transparent to the user.
Attention is now directed toward embodiments of portable devices with cameras.
It should be appreciated that device 2000 is only one example of a portable multifunction device, and that device 2000 may have more or fewer components than shown, may combine two or more components, or may have a different configuration or arrangement of the components. The various components shown in
Memory 2002 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices, or other non-volatile solid-state memory devices. Access to memory 2002 by other components of device 2000, such as CPU 2020 and the peripherals interface 2018, may be controlled by memory controller 2022.
Peripherals interface 2018 can be used to couple input and output peripherals of the device to CPU 2020 and memory 2002. The one or more processors 2020 run or execute various software programs and/or sets of instructions stored in memory 2002 to perform various functions for device 2000 and to process data.
In some embodiments, peripherals interface 2018, CPU 2020, and memory controller 2022 may be implemented on a single chip, such as chip 2004. In some other embodiments, they may be implemented on separate chips.
RF (radio frequency) circuitry 2008 receives and sends RF signals, also called electromagnetic signals. RF circuitry 2008 converts electrical signals to/from electromagnetic signals and communicates with communications networks and other communications devices via the electromagnetic signals. RF circuitry 2008 may include well-known circuitry for performing these functions, including but not limited to an antenna system, an RF transceiver, one or more amplifiers, a tuner, one or more oscillators, a digital signal processor, a CODEC chipset, a subscriber identity module (SIM) card, memory, and so forth. RF circuitry 2008 may communicate with networks, such as the Internet, also referred to as the World Wide Web (WWW), an intranet and/or a wireless network, such as a cellular telephone network, a wireless local area network (LAN) and/or a metropolitan area network (MAN), and other devices by wireless communication. The wireless communication may use any of a variety of communications standards, protocols and technologies, including but not limited to Global System for Mobile Communications (GSM), Enhanced Data GSM Environment (EDGE), high-speed downlink packet access (HSDPA), high-speed uplink packet access (HSUPA), wideband code division multiple access (W-CDMA), code division multiple access (CDMA), time division multiple access (TDMA), Bluetooth, Wireless Fidelity (Wi-Fi) (e.g., IEEE 802.11a, IEEE 802.11b, IEEE 802.11g and/or IEEE 802.11n), voice over Internet Protocol (VoIP), Wi-MAX, a protocol for e-mail (e.g., Internet message access protocol (IMAP) and/or post office protocol (POP)), instant messaging (e.g., extensible messaging and presence protocol (XMPP), Session Initiation Protocol for Instant Messaging and Presence Leveraging Extensions (SIMPLE), Instant Messaging and Presence Service (IMPS)), and/or Short Message Service (SMS), or any other suitable communication protocol, including communication protocols not yet developed as of the filing date of this document.
Audio circuitry 2010, speaker 2011, and microphone 2013 provide an audio interface between a user and device 2000. Audio circuitry 2010 receives audio data from peripherals interface 2018, converts the audio data to an electrical signal, and transmits the electrical signal to speaker 2011. Speaker 2011 converts the electrical signal to human-audible sound waves. Audio circuitry 2010 also receives electrical signals converted by microphone 2013 from sound waves. Audio circuitry 2010 converts the electrical signal to audio data and transmits the audio data to peripherals interface 2018 for processing. Audio data may be retrieved from and/or transmitted to memory 2002 and/or RF circuitry 2008 by peripherals interface 2018. In some embodiments, audio circuitry 2010 also includes a headset jack (e.g., 1812,
I/O subsystem 2006 couples input/output peripherals on device 2000, such as touch screen 2012 and other input control devices 2016, to peripherals interface 2018. I/O subsystem 2006 may include display controller 2056 and one or more input controllers 2060 for other input or control devices. The one or more input controllers 2060 receive/send electrical signals from/to other input or control devices 2016. The other input control devices 2016 may include physical buttons (e.g., push buttons, rocker buttons, etc.), dials, slider switches, joysticks, click wheels, and so forth. In some alternate embodiments, input controller(s) 2060 may be coupled to any (or none) of the following: a keyboard, infrared port, USB port, and a pointer device such as a mouse. The one or more buttons (e.g., 1808,
Touch-sensitive display 2012 provides an input interface and an output interface between the device and a user. Display controller 2056 receives and/or sends electrical signals from/to touch screen 2012. Touch screen 2012 displays visual output to the user. The visual output may include graphics, text, icons, video, and any combination thereof (collectively termed “graphics”). In some embodiments, some or all of the visual output may correspond to user-interface objects.
Touch screen 2012 has a touch-sensitive surface, sensor or set of sensors that accepts input from the user based on haptic and/or tactile contact. Touch screen 2012 and display controller 2056 (along with any associated modules and/or sets of instructions in memory 2002) detect contact (and any movement or breaking of the contact) on touch screen 2012 and convert the detected contact into interaction with user-interface objects (e.g., one or more soft keys, icons, web pages or images) that are displayed on touch screen 2012. In an example embodiment, a point of contact between touch screen 2012 and the user corresponds to a finger of the user.
Touch screen 2012 may use LCD (liquid crystal display) technology, LPD (light emitting polymer display) technology, or LED (light emitting diode) technology, although other display technologies may be used in other embodiments. Touch screen 2012 and display controller 2056 may detect contact and any movement or breaking thereof using any of a variety of touch sensing technologies now known or later developed, including but not limited to capacitive, resistive, infrared, and surface acoustic wave technologies, as well as other proximity sensor arrays or other elements for determining one or more points of contact with touch screen 2012. In an example embodiment, projected mutual capacitance sensing technology is used.
Touch screen 2012 may have a video resolution in excess of 800 dpi. In some embodiments, the touch screen has a video resolution of approximately 860 dpi. The user may make contact with touch screen 2012 using any suitable object or appendage, such as a stylus, a finger, and so forth. In some embodiments, the user interface is designed to work primarily with finger-based contacts and gestures, which can be less precise than stylus-based input due to the larger area of contact of a finger on the touch screen. In some embodiments, the device translates the rough finger-based input into a precise pointer/cursor position or command for performing the actions desired by the user.
In some embodiments, in addition to the touch screen, device 2000 may include a touchpad (not shown) for activating or deactivating particular functions. In some embodiments, the touchpad is a touch-sensitive area of the device that, unlike the touch screen, does not display visual output. The touchpad may be a touch-sensitive surface that is separate from touch screen 2012 or an extension of the touch-sensitive surface formed by the touch screen.
Device 2000 also includes power system 2062 for powering the various components. Power system 2062 may include a power management system, one or more power sources (e.g., battery, alternating current (AC)), a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator (e.g., a light-emitting diode (LED)) and any other components associated with the generation, management and distribution of power in portable devices.
Device 2000 may also include one or more optical sensors or cameras 2064.
Device 2000 may also include one or more proximity sensors 2066.
Device 2000 includes one or more orientation sensors 2068. In some embodiments, the one or more orientation sensors 2068 include one or more accelerometers (e.g., one or more linear accelerometers and/or one or more rotational accelerometers). In some embodiments, the one or more orientation sensors 2068 include one or more gyroscopes. In some embodiments, the one or more orientation sensors 2068 include one or more magnetometers. In some embodiments, the one or more orientation sensors 2068 include one or more of Global Positioning System (GPS), Global Navigation Satellite System (GLONASS), and/or other global navigation system receivers. The GPS, GLONASS, and/or other global navigation system receivers may be used for obtaining information concerning the location and orientation (e.g., portrait or landscape) of device 2000. In some embodiments, the one or more orientation sensors 2068 include any combination of orientation/rotation sensors.
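For illustration, one common way an accelerometer reading could be mapped to a portrait/landscape determination is sketched below; the axis conventions and threshold are assumptions and this is not necessarily the method used by device 2000.

```python
# Illustrative sketch (not the device's actual algorithm): inferring
# portrait/landscape orientation from a 3-axis accelerometer reading by
# checking which axis carries most of the gravity vector.
def orientation_from_accel(ax, ay, az, threshold=0.5):
    """ax, ay, az in g; x = device's short axis, y = long axis (assumed)."""
    if abs(ay) >= abs(ax) and abs(ay) >= threshold:
        return "portrait" if ay < 0 else "portrait-upside-down"
    if abs(ax) >= threshold:
        return "landscape-left" if ax > 0 else "landscape-right"
    return "flat"

print(orientation_from_accel(0.02, -0.98, 0.05))   # -> "portrait"
```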
In some embodiments, the software components stored in memory 2002 include operating system 2026, communication module (or set of instructions) 2028, contact/motion module (or set of instructions) 2030, graphics module (or set of instructions) 2032, text input module (or set of instructions) 2034, Global Positioning System (GPS) module (or set of instructions) 2035, arbiter module 2058 and applications (or sets of instructions) 2036. Furthermore, in some embodiments memory 2002 stores device/global internal state 2057. Device/global internal state 2057 includes one or more of: active application state, indicating which applications, if any, are currently active; display state, indicating what applications, views or other information occupy various regions of touch screen display 2012; sensor state, including information obtained from the device's various sensors and input control devices 2016; and location information concerning the device's location and/or attitude.
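The following sketch suggests one possible organization of device/global internal state 2057 as a data structure; the field names and types are illustrative assumptions rather than the stored layout described by this disclosure.

```python
# Hedged sketch of one way device/global internal state 2057 could be
# organized; field names are illustrative, not taken from the source.
from dataclasses import dataclass, field
from typing import Dict, List, Optional, Tuple

@dataclass
class DeviceGlobalState:
    active_applications: List[str] = field(default_factory=list)    # active application state
    display_regions: Dict[str, str] = field(default_factory=dict)   # region -> occupying view/app
    sensor_state: Dict[str, object] = field(default_factory=dict)   # latest reading per sensor
    location: Optional[Tuple[float, float]] = None                  # (latitude, longitude)
    attitude: Optional[str] = None                                   # e.g. "portrait"

state = DeviceGlobalState(active_applications=["camera"], attitude="portrait")
print(state)
```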
Operating system 2026 (e.g., Darwin, RTXC, LINUX, UNIX, OS X, WINDOWS, or an embedded operating system such as VxWorks) includes various software components and/or drivers for controlling and managing general system tasks (e.g., memory management, storage device control, power management, etc.) and facilitates communication between various hardware and software components.
Communication module 2028 facilitates communication with other devices over one or more external ports 2024 and also includes various software components for handling data received by RF circuitry 2008 and/or external port 2024. External port 2024 (e.g., Universal Serial Bus (USB), FIREWIRE, etc.) is adapted for coupling directly to other devices or indirectly over a network (e.g., the Internet, wireless LAN, etc.). In some embodiments, the external port is a multi-pin (e.g., 30-pin) connector.
Contact/motion module 2030 may detect contact with touch screen 2012 (in conjunction with display controller 2056) and other touch sensitive devices (e.g., a touchpad or physical click wheel). Contact/motion module 2030 includes various software components for performing various operations related to detection of contact, such as determining if contact has occurred (e.g., detecting a finger-down event), determining if there is movement of the contact and tracking the movement across the touch-sensitive surface (e.g., detecting one or more finger-dragging events), and determining if the contact has ceased (e.g., detecting a finger-up event or a break in contact). Contact/motion module 2030 receives contact data from the touch-sensitive surface. Determining movement of the point of contact, which is represented by a series of contact data, may include determining speed (magnitude), velocity (magnitude and direction), and/or an acceleration (a change in magnitude and/or direction) of the point of contact. These operations may be applied to single contacts (e.g., one finger contacts) or to multiple simultaneous contacts (e.g., “multitouch”/multiple finger contacts). In some embodiments, contact/motion module 2030 and display controller 2056 detect contact on a touchpad.
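A minimal sketch of the motion quantities described above follows: given a series of timestamped contact points, speed, velocity, and acceleration may be estimated by finite differences. The sample format and function names are assumptions.

```python
# Minimal sketch: estimating speed, velocity, and acceleration of a contact
# point from its three most recent timestamped samples by finite differences.
import math

def contact_motion(samples):
    """samples: list of (t, x, y); returns (speed, velocity_xy, accel_xy)."""
    if len(samples) < 3:
        return None
    (t0, x0, y0), (t1, x1, y1), (t2, x2, y2) = samples[-3:]
    v1 = ((x1 - x0) / (t1 - t0), (y1 - y0) / (t1 - t0))   # earlier velocity
    v2 = ((x2 - x1) / (t2 - t1), (y2 - y1) / (t2 - t1))   # latest velocity
    speed = math.hypot(*v2)                                # magnitude only
    accel = ((v2[0] - v1[0]) / (t2 - t1), (v2[1] - v1[1]) / (t2 - t1))
    return speed, v2, accel

print(contact_motion([(0.00, 10, 10), (0.02, 14, 10), (0.04, 20, 10)]))
```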
Contact/motion module 2030 may detect a gesture input by a user. Different gestures on the touch-sensitive surface have different contact patterns. Thus, a gesture may be detected by detecting a particular contact pattern. For example, detecting a finger tap gesture includes detecting a finger-down event followed by detecting a finger-up (lift off) event at the same position (or substantially the same position) as the finger-down event (e.g., at the position of an icon). As another example, detecting a finger swipe gesture on the touch-sensitive surface includes detecting a finger-down event followed by detecting one or more finger-dragging events, and subsequently followed by detecting a finger-up (lift off) event.
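For illustration, the tap and swipe patterns described above could be distinguished as follows; the event representation and the movement "slop" value are assumptions made for this example only.

```python
# Illustrative classifier for the two gesture patterns described above: a tap
# (finger-down then finger-up at roughly the same position) versus a swipe
# (finger-down, one or more drags, then finger-up). Event format assumed.
def classify_gesture(events, slop=10.0):
    """events: list of (kind, x, y) with kind in {'down', 'drag', 'up'}."""
    if not events or events[0][0] != "down" or events[-1][0] != "up":
        return "unknown"
    dx = events[-1][1] - events[0][1]
    dy = events[-1][2] - events[0][2]
    moved = any(kind == "drag" for kind, _, _ in events[1:-1])
    if not moved and abs(dx) <= slop and abs(dy) <= slop:
        return "tap"
    if moved:
        return "swipe"
    return "unknown"

print(classify_gesture([("down", 50, 50), ("up", 52, 51)]))                      # tap
print(classify_gesture([("down", 50, 50), ("drag", 120, 52), ("up", 200, 55)]))  # swipe
```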
Graphics module 2032 includes various known software components for rendering and displaying graphics on touch screen 2012 or other display, including components for changing the intensity of graphics that are displayed. As used herein, the term “graphics” includes any object that can be displayed to a user, including without limitation text, web pages, icons (such as user-interface objects including soft keys), digital images, videos, animations and the like.
In some embodiments, graphics module 2032 stores data representing graphics to be used. Each graphic may be assigned a corresponding code. Graphics module 2032 receives, from applications etc., one or more codes specifying graphics to be displayed along with, if necessary, coordinate data and other graphic property data, and then generates screen image data to output to display controller 2056.
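As a hedged sketch of the code-based lookup described above, graphic codes received from applications could be resolved against stored graphic data before screen image data is composed; all names, codes, and the draw-list format below are assumptions.

```python
# Hypothetical sketch: resolving graphic codes (plus coordinates and property
# data) received from applications into a draw list for the display controller.
STORED_GRAPHICS = {
    0x01: "soft-key background",
    0x02: "camera icon",
    0x03: "text glyph atlas",
}

def compose_screen(requests):
    """requests: list of (code, x, y, properties) supplied by applications."""
    draw_list = []
    for code, x, y, props in requests:
        graphic = STORED_GRAPHICS.get(code)
        if graphic is None:
            continue                      # unknown code: skip rather than fail
        draw_list.append({"graphic": graphic, "x": x, "y": y, **props})
    return draw_list                      # handed on as screen image data

print(compose_screen([(0x02, 80, 10, {"scale": 1.0})]))
```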
Text input module 2034, which may be a component of graphics module 2032, provides soft keyboards for entering text in various applications (e.g., contacts 2037, e-mail 2040, IM 2041, browser 2047, and any other application that needs text input).
GPS module 2035 determines the location of the device and provides this information for use in various applications (e.g., to telephone 2038 for use in location-based dialing, to camera 2043 as picture/video metadata, and to applications that provide location-based services such as weather widgets, local yellow page widgets, and map/navigation widgets).
Applications 2036 may include the following modules (or sets of instructions), or a subset or superset thereof:
Examples of other applications 2036 that may be stored in memory 2002 include other word processing applications, other image editing applications, drawing applications, presentation applications, JAVA-enabled applications, encryption, digital rights management, voice recognition, and voice replication.
Each of the above-identified modules and applications corresponds to a set of executable instructions for performing one or more functions described above and the methods described in this application (e.g., the computer-implemented methods and other information processing methods described herein). These modules (i.e., sets of instructions) need not be implemented as separate software programs, procedures or modules, and thus various subsets of these modules may be combined or otherwise re-arranged in various embodiments. In some embodiments, memory 2002 may store a subset of the modules and data structures identified above. Furthermore, memory 2002 may store additional modules and data structures not described above.
In some embodiments, device 2000 is a device where operation of a predefined set of functions on the device is performed exclusively through a touch screen and/or a touchpad. By using a touch screen and/or a touchpad as the primary input control device for operation of device 2000, the number of physical input control devices (such as push buttons, dials, and the like) on device 2000 may be reduced.
The predefined set of functions that may be performed exclusively through a touch screen and/or a touchpad includes navigation between user interfaces. In some embodiments, the touchpad, when touched by the user, navigates device 2000 to a main, home, or root menu from any user interface that may be displayed on device 2000. In such embodiments, the touchpad may be referred to as a “menu button.” In some other embodiments, the menu button may be a physical push button or other physical input control device instead of a touchpad.
Device 2000 may also include one or more physical buttons, such as “home” or menu button 2104. As described previously, menu button 2104 may be used to navigate to any application 2036 in a set of applications that may be executed on device 2100. Alternatively, in some embodiments, the menu button 2104 is implemented as a soft key in a GUI displayed on touch screen 2012.
In one embodiment, device 2100 includes touch screen 2012, menu button 2104, push button 2106 for powering the device on/off and locking the device, volume adjustment button(s) 2108, Subscriber Identity Module (SIM) card slot 2110, headset jack 2112, and docking/charging external port 2124. Push button 2106 may be used to turn the power on/off on the device by depressing the button and holding the button in the depressed state for a predefined time interval; to lock the device by depressing the button and releasing the button before the predefined time interval has elapsed; and/or to unlock the device or initiate an unlock process. In an alternative embodiment, device 2100 may also accept verbal input for activation or deactivation of some functions through microphone 2013.
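For illustration only, the press-and-hold behavior of push button 2106 might be expressed as a timing rule like the following; the interval value, return values, and function names are assumptions and are not specified by this disclosure.

```python
# Illustrative sketch of the timing behavior described above: a long press
# powers the device off, while a press released before the predefined interval
# locks (or begins unlocking) it. The interval below is an assumed value.
PREDEFINED_HOLD_SECONDS = 2.0   # assumption for illustration only

def handle_push_button(press_time, release_time, device_locked):
    held = release_time - press_time
    if held >= PREDEFINED_HOLD_SECONDS:
        return "power-off"
    return "start-unlock" if device_locked else "lock"

print(handle_push_button(0.0, 2.5, device_locked=False))   # -> "power-off"
print(handle_push_button(0.0, 0.3, device_locked=False))   # -> "lock"
```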
It should be noted that, although many of the examples herein are given with reference to optical sensor/camera 2064 (on the front of a device), a rear-facing camera or optical sensor that is pointed opposite from the display may be used instead of or in addition to an optical sensor/camera 2064 on the front of a device.
Example Computer System
Various embodiments of a camera motion control system, including embodiments of magnetic position sensing, as described herein may be executed in one or more computer systems 2200, which may interact with various other devices. Note that any component, action, or functionality described above with respect to
In various embodiments, computer system 2200 may be a uniprocessor system including one processor 2210, or a multiprocessor system including several processors 2210 (e.g., two, four, eight, or another suitable number). Processors 2210 may be any suitable processor capable of executing instructions. For example, in various embodiments processors 2210 may be general-purpose or embedded processors implementing any of a variety of instruction set architectures (ISAs), such as the x86, PowerPC, SPARC, or MIPS ISAs, or any other suitable ISA. In multiprocessor systems, each of processors 2210 may commonly, but not necessarily, implement the same ISA.
System memory 2220 may be configured to store camera control program instructions 2222 and/or camera control data accessible by processor 2210. In various embodiments, system memory 2220 may be implemented using any suitable memory technology, such as static random access memory (SRAM), synchronous dynamic RAM (SDRAM), nonvolatile/Flash-type memory, or any other type of memory. In the illustrated embodiment, program instructions 2222 may be configured to implement a lens control application 2224 incorporating any of the functionality described above. Additionally, existing camera control data 2232 of memory 2220 may include any of the information or data structures described above. In some embodiments, program instructions and/or data may be received, sent or stored upon different types of computer-accessible media or on similar media separate from system memory 2220 or computer system 2200. While computer system 2200 is described as implementing the functionality of functional blocks of previous Figures, any of the functionality described herein may be implemented via such a computer system.
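As a hedged sketch (not the disclosed implementation), lens control application 2224 could include a feedback loop of the following general form, driving an autofocus actuator toward a target position using position-sensor feedback; the gain, step limit, and units are assumptions chosen for illustration.

```python
# Hypothetical sketch of a lens control loop of the kind program instructions
# 2222 might implement: a bounded proportional step toward a target lens
# position based on measured position feedback. Gains/units are assumptions.
def autofocus_step(target_um, measured_um, kp=0.8, max_step_um=5.0):
    """Return a bounded actuator command (micrometers) toward the target."""
    error = target_um - measured_um
    step = kp * error
    return max(-max_step_um, min(max_step_um, step))

position = 0.0
for _ in range(10):
    position += autofocus_step(target_um=20.0, measured_um=position)
print(round(position, 2))   # converges toward 20.0
```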
In one embodiment, I/O interface 2230 may be configured to coordinate I/O traffic between processor 2210, system memory 2220, and any peripheral devices in the device, including network interface 2240 or other peripheral interfaces, such as input/output devices 2250. In some embodiments, I/O interface 2230 may perform any necessary protocol, timing or other data transformations to convert data signals from one component (e.g., system memory 2220) into a format suitable for use by another component (e.g., processor 2210). In some embodiments, I/O interface 2230 may include support for devices attached through various types of peripheral buses, such as a variant of the Peripheral Component Interconnect (PCI) bus standard or the Universal Serial Bus (USB) standard, for example. In some embodiments, the function of I/O interface 2230 may be split into two or more separate components, such as a north bridge and a south bridge, for example. Also, in some embodiments some or all of the functionality of I/O interface 2230, such as an interface to system memory 2220, may be incorporated directly into processor 2210.
Network interface 2240 may be configured to allow data to be exchanged between computer system 2200 and other devices attached to a network 2285 (e.g., carrier or agent devices) or between nodes of computer system 2200. Network 2285 may in various embodiments include one or more networks including but not limited to Local Area Networks (LANs) (e.g., an Ethernet or corporate network), Wide Area Networks (WANs) (e.g., the Internet), wireless data networks, some other electronic data network, or some combination thereof. In various embodiments, network interface 2240 may support communication via wired or wireless general data networks, such as any suitable type of Ethernet network, for example; via telecommunications/telephony networks such as analog voice networks or digital fiber communications networks; via storage area networks such as Fibre Channel SANs, or via any other suitable type of network and/or protocol.
Input/output devices 2250 may, in some embodiments, include one or more display terminals, keyboards, keypads, touchpads, scanning devices, voice or optical recognition devices, or any other devices suitable for entering or accessing data by one or more computer systems 2200. Multiple input/output devices 2250 may be present in computer system 2200 or may be distributed on various nodes of computer system 2200. In some embodiments, similar input/output devices may be separate from computer system 2200 and may interact with one or more nodes of computer system 2200 through a wired or wireless connection, such as over network interface 2240.
As shown in
Those skilled in the art will appreciate that computer system 2200 is merely illustrative and is not intended to limit the scope of embodiments. In particular, the computer system and devices may include any combination of hardware or software that can perform the indicated functions, including computers, network devices, Internet appliances, PDAs, wireless phones, pagers, etc. Computer system 2200 may also be connected to other devices that are not illustrated, or instead may operate as a stand-alone system. In addition, the functionality provided by the illustrated components may in some embodiments be combined in fewer components or distributed in additional components. Similarly, in some embodiments, the functionality of some of the illustrated components may not be provided and/or other additional functionality may be available.
Those skilled in the art will also appreciate that, while various items are illustrated as being stored in memory or on storage while being used, these items or portions of them may be transferred between memory and other storage devices for purposes of memory management and data integrity. Alternatively, in other embodiments some or all of the software components may execute in memory on another device and communicate with the illustrated computer system via inter-computer communication. Some or all of the system components or data structures may also be stored (e.g., as instructions or structured data) on a computer-accessible medium or a portable article to be read by an appropriate drive, various examples of which are described above. In some embodiments, instructions stored on a computer-accessible medium separate from computer system 2200 may be transmitted to computer system 2200 via transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link. Various embodiments may further include receiving, sending or storing instructions and/or data implemented in accordance with the foregoing description upon a computer-accessible medium. Generally speaking, a computer-accessible medium may include a non-transitory, computer-readable storage medium or memory medium such as magnetic or optical media, e.g., disk or DVD/CD-ROM, volatile or non-volatile media such as RAM (e.g., SDRAM, DDR, RDRAM, SRAM, etc.), ROM, etc. In some embodiments, a computer-accessible medium may include transmission media or signals such as electrical, electromagnetic, or digital signals, conveyed via a communication medium such as a network and/or a wireless link.
The methods described herein may be implemented in software, hardware, or a combination thereof, in different embodiments. In addition, the order of the blocks of the methods may be changed, and various elements may be added, reordered, combined, omitted, modified, etc. Various modifications and changes may be made as would be obvious to a person skilled in the art having the benefit of this disclosure. The various embodiments described herein are meant to be illustrative and not limiting. Many variations, modifications, additions, and improvements are possible. Accordingly, plural instances may be provided for components described herein as a single instance. Boundaries between various components, operations and data stores are somewhat arbitrary, and particular operations are illustrated in the context of specific illustrative configurations. Other allocations of functionality are envisioned and may fall within the scope of claims that follow. Finally, structures and functionality presented as discrete components in the example configurations may be implemented as a combined structure or component. These and other variations, modifications, additions, and improvements may fall within the scope of embodiments as defined in the claims that follow.
Additional descriptions of embodiments:
CLAUSE 1: A camera, comprising:
This application is a continuation-in-part of U.S. patent application Ser. No. 17/175,469, filed Feb. 12, 2021, which is a continuation of U.S. patent application Ser. No. 16/083,819, filed Sep. 10, 2018 and now issued as U.S. Pat. No. 10,924,675, which is a national stage application under 35 U.S.C. § 371 of PCT Application No. PCT/US2017/021915, filed Mar. 10, 2017, which claims benefit of priority of U.S. Provisional Patent Application No. 62/307,416, filed Mar. 11, 2016, and U.S. Provisional Patent Application No. 62/399,095, filed Sep. 23, 2016, all of which are hereby incorporated by reference in their entirety.
Number | Name | Date | Kind |
---|---|---|---|
7170690 | Ophey | Jan 2007 | B2 |
7612953 | Nagai et al. | Nov 2009 | B2 |
7630618 | Nomura | Dec 2009 | B2 |
7952612 | Kakkori | May 2011 | B2 |
RE42642 | Sato et al. | Aug 2011 | E |
8111295 | Makimoto et al. | Feb 2012 | B2 |
8248497 | Tanimura et al. | Aug 2012 | B2 |
8264549 | Tokiwa et al. | Sep 2012 | B2 |
8488260 | Calvet et al. | Jul 2013 | B2 |
8548313 | Krueger | Oct 2013 | B2 |
8749643 | Lim et al. | Jun 2014 | B2 |
8817393 | Kwon | Aug 2014 | B2 |
8866918 | Gregory et al. | Oct 2014 | B2 |
8908086 | Kawai | Dec 2014 | B2 |
8947544 | Kawai | Feb 2015 | B2 |
8998514 | Gutierrez et al. | Apr 2015 | B2 |
9298017 | Sugawara et al. | Mar 2016 | B2 |
9316810 | Mercado | Apr 2016 | B2 |
9578217 | Gutierrez et al. | Feb 2017 | B2 |
9632280 | Yeo | Apr 2017 | B2 |
9736345 | Topliss et al. | Aug 2017 | B1 |
9773169 | Fulmer | Sep 2017 | B1 |
9807305 | Gutierrez | Oct 2017 | B2 |
10257433 | Eromaki | Apr 2019 | B2 |
10863094 | Sharma et al. | Dec 2020 | B2 |
10890734 | Sharma et al. | Jan 2021 | B1 |
10924675 | Hubert et al. | Feb 2021 | B2 |
11122205 | Sharma | Sep 2021 | B1 |
11163141 | Yao et al. | Nov 2021 | B2 |
11223766 | Sharma et al. | Jan 2022 | B2 |
11575835 | Xu et al. | Feb 2023 | B2 |
11582388 | Hubert et al. | Feb 2023 | B2 |
11635597 | Yuhong et al. | Apr 2023 | B2 |
20010001588 | Matz | May 2001 | A1 |
20030160902 | Mihara et al. | Aug 2003 | A1 |
20030184878 | Tsuzuki | Oct 2003 | A1 |
20040105025 | Scherling | Jun 2004 | A1 |
20040257677 | Matsusaka | Dec 2004 | A1 |
20060017815 | Stavely et al. | Jan 2006 | A1 |
20070024739 | Konno | Feb 2007 | A1 |
20070070525 | Taniyama | Mar 2007 | A1 |
20070279497 | Wada et al. | Dec 2007 | A1 |
20090147340 | Lipton et al. | Jun 2009 | A1 |
20090179995 | Fukumoto et al. | Jul 2009 | A1 |
20090295986 | Topliss et al. | Dec 2009 | A1 |
20090296238 | Kakuta | Dec 2009 | A1 |
20100165131 | Makimoto et al. | Jul 2010 | A1 |
20100315521 | Kunishige et al. | Dec 2010 | A1 |
20110141294 | Lam | Jun 2011 | A1 |
20110141339 | Seo | Jun 2011 | A1 |
20110235194 | Nobe | Sep 2011 | A1 |
20120106936 | Lim et al. | May 2012 | A1 |
20120120512 | Wade et al. | May 2012 | A1 |
20120224075 | Lim | Sep 2012 | A1 |
20120268642 | Kawai | Oct 2012 | A1 |
20130107068 | Kim et al. | May 2013 | A1 |
20130119785 | Han | May 2013 | A1 |
20130250169 | Kim | Sep 2013 | A1 |
20140009631 | Topliss | Jan 2014 | A1 |
20140111650 | Georgiev et al. | Apr 2014 | A1 |
20140139695 | Kawai | May 2014 | A1 |
20140255016 | Kim et al. | Sep 2014 | A1 |
20140327965 | Chen | Nov 2014 | A1 |
20150042870 | Chan et al. | Feb 2015 | A1 |
20150051097 | Anderton et al. | Feb 2015 | A1 |
20150135703 | Eddington et al. | May 2015 | A1 |
20150195439 | Miller et al. | Jul 2015 | A1 |
20150253543 | Mercado | Sep 2015 | A1 |
20150253647 | Mercado | Sep 2015 | A1 |
20150316748 | Cheo et al. | Nov 2015 | A1 |
20150350499 | Topliss | Dec 2015 | A1 |
20150358528 | Brodie et al. | Dec 2015 | A1 |
20160041363 | Hagiwara | Feb 2016 | A1 |
20160070115 | Miller et al. | Mar 2016 | A1 |
20160072998 | Yazawa | Mar 2016 | A1 |
20160073028 | Gleason et al. | Mar 2016 | A1 |
20160154204 | Lim et al. | Jun 2016 | A1 |
20160154252 | Miller | Jun 2016 | A1 |
20160161828 | Lee | Jun 2016 | A1 |
20160097937 | Lam | Jul 2016 | A1 |
20160209672 | Park et al. | Jul 2016 | A1 |
20160327773 | Choi et al. | Nov 2016 | A1 |
20160360111 | Thivent et al. | Dec 2016 | A1 |
20170023781 | Wang et al. | Jan 2017 | A1 |
20170054883 | Sharma et al. | Feb 2017 | A1 |
20170082829 | Kudo | Mar 2017 | A1 |
20170108670 | Ko | Apr 2017 | A1 |
20170155816 | Ito et al. | Jun 2017 | A1 |
20170285362 | Hu et al. | Oct 2017 | A1 |
20170324906 | Kang et al. | Nov 2017 | A1 |
20170351158 | Kudo | Dec 2017 | A1 |
20170357076 | Scheele | Dec 2017 | A1 |
20180041668 | Cui | Feb 2018 | A1 |
20180048793 | Gross et al. | Feb 2018 | A1 |
20180171991 | Miller et al. | Jun 2018 | A1 |
20180173080 | Enta | Jun 2018 | A1 |
20190014258 | Horesh | Jan 2019 | A1 |
20190020822 | Sharma et al. | Jan 2019 | A1 |
20190041661 | Murakami | Feb 2019 | A1 |
20200314338 | Johnson | Oct 2020 | A1 |
20210132327 | Sharma et al. | May 2021 | A1 |
20210168289 | Hubert et al. | Jun 2021 | A1 |
20210223563 | Miller | Jul 2021 | A1 |
20210409604 | Sharma et al. | Dec 2021 | A1 |
20220050277 | Yuhong et al. | Feb 2022 | A1 |
20220094853 | Xu et al. | Mar 2022 | A1 |
20220124249 | Sharma et al. | Apr 2022 | A1 |
20230188852 | Xu et al. | Jun 2023 | A1 |
20230199313 | Mahmoudzadeh | Jun 2023 | A1 |
Number | Date | Country |
---|---|---|
1940628 | Apr 2007 | CN |
101808191 | Aug 2010 | CN |
102135656 | Jul 2011 | CN |
102749697 | Oct 2012 | CN |
103117637 | May 2013 | CN |
20150051097 | May 2015 | CN |
104767915 | Jul 2015 | CN |
104898352 | Sep 2015 | CN |
10502204 | Nov 2015 | CN |
105025657 | Nov 2015 | CN |
204903924 | Dec 2015 | CN |
105573014 | May 2016 | CN |
105652557 | Jun 2016 | CN |
105807537 | Jul 2016 | CN |
106291862 | Jan 2017 | CN |
106470303 | Mar 2017 | CN |
H10285475 | Oct 1998 | JP |
2006078854 | Mar 2006 | JP |
2008203402 | Sep 2008 | JP |
2011154403 | Aug 2011 | JP |
2011203476 | Oct 2011 | JP |
2013072967 | Apr 2013 | JP |
2013125080 | Jun 2013 | JP |
2015146040 | Aug 2015 | JP |
2016028299 | Feb 2016 | JP |
20100048361 | May 2010 | KR |
20160000759 | Jan 2016 | KR |
201114249 | Apr 2011 | TW |
201418863 | May 2014 | TW |
I438543 | May 2014 | TW |
2016011801 | Jan 2016 | WO |
2020069391 | Apr 2020 | WO |
Entry |
---|
International Search Report and Written Opinion from PCT/US2017/021915, dated Mar. 10, 2017, Skattward Research LLC, pp. 1-17. |
Office Action from Japanese Application No. 2018-548102, (English Translation and Japanese version), dated Aug. 30, 2019, pp. 1-8. |
Extended European Search Report and Written Opinion from Application No. 20171319.5-1020, dated Oct. 8, 2020, pp. 1-9. |
Office Action from Chinese Application No. 201780016528.X, dated May 27, 2020, (English translation and Chinese version), pp. 1-26. |
Notice of Eligibility for Grant and Examination Report from Singapore Application No. 11201807830U dated Mar. 1, 2021, pp. 1-5. |
Search Report and Written Opinion from Singapore Application No. 11201807830U dated Feb. 21, 2020, pp. 1-10. |
Official Action from Chinese Application No. 201780016528.X dated Apr. 7, 2021, (English translation and Chinese version), pp. 1-20. |
Office Action dated Aug. 4, 2021 in Japanese Patent Application No. 2020-092323, Apple Inc., pp. 1-4 (including translation). |
Office Action from Japanese Application No. 2020-092323, dated Oct. 13, 2022, pp. 1-4. |
U.S. Appl. No. 17/542,252, filed Dec. 3, 2021, Yuhong Yao. |
U.S. Appl. No. 18/353,805, filed Jul. 17, 2023, Shashank Sharma, et al. |
Warren J. Smith, “Modern Lens Design,” In: Modern Lens Design, Jan. 1, 1992, McGraw-Hill, Inc., XP055152035, ISBN: 978-0-07-059178-3, pp. 25-27. |
Number | Date | Country | |
---|---|---|---|
20220247931 A1 | Aug 2022 | US |
Number | Date | Country | |
---|---|---|---|
62399095 | Sep 2016 | US | |
62307416 | Mar 2016 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16083819 | US | |
Child | 17175469 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17175469 | Feb 2021 | US |
Child | 17719287 | US |