This relates generally to straps, and, more particularly, to straps with snap and slider mechanisms.
It is sometimes desirable to provide items such as electronic devices with straps. Straps may allow devices to be worn or carried by a user.
An adjustable-length strap may have a strip of material. The strip of material may have exterior surfaces of leather, fabric, or other material and may include one or more internal layers such as a flexible magnet layer.
The strap may have a snap mechanism that removably couples the strap to an item such as an electronic device and/or a removable cover for an electronic device. The snap mechanism may have snaps coupled to the strip of material. There may be, as examples, two snaps or three snaps in the snap mechanism.
The strip may have left and right portions that are coupled together using sliders and the snap mechanism. The sliders may be fixedly attached to respective end portions of the left and right strip portions. A first of the sliders may have a slot that allows the first slider to slide along the right strip portion and a second of the sliders may have a slot that allows the second slider to slide along the left strip portion so that the length of the strap may be adjusted.
Electronic devices and other items may be provided with straps. For example, a cellular telephone or other electronic device may be provided with a fixed-length or adjustable-length carrying strap. Straps may be held in a user's hands, worn about a user's neck, worn across a user's body, and/or otherwise carried by a user. In some configurations, an electronic device may be provided with a strap that facilitates mounting of the device on the wrist, arm, leg, head, or other body part of a user. For example, a wristwatch may be provided with an adjustable-length wrist strap.
Item 12 may include an electronic device such as electronic device 16 (e.g., a battery pack, a cellular telephone, a tablet computer, other electronic equipment, etc.) and/or may include a carrying case that is removably attached to device 16 such as removable case 14 (e.g., a removable cover formed from polymer, leather, fabric, etc.). If desired, removable case 14 may incorporate batteries and other circuitry. Device 16 may include a display, buttons, touch sensors, force sensors, optical sensors, microphones for gathering voice input, and/or other sensors and input-output devices for gathering user input and providing a user with output. The user input may be used in controlling the operation of device 16. Carrying case 14 and/or electronic device 16 may be used as stand-alone equipment or may, if desired, be tethered to a head-mounted device or other additional electronic equipment (e.g., additional electronic equipment with input-output devices for receiving user input, for providing a user with output, etc.). When item 12 is used with additional electronic equipment, wired and/or wireless power paths and wired and/or wireless data communications paths may be used to transfer power and/or data between item 12 and the additional electronic equipment.
A snap mechanism such as snap mechanism 20 may be used to secure the ends of strip 18. Snap mechanism 20 may have one or more snap elements (sometimes referred to as snaps or snap members) that are detachably snapped together to secure strap 10 to an item (e.g., item 12 of
Strip 18 may be formed from a uniform length of material (with one or more sublayers) and/or different segments along the length of strip 18 may have different internal and/or external layer(s) of material. As an example, end portion 22 of strip 18 may have an exterior surface formed from fabric, whereas remaining portions of strip 18 may have an exterior surface formed from leather. If desired, all of strip 18 may be leather or all of strip 18 may be formed from fabric. Polymer layers and/or other materials may also be used to cover some or all of strip 18. For example, the surface of a portion of strip 18 may be covered with a layer of polymer or other material that is not present on other portions of strip 18.
As shown in
Strap 10 may, if desired, have adjustable clasps. The adjustable clasps, which may sometimes be referred to as adjustable sliders or sliding clasps, may allow the length of strap 10 to be adjusted. Consider, as an example, the arrangement of
In the example of
With this arrangement, attachment mechanism 30R holds the end of strip 18L in place on slider 28R, while the slot in slider 28R allows slider 28R to slide along the length of strip 18R. Attachment mechanism 30L holds the end of strip 18R in place on slider 28L, while the slot in slider 28L allows slider 28L to slide along the length of strip 18L. In this way, the separation distance L between sliders 28 along strap 10 may be adjusted. To shorten strap 10, slider 28L and/or slider 28R is moved along strip 18 towards item 12 (e.g., sliders 28 are moved apart to increase L and thereby reduce the size of loop 26). To lengthen strap 10, sliders 28 are moved towards each other, which decreases L and increases the size of loop 26. If desired, flexible magnetic structures may be embedded within some or all of strip 18 (e.g., at least in the portion of strip 18 between sliders 28) to help hold strips 18L and 18R next to each other (e.g., to reduce tangling).
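The inverse relationship between slider separation L and the size of loop 26 follows from conservation of strip length, and can be sketched numerically. The model below is illustrative only; the assumption of a fixed total strip length and the factor of two for the doubled-back section between the sliders are simplifications for clarity, not taken from the text above.

```python
def loop_length(total_strip_length: float, slider_separation: float) -> float:
    """Illustrative length-conservation model for the adjustable strap.

    Assumes the total length of strip material between the two fixed
    attachment points is constant, so material taken up by the
    doubled-back section between the sliders (roughly twice the slider
    separation L) is unavailable to the carrying loop.
    """
    doubled_section = 2 * slider_separation  # strip runs back on itself between sliders
    return total_strip_length - doubled_section

# Moving the sliders apart (increasing L) shortens the loop; moving
# them together (decreasing L) lengthens it, as described above.
long_loop = loop_length(100.0, 10.0)   # L = 10 -> loop of 80
short_loop = loop_length(100.0, 20.0)  # L = 20 -> loop of 60
assert short_loop < long_loop
```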
As described in connection with
An illustrative three-element snap mechanism is shown in
The middle snap in mechanism 20 (snap F1) has first portion F1T and second portion F1L, which are press-fit together in an opening in strip 18L to attach snap F1 to strip 18L. A ring member such as polymer ring 36 may be mounted in the center of snap F1 to help reduce binding and/or rattling between snap F1 and the other snaps of mechanism 20 so that snap F1 may smoothly and quietly rotate relative to protrusion MLP. Snap F2 has first portion F2T and second portion F2L, which are press-fit together to secure snap F2 within an opening in strip 18L. The openings in snaps F1 and F2 receive protrusion MLP of snap M along axis 40 when it is desired to close snap mechanism 20 by snapping together snaps M, F1, and F2. Ring-shaped member 38 in snap F2 (e.g., a ring of metal, polymer, etc.) may be used to create friction with protrusion MLP, thereby helping to hold snap mechanism 20 in its closed position.
Sliders 28 may be formed from one or more structures joined together using press-fit connections, adhesive, fasteners, welds, and/or other attachment mechanisms. An exploded perspective view of an illustrative slider 28 is shown in
Body 28B has a through-hole opening such as through slot 42 that receives a portion of strip 18 (e.g., portion 18A) for sliding motion (e.g., slot 42 receives strip portion 18A while allowing that portion of strip 18 to slide with respect to slider 28). Slider 28 may be configured to provide friction in slot 42 so that slider 28 is maintained in place on strip 18 until deliberately moved by a user to adjust the length of strap 10.
Body 28B also has a non-through-hole opening such as slot 44. The end of a portion of strip 18 such as portion 18B may pass through slot 44. A loop or other structure in the end portion of strip 18 that passes through slot 44 into the interior of body 28B may receive a strip retention member such as strip retention pin 46. Pin 46 may be mounted into recesses in body 28B or other pin retention structures in the interior of body 28B through opening 50 in body 28B. Cap 52 may then be press-fit into opening 50 to cover opening 50 and thereby close body 28B. With this type of arrangement, pin 46 and the corresponding loop at the end of strip portion 18B form a fixed slider attachment mechanism (see, e.g., attachment mechanisms 30L and 30R of
Strip 18 may include one or more layers (sometimes referred to as strip-shaped layers, strips, elongated layers, strip layers, band layers, strap layers, etc.). These layers are used in providing strip 18 with desired properties. As an example, strip 18 may have a strengthening layer such as strengthening layer 60 (e.g., a layer of fabric, polymer, etc.) that is looped around pin 46 to fixedly attach strip 18 to pin 46 as shown in
As shown in
The layers of material forming strip 18 may include fabric layers (e.g., thin sheets of fabric and/or fabric loops that are folded to form doubled-up fabric layers), polymer layers, layers of thin bendable metal, layers with magnetic material (e.g., magnetic particles embedded in flexible polymer binder to form a flexible magnet such as an elongated strip-shaped flexible magnet), adhesive layers, composite materials (e.g., polymer binder with embedded flexible strands of material such as polymer yarn, fiberglass strands, metal strands, etc.), layers formed from natural materials such as cotton, leather, wool, bamboo, and/or other natural materials, and/or layers of other materials, and/or combinations of these materials. In some configurations, the outermost layers of material on strip 18 (e.g., layers on the upper and lower opposing surfaces of strip 18) may be formed from materials that resist wear and/or have a desired cosmetic appearance. As an example, some or all of the outermost layers of strip 18 may be formed from materials such as leather, fabric, and/or polymer.
If desired, a magnetic layer may be included in the layers of strip 18. Magnetic layers (e.g., magnets) may attract one portion of strip 18 to another. For example, an elongated strip-shaped flexible magnet may be embedded in the core of strip 18 so that overlapping portions of strip 18 (e.g., strips 18R and 18L of
In general, any suitable layers may be included in strip 18 (e.g., magnetic layers, strengthening layers, layers that adjust the stiffness of strip 18, layers of adhesive to attach other layers together, layers to adjust strap thickness and/or weight, etc.). In the illustrative configuration of
The layers used in forming illustrative strip 18 of
If desired, one or more of the layers of strip 18 may include fabric. A strip-shaped sheet of fabric may be provided. If desired, a woven, knit, or braided tube of fabric may be used in forming one or more layers in strip 18. For example, layer 88 and/or layer 90 may each be formed by a collapsed tube of fabric such as fabric tube 92 of
In some embodiments, a tube of fabric may be used to enclose other layers of material for strip 18. This type of arrangement is shown in
Sliders 28 may be formed from one or more structures that are press-fit together and/or are otherwise joined. In the example of
As shown in the cross-sectional side view of member 28A of
As shown in
In the illustrative configuration for slider 28 that is shown in
Spring 136 may press button member 132 in direction 138. This causes cam surface 142 of member 132 to bear against pin 140, forcing portion 150 of member 132 in direction 148 and thereby squeezing slot 146 about the strip in slot 146. This holds slider 28 in place against strip 18. When it is desired to release strip 18 from slot 146, a user may press button member 132 in direction 152, which relieves cam pressure from portion 150 and allows portion 150 to move in direction 154, thereby reducing friction on strip 18.
In the illustrative configuration of
Slider 28 may, if desired, include biasing structures such as spring 174. In the example of
In some embodiments, slider 28 may be formed from an outer shell that is filled with polymer or other filler material that forms a core for the shell. As shown in
If desired, strap 10 may be coupled to items that are worn on a user's wrist or other body part. Consider, as an example, the illustrative configuration of
One end of strip 18 may be provided with an adjustable-length loop such as loop 186. The tip of strip 18 at this end of strip 18 may be fixedly attached to slider 28 at fixed attachment point 188. Slider 28 may have a slot that allows slider 28 to slide along the length of strip 18. When a user desires to adjust the size of loop 186 (e.g., to loosen or tighten strap 10 so that strap 10 and device 16 may fit comfortably on the user's wrist), slider 28 may be moved towards device 16 or away from device 16 to adjust the separation distance D between slider 28 and device 16 and thereby adjust the size of loop 186 and the length of strap 10.
As described above, one aspect of the present technology is the gathering and use of information such as information from input-output devices. The present disclosure contemplates that in some instances, data may be gathered that includes personal information data that uniquely identifies or can be used to contact or locate a specific person. Such personal information data can include demographic data, location-based data, telephone numbers, email addresses, Twitter IDs, home addresses, data or records relating to a user's health or level of fitness (e.g., vital signs measurements, medication information, exercise information), date of birth, username, password, biometric information, or any other identifying or personal information.
The present disclosure recognizes that the use of such personal information, in the present technology, can be used to the benefit of users. For example, the personal information data can be used to deliver targeted content that is of greater interest to the user. Accordingly, use of such personal information data enables users to have control of the delivered content. Further, other uses for personal information data that benefit the user are also contemplated by the present disclosure. For instance, health and fitness data may be used to provide insights into a user's general wellness, or may be used as positive feedback to individuals using technology to pursue wellness goals.
The present disclosure contemplates that the entities responsible for the collection, analysis, disclosure, transfer, storage, or other use of such personal information data will comply with well-established privacy policies and/or privacy practices. In particular, such entities should implement and consistently use privacy policies and practices that are generally recognized as meeting or exceeding industry or governmental requirements for keeping personal information data private and secure. Such policies should be easily accessible by users, and should be updated as the collection and/or use of data changes. Personal information from users should be collected for legitimate and reasonable uses of the entity and not shared or sold outside of those legitimate uses. Further, such collection/sharing should occur after receiving the informed consent of the users. Additionally, such entities should consider taking any needed steps for safeguarding and securing access to such personal information data and ensuring that others with access to the personal information data adhere to their privacy policies and procedures. Further, such entities can subject themselves to evaluation by third parties to certify their adherence to widely accepted privacy policies and practices. In addition, policies and practices should be adapted for the particular types of personal information data being collected and/or accessed and adapted to applicable laws and standards, including jurisdiction-specific considerations. For instance, in the United States, collection of or access to certain health data may be governed by federal and/or state laws, such as the Health Insurance Portability and Accountability Act (HIPAA), whereas health data in other countries may be subject to other regulations and policies and should be handled accordingly. Hence, different privacy practices should be maintained for different personal data types in each country.
Despite the foregoing, the present disclosure also contemplates embodiments in which users selectively block the use of, or access to, personal information data. That is, the present disclosure contemplates that hardware and/or software elements can be provided to prevent or block access to such personal information data. For example, the present technology can be configured to allow users to select to “opt in” or “opt out” of participation in the collection of personal information data during registration for services or anytime thereafter. In another example, users can select not to provide certain types of user data. In yet another example, users can select to limit the length of time user-specific data is maintained. In addition to providing “opt in” and “opt out” options, the present disclosure contemplates providing notifications relating to the access or use of personal information. For instance, a user may be notified upon downloading an application (“app”) that their personal information data will be accessed and then reminded again just before personal information data is accessed by the app.
Moreover, it is the intent of the present disclosure that personal information data should be managed and handled in a way to minimize risks of unintentional or unauthorized access or use. Risk can be minimized by limiting the collection of data and deleting data once it is no longer needed. In addition, and when applicable, including in certain health related applications, data de-identification can be used to protect a user's privacy. De-identification may be facilitated, when appropriate, by removing specific identifiers (e.g., date of birth, etc.), controlling the amount or specificity of data stored (e.g., collecting location data at a city level rather than at an address level), controlling how data is stored (e.g., aggregating data across users), and/or other methods.
Therefore, although the present disclosure broadly covers use of information that may include personal information data to implement one or more various disclosed embodiments, the present disclosure also contemplates that the various embodiments can also be implemented without the need for accessing personal information data. That is, the various embodiments of the present technology are not rendered inoperable due to the lack of all or a portion of such personal information data.
Physical environment: A physical environment refers to a physical world that people can sense and/or interact with without aid of electronic systems. Physical environments, such as a physical park, include physical articles, such as physical trees, physical buildings, and physical people. People can directly sense and/or interact with the physical environment, such as through sight, touch, hearing, taste, and smell.
Computer-generated reality: In contrast, a computer-generated reality (CGR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic system. In CGR, a subset of a person's physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the CGR environment are adjusted in a manner that comports with at least one law of physics. For example, a CGR system may detect a person's head turning and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), adjustments to characteristic(s) of virtual object(s) in a CGR environment may be made in response to representations of physical motions (e.g., vocal commands). A person may sense and/or interact with a CGR object using any one of their senses, including sight, sound, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides the perception of point audio sources in 3D space. In another example, audio objects may enable audio transparency, which selectively incorporates ambient sounds from the physical environment with or without computer-generated audio. In some CGR environments, a person may sense and/or interact only with audio objects. Examples of CGR include virtual reality and mixed reality.
Virtual reality: A virtual reality (VR) environment refers to a simulated environment that is designed to be based entirely on computer-generated sensory inputs for one or more senses. A VR environment comprises a plurality of virtual objects with which a person may sense and/or interact. For example, computer-generated imagery of trees, buildings, and avatars representing people are examples of virtual objects. A person may sense and/or interact with virtual objects in the VR environment through a simulation of the person's presence within the computer-generated environment, and/or through a simulation of a subset of the person's physical movements within the computer-generated environment.
Mixed reality: In contrast to a VR environment, which is designed to be based entirely on computer-generated sensory inputs, a mixed reality (MR) environment refers to a simulated environment that is designed to incorporate sensory inputs from the physical environment, or a representation thereof, in addition to including computer-generated sensory inputs (e.g., virtual objects). On a virtuality continuum, a mixed reality environment is anywhere between, but not including, a wholly physical environment at one end and a virtual reality environment at the other end. In some MR environments, computer-generated sensory inputs may respond to changes in sensory inputs from the physical environment. Also, some electronic systems for presenting an MR environment may track location and/or orientation with respect to the physical environment to enable virtual objects to interact with real objects (that is, physical articles from the physical environment or representations thereof). For example, a system may account for movements so that a virtual tree appears stationary with respect to the physical ground. Examples of mixed realities include augmented reality and augmented virtuality.
Augmented reality: An augmented reality (AR) environment refers to a simulated environment in which one or more virtual objects are superimposed over a physical environment, or a representation thereof. For example, an electronic system for presenting an AR environment may have a transparent or translucent display through which a person may directly view the physical environment. The system may be configured to present virtual objects on the transparent or translucent display, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. Alternatively, a system may have an opaque display and one or more imaging sensors that capture images or video of the physical environment, which are representations of the physical environment. The system composites the images or video with virtual objects, and presents the composition on the opaque display. A person, using the system, indirectly views the physical environment by way of the images or video of the physical environment, and perceives the virtual objects superimposed over the physical environment. As used herein, a video of the physical environment shown on an opaque display is called “pass-through video,” meaning a system uses one or more image sensor(s) to capture images of the physical environment, and uses those images in presenting the AR environment on the opaque display. Further alternatively, a system may have a projection system that projects virtual objects into the physical environment, for example, as a hologram or on a physical surface, so that a person, using the system, perceives the virtual objects superimposed over the physical environment. An augmented reality environment also refers to a simulated environment in which a representation of a physical environment is transformed by computer-generated sensory information. For example, in providing pass-through video, a system may transform one or more sensor images to impose a select perspective (e.g., viewpoint) different than the perspective captured by the imaging sensors. As another example, a representation of a physical environment may be transformed by graphically modifying (e.g., enlarging) portions thereof, such that the modified portion may be representative but not photorealistic versions of the originally captured images. As a further example, a representation of a physical environment may be transformed by graphically eliminating or obfuscating portions thereof.
Augmented virtuality: An augmented virtuality (AV) environment refers to a simulated environment in which a virtual or computer-generated environment incorporates one or more sensory inputs from the physical environment. The sensory inputs may be representations of one or more characteristics of the physical environment. For example, an AV park may have virtual trees and virtual buildings, but people with faces photorealistically reproduced from images taken of physical people. As another example, a virtual object may adopt a shape or color of a physical article imaged by one or more imaging sensors. As a further example, a virtual object may adopt shadows consistent with the position of the sun in the physical environment.
Hardware: There are many different types of electronic systems that enable a person to sense and/or interact with various CGR environments. Examples include head mounted systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person's eyes (e.g., similar to contact lenses), headphones/earphones, speaker arrays, input systems (e.g., wearable or handheld controllers with or without haptic feedback), smartphones, tablets, and desktop/laptop computers. A head mounted system may have one or more speaker(s) and an integrated opaque display. Alternatively, a head mounted system may be configured to accept an external opaque display (e.g., a smartphone). The head mounted system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head mounted system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person's eyes. The display may utilize digital light projection, OLEDs, LEDs, uLEDs, liquid crystal on silicon, laser scanning light sources, or any combination of these technologies. The medium may be an optical waveguide, a hologram medium, an optical combiner, an optical reflector, or any combination thereof. In one embodiment, the transparent or translucent display may be configured to become opaque selectively. Projection-based systems may employ retinal projection technology that projects graphical images onto a person's retina. Projection systems also may be configured to project virtual objects into the physical environment, for example, as a hologram or on a physical surface.
The foregoing is merely illustrative and various modifications can be made to the described embodiments. The foregoing embodiments may be implemented individually or in any combination.
This application claims the benefit of U.S. provisional patent application No. 63/108,960, filed Nov. 3, 2020, which is hereby incorporated by reference herein in its entirety.