The present disclosure is in the technical field of autonomous vehicle sensors and navigation. More particularly, the present disclosure is directed to adapting proximity sensors to be useful in detecting objects around the autonomous vehicle for use in controlling operation of the autonomous vehicle.
Autonomous vehicles have the ability to minimize the human effort involved in performing everyday tasks. For example, autonomous vehicles may be used as cleaning devices to help maintain and clean surfaces, such as hardwood floors, carpets, and the like. While autonomous vehicles are useful, it can be challenging for autonomous vehicles to operate in a variety of different locations. This challenge is especially acute when an autonomous vehicle operates in an environment with a number of different types of objects and those types of objects are not always accurately detectable by the same arrangement of sensors located on the autonomous vehicle.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one embodiment, a method uses an autonomous vehicle that is configured to move across a floor surface in an environment. The autonomous vehicle includes a proximity sensor that is positionable at different heights on the autonomous vehicle. The method includes determining a location of the autonomous vehicle within the environment and determining a proximity sensor height based on the location of the autonomous vehicle within the environment. The method further includes positioning the proximity sensor at a height on the autonomous vehicle based on the proximity sensor height and receiving, from the proximity sensor, a signal indicative of a distance to an object within the environment at the height of the proximity sensor on the autonomous vehicle.
In one example, determining the location, determining the proximity sensor height, positioning the proximity sensor, and receiving the signal are performed by the autonomous vehicle. In another example, determining the location, determining the proximity sensor height, positioning the proximity sensor, and receiving the signal are performed by at least one of the autonomous vehicle and a remote computing device. In another example, each of the autonomous vehicle and the remote computing device performs at least one of determining the location, determining the proximity sensor height, positioning the proximity sensor, and receiving the signal. In another example, the autonomous vehicle is communicatively coupled to the remote computing device via a network. In another example, the method further includes controlling an operation of the autonomous vehicle based on the signal indicative of the distance to the object. In another example, the operation of the autonomous vehicle includes at least one of a position of the autonomous vehicle, an orientation of the autonomous vehicle, a speed of the autonomous vehicle, or an acceleration of the autonomous vehicle. In another example, the operation of the autonomous vehicle includes one or more of navigation relative to the object in the environment or object avoidance of the object in the environment.
In another example, the proximity sensor is positionable at a distinct number of different heights on the autonomous vehicle. In another example, the proximity sensor is positionable at any height within a range of heights on the autonomous vehicle. In another example, positioning the proximity sensor is performed while the autonomous vehicle is moving across the floor surface in the environment. In another example, determining the proximity sensor height based on the location of the autonomous vehicle within the environment includes determining that the location of the autonomous vehicle does not have a pre-associated proximity sensor height and determining the proximity sensor height using sensor readings from an on-board sensor. In another example, the proximity sensor is the on-board sensor and determining the proximity sensor height includes moving the proximity sensor to a number of the different heights and selecting one of the number of the different heights as the proximity sensor height based on readings of the proximity sensor at the number of the different heights. In another example, the location of the autonomous vehicle that does not have a pre-associated proximity sensor height is an unknown location or an unmapped location.
In another embodiment, a system includes an autonomous vehicle, a location element, a proximity sensor, and a movement mechanism. The autonomous vehicle is configured to move across a floor surface of an environment. The location element is configured to determine a location of the autonomous vehicle within the environment. The proximity sensor is coupled to the autonomous vehicle. The movement mechanism is configured to position the proximity sensor at different heights on the autonomous vehicle. The movement mechanism is configured to position the proximity sensor in response to receiving instructions based on a proximity sensor height, and the proximity sensor height is determined based on the location of the autonomous vehicle determined by the location element. The proximity sensor is configured to generate a signal indicative of a distance to an object within the environment at the height of the proximity sensor on the autonomous vehicle.
In one example, the autonomous vehicle further comprises at least one processing element and at least one memory having instructions stored therein, and the instructions, in response to execution by the at least one processing element, cause the autonomous vehicle to determine the proximity sensor height based on the location of the autonomous vehicle determined by the location element and instruct the movement mechanism to position the proximity sensor based on the proximity sensor height. In another example, the instructions, in response to execution by the at least one processing element, further cause the autonomous vehicle to control an operation of the autonomous vehicle based on the signal indicative of the distance to the object. In another example, the operation of the autonomous vehicle includes at least one of a position of the autonomous vehicle, an orientation of the autonomous vehicle, a speed of the autonomous vehicle, or an acceleration of the autonomous vehicle. In another example, the operation of the autonomous vehicle includes one or more of navigation relative to the object in the environment or object avoidance of the object in the environment.
In another example, the system further includes a housing having at least one aperture and the proximity sensor is configured to direct a field through the at least one aperture toward the object in the environment. In another example, the at least one aperture includes a plurality of apertures, and wherein the movement mechanism is configured to selectively position the proximity sensor at one of the plurality of apertures. In another example, the at least one aperture includes an elongated aperture, and the movement mechanism is configured to selectively position the proximity sensor at any height between a lower end of the elongated aperture and an upper end of the elongated aperture. In another example, the system further includes a remote computing device communicatively coupled to the autonomous vehicle via a network and the remote computing device is configured to perform at least one of determining the location of the autonomous vehicle, determining the proximity sensor height based on the location of the autonomous vehicle, or instructing the movement mechanism to position the proximity sensor based on the proximity sensor height.
In another embodiment, a non-transitory computer-readable medium has instructions embodied thereon for using an autonomous vehicle. The autonomous vehicle is configured to move across a floor surface in an environment and the autonomous vehicle includes a proximity sensor that is positionable at different heights on the autonomous vehicle. The instructions, in response to execution by a processing element in the autonomous vehicle, cause the autonomous vehicle to determine a location of the autonomous vehicle within the environment, determine a proximity sensor height based on the location of the autonomous vehicle within the environment, position the proximity sensor at a height on the autonomous vehicle based on the proximity sensor height, and receive, from the proximity sensor, a signal indicative of a distance to an object within the environment at the height of the proximity sensor on the autonomous vehicle.
In one example, the instructions, in response to execution by the processing element in the autonomous vehicle, further cause the autonomous vehicle to control an operation of the autonomous vehicle based on the signal indicative of the distance to the object. In another example, the operation of the autonomous vehicle includes at least one of a position of the autonomous vehicle, an orientation of the autonomous vehicle, a speed of the autonomous vehicle, or an acceleration of the autonomous vehicle. In another example, the operation of the autonomous vehicle includes one or more of navigation relative to the object in the environment or object avoidance of the object in the environment. In another example, the proximity sensor is positionable at a distinct number of different heights on the autonomous vehicle. In another example, the proximity sensor is positionable at any height within a range of heights on the autonomous vehicle. In another example, the instructions to position the proximity sensor cause the proximity sensor to be positioned while the autonomous vehicle is moving across the floor surface in the environment.
The foregoing aspects and many of the attendant advantages of the disclosed subject matter will become more readily appreciated as the same become better understood by reference to the following detailed description, when taken in conjunction with the accompanying drawings, wherein:
The present disclosure describes embodiments of using a proximity sensor on an autonomous vehicle. In some embodiments, autonomous vehicles move across floor surfaces in environments. While moving across the floor surfaces of the environments, the autonomous vehicles may encounter a number of different objects. Some of the objects can be used for navigation by controlling the movement of the autonomous vehicle based on sensor readings of distances to those objects. The detection of objects in an environment can be used for object avoidance so that the autonomous vehicle does not collide with the objects. However, such objects can be difficult to detect because of their differing shapes and sizes.
In some embodiments disclosed herein, an autonomous vehicle includes a proximity sensor that is positionable on the autonomous vehicle at different heights. A location of the autonomous vehicle within the environment is determined. A proximity sensor height is determined based on the location of the autonomous vehicle within the environment. The proximity sensor is positioned at a height on the autonomous vehicle based on the proximity sensor height. A signal is received from the proximity sensor where the signal is indicative of a distance to an object within the environment at the height of the proximity sensor on the autonomous vehicle. An operation of the autonomous vehicle can be controlled based on the signal indicative of the distance to the object.
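The sequence described above can be sketched as a simple control cycle. The following Python sketch is illustrative only; every name in it (the area-to-height table, `determine_location`, `run_cycle`, and the vehicle structure) is a hypothetical stand-in and not an element of the disclosure:

```python
# Illustrative control cycle; all names and values are hypothetical
# stand-ins, not elements of the disclosure.
HEIGHT_BY_AREA_M = {"aisle": 0.10, "produce": 0.45}  # example heights in meters

def determine_location(vehicle):
    # Stand-in for any localization method (odometry, beacons, mapping, ...).
    return vehicle["area"]

def run_cycle(vehicle):
    area = determine_location(vehicle)
    # Determine the proximity sensor height from the location, falling back
    # to the current height when the area has no associated height.
    height = HEIGHT_BY_AREA_M.get(area, vehicle["sensor_height"])
    vehicle["sensor_height"] = height            # position the proximity sensor
    distance = vehicle["read_distance"](height)  # signal from the sensor
    return height, distance

vehicle = {"area": "produce", "sensor_height": 0.10,
           "read_distance": lambda h: 0.50}
print(run_cycle(vehicle))  # -> (0.45, 0.5)
```

The sketch deliberately omits the control step that acts on the returned distance, which is described further below in connection with the operation elements.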
The autonomous vehicle 102 may be capable of navigating through the environment 100 to follow particular routes. In the embodiment depicted in
One difficulty with navigation of autonomous vehicles is the ability of the autonomous vehicle to maintain particular directions as it travels within an environment. One method of maintaining a particular direction with respect to walls is the use of a proximity sensor. In the embodiments depicted in
In the embodiments depicted in
Depicted in
In the area shown in
In the area shown in
In the area shown in
In the embodiments described above, the proximity sensors on the autonomous vehicles are positioned such that the fields emitted by the proximity sensors extend substantially perpendicular to the autonomous vehicles' direction of travel. This arrangement may permit the autonomous vehicles to use objects to the side of the autonomous vehicles as a guide for navigation. In other embodiments, proximity sensors on autonomous vehicles are positioned such that the fields are emitted by the proximity sensors in other directions and can be used for other purposes. Depicted in
Depicted in
In the area shown in
In the area shown in
In the area shown in
As shown above, a proximity sensor mounted to a side of an autonomous vehicle may be useful for navigating, including following an offset from a fixed object and avoidance of movable objects. However, these solutions have drawbacks in that proximity sensors fixedly mounted to autonomous vehicles do not provide accurate readings in all environments. It would be advantageous to have a way to accommodate objects at a variety of different heights with respect to floor surfaces.
In the depicted embodiment, the apertures 424 include five distinct apertures. The proximity sensor 414 is capable of being positioned to emit a field through any one of the apertures 424. In the embodiment shown in
In the depicted embodiment, the aperture 424′ is an elongated aperture. The proximity sensor 414′ is capable of being positioned at any number of positions within the aperture 424′ to emit a field through the aperture 424′. In the embodiment shown in
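The two arrangements differ only in how a requested height maps to an achievable one: a discrete set of apertures snaps a requested height to the nearest aperture, while an elongated aperture clamps it to a continuous range. The following minimal sketch assumes invented aperture heights and is not drawn from the disclosure:

```python
# Discrete apertures: snap a requested height to the nearest aperture.
# Elongated aperture: clamp a requested height to the available range.
# All heights here are invented example values in meters.
APERTURE_HEIGHTS_M = [0.05, 0.15, 0.25, 0.35, 0.45]  # five distinct apertures
ELONGATED_RANGE_M = (0.05, 0.45)                      # lower and upper ends

def snap_to_aperture(requested_m):
    return min(APERTURE_HEIGHTS_M, key=lambda h: abs(h - requested_m))

def clamp_to_range(requested_m, height_range=ELONGATED_RANGE_M):
    low, high = height_range
    return max(low, min(high, requested_m))

print(snap_to_aperture(0.28))  # -> 0.25
print(clamp_to_range(0.60))    # -> 0.45
```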
The ability to move a proximity sensor on an autonomous vehicle to different heights can be particularly useful if controlled based on the location of the autonomous vehicle within an environment. This benefit can be obtained regardless of whether the proximity sensor is positionable at a distinct number of heights (e.g., the variable-height proximity sensor system 422 on autonomous vehicle 402) or at any number of different heights (e.g., the variable-height proximity sensor system 422′ on autonomous vehicle 402′). Examples of the benefits of this ability are depicted in
Depicted in
The grocery store environment 500 includes a number of fixtures placed on the floor surface 504. The grocery store environment 500 includes shelves 522 and shelves 524. The shelves 522 and 524 are spaced apart to form aisles between neighboring ones of the shelves 522 and 524. The grocery store environment 500 also includes produce islands 526 and produce display shelves 528 that are located to the left of the shelves 524. The grocery store environment 500 also includes bakery display case 530, bakery display tables 532, and bakery corner display 534. The grocery store environment 500 also includes checkout stands 536. Checkout endcap shelves 538 are located next to the checkout stands 536 and a railing 540 provides a barrier between the checkout stands 536 and the shelves 524.
In the depicted embodiment, the autonomous vehicles 502 are moving across the floor surface 504 in the grocery store environment 500. The autonomous vehicle 502₁ is moving along a route 510₁ that is at an offset from one of the produce islands 526. The autonomous vehicle 502₂ is moving along a route 510₂ that is at an offset from one of the shelves 524. The autonomous vehicle 502₃ is moving along a route 510₃ that passes between the checkout stands 536 at an offset from one of the checkout stands 536, turns along an end of and at an offset from one of the checkout stands 536, and then turns along the railing 540 at an offset from the railing 540. A rear view of each of the autonomous vehicle 502₁, autonomous vehicle 502₂, and autonomous vehicle 502₃ is depicted, respectively, in
In
In
In
The shape of the bumper 548 may not permit the variable-height proximity sensor 514₃ to get an accurate reading of the location of the bumper 548, and the bumper 548 may not extend continuously around the checkout stand 536. The variable-height proximity sensor 514₃ has been positioned at a height with respect to the floor surface 504 so that the field 516₃ is above the bumper 548 and impinges on the checkout stand 536. This allows the variable-height proximity sensor 514₃ to get a reliable reading of the distance to the checkout stand 536 and the autonomous vehicle 502₃ to control its operation based on a signal from the variable-height proximity sensor 514₃ in order to follow the route 510₃.
As can be seen in
While the grocery store environment 500 has been depicted in
Depicted in
The autonomous vehicle 602 also includes memory 608 configured to store information. The memory 608 is communicatively coupled to the processing element 604. In some embodiments, the memory 608 includes information about proximity sensor heights based on areas within an environment. For example, using the embodiment of the grocery store environment 500 shown in
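One way such an association between areas and sensor heights could be stored, purely as an illustration (the region bounds, names, and heights below are invented and not part of the disclosure), is a table of floor regions keyed to heights:

```python
# Hypothetical region-to-height table; bounds and heights are invented.
AREAS = [
    {"name": "produce", "bounds": (0.0, 0.0, 10.0, 20.0), "height_m": 0.45},
    {"name": "aisles",  "bounds": (10.0, 0.0, 40.0, 20.0), "height_m": 0.10},
]

def height_for_location(x, y, default_m=0.10):
    """Return the pre-associated height for (x, y), or a default if unmapped."""
    for area in AREAS:
        x0, y0, x1, y1 = area["bounds"]
        if x0 <= x < x1 and y0 <= y < y1:
            return area["height_m"]
    return default_m  # location with no pre-associated height

print(height_for_location(5.0, 5.0))   # -> 0.45
print(height_for_location(50.0, 5.0))  # -> 0.1
```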
The autonomous vehicle 602 also includes a movement mechanism 612 and a proximity sensor 614. Each of the movement mechanism 612 and the proximity sensor 614 is communicatively coupled to the processing element 604. The movement mechanism 612 is coupled to the proximity sensor 614 such that the movement mechanism 612 is configured to move the proximity sensor 614 to change its height with respect to a floor surface upon which the autonomous vehicle 602 moves. The processing element 604 is capable of instructing the movement mechanism 612 to change the height of the proximity sensor 614. The proximity sensor 614 is configured to send signals to the processing element 604 indicative of distances from the proximity sensor 614 to objects in an environment. In one embodiment, after the processing element 604 identifies a proximity sensor height for the location of the autonomous vehicle 602, the processing element 604 controls the height of the proximity sensor 614 by instructing the movement mechanism 612 to change the height of the proximity sensor 614 based on the proximity sensor height.
The autonomous vehicle 602 also includes operation elements 616. The operation elements 616 are configured to control operation of the autonomous vehicle 602 within the environment. Examples of operation of the autonomous vehicle 602 include a position of the autonomous vehicle 602, an orientation of the autonomous vehicle 602, a speed of the autonomous vehicle 602, an acceleration of the autonomous vehicle 602, a floor cleaning by the autonomous vehicle 602, and the like. The operation elements 616 are communicatively coupled to the processing element 604. In one embodiment, after the processing element 604 receives a signal indicative of a distance from the proximity sensor 614 to objects in an environment, the processing element 604 controls one or more of the operation elements 616 based on the signal indicative of the distance from the proximity sensor 614 to objects in the environment. For example, the processing element 604 may control the operation elements 616 to affect the direction and/or the speed of the autonomous vehicle 602 in the environment.
In the embodiment depicted in
As depicted in
In the system depicted in
Depicted in
At block 704, a proximity sensor height is determined based on the location of the autonomous vehicle. In some embodiments, the autonomous vehicle determines the proximity sensor height by identifying the proximity sensor height in a memory, such as a lookup table that includes various proximity sensor heights for different locations of the autonomous vehicle. In some embodiments, the autonomous vehicle determines the proximity sensor height based on sensor readings of the environment, such as determining a particular proximity sensor height based on a three-dimensional scan of the environment. In some embodiments, a remote computing device determines the proximity sensor height by identifying the proximity sensor height in a memory or by determining a particular proximity sensor height based on a three-dimensional map of the environment. In some embodiments, the autonomous vehicle is in a location that does not have a pre-associated proximity sensor height. In some examples, the autonomous vehicle determines the proximity sensor height using sensor readings from an on-board sensor, such as by moving the movable proximity sensor through a range of the possible heights of the movable proximity sensor and selecting one of the possible heights as the proximity sensor height based on the readings of the movable proximity sensor through the range of possible heights. In some examples, the location that does not have a pre-associated proximity sensor height may be an unknown location or an unmapped location.
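The sweep through possible heights described above can be sketched as follows. The selection heuristic (prefer the height with the nearest valid return) and the example readings are assumptions made only for illustration; the disclosure does not prescribe a particular selection rule:

```python
# Sweep the movable proximity sensor through candidate heights and select one
# based on its readings. Heuristic and readings are invented for illustration.
def sweep_and_select(read_distance_at, candidate_heights):
    """Return the height with the nearest valid (non-None) reading."""
    best = None  # (height, distance) of the best reading so far
    for h in candidate_heights:
        d = read_distance_at(h)
        if d is not None and (best is None or d < best[1]):
            best = (h, d)
    return best[0] if best else None

# Hypothetical readings: no return at 0.05 m, valid returns above it.
readings = {0.05: None, 0.15: 1.2, 0.25: 0.8, 0.35: 0.9}
print(sweep_and_select(readings.get, sorted(readings)))  # -> 0.25
```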
At block 706, a proximity sensor on the autonomous vehicle is positioned on the autonomous vehicle based on the proximity sensor height. In one example, the autonomous vehicle includes a movement mechanism configured to move the proximity sensor on the autonomous vehicle. In some cases, the movement mechanism is instructed by a processing element on the autonomous vehicle or a remote computing device to move the proximity sensor based on the proximity sensor height. In some embodiments, the proximity sensor is configured to be placed at one of a number of distinct sensor heights. In other embodiments, the proximity sensor is configured to be placed at one of any number of heights within a range of heights.
At block 708, a signal is received from the proximity sensor indicative of a distance to an object at the proximity sensor height. In some embodiments, the proximity sensor emits a field, such as electromagnetic energy or sound waves, and detects reflection of that field to determine a distance to the object. In some embodiments, the signal indicative of the distance to the object is received by a component in the autonomous vehicle (e.g., a processing element) and/or a remote computing device. In some embodiments, the distance to the object is the distance to an expected portion of the object (e.g., a portion of the produce island 526 above the bumper 542, a portion of the kickplate 546 below the individual shelves 544, etc.). In some cases, the autonomous vehicle and/or the remote computing device that receives the signal indicative of the distance to the object is configured to estimate a location of a different portion of the object (e.g., the end of the bumper 542, the end of the individual shelves 544, etc.) based on the signal indicative of the distance to the object.
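For a sensor that emits sound waves, the distance follows from the echo's round-trip time. This time-of-flight calculation is one common approach, shown here only as an illustration; the disclosure does not limit the sensor to any particular field type:

```python
# Time-of-flight distance for a sound-emitting proximity sensor
# (one common approach; not a limitation of the disclosure).
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def distance_from_echo(round_trip_s, speed_m_s=SPEED_OF_SOUND_M_S):
    # The emitted field travels out to the object and back, so halve the path.
    return speed_m_s * round_trip_s / 2.0

print(round(distance_from_echo(0.01), 3))  # -> 1.715 (meters)
```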
At block 710, one or more operations of the autonomous vehicle are controlled based on the signal from the proximity sensor indicative of the distance to the object. In some embodiments, the one or more operations of the autonomous vehicle include a position of the autonomous vehicle, an orientation of the autonomous vehicle, a speed of the autonomous vehicle, an acceleration of the autonomous vehicle, a type of floor cleaning performed by the autonomous vehicle, any other operation of the autonomous vehicle, or any combination thereof. In one embodiment, the signal from the proximity sensor is used for navigation guidance and the orientation and/or speed of the autonomous vehicle is controlled to maintain a particular route. In another embodiment, the signal from the proximity sensor is used for object avoidance and the orientation and/or speed of the autonomous vehicle is controlled to avoid an object in its path. In other embodiments any other operation of the autonomous vehicle is controlled based on the signal from the proximity sensor. In some embodiments, controlling the operation of the autonomous vehicle is performed by the autonomous vehicle and/or a remote computing device.
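Navigation guidance of the kind described can be as simple as a proportional correction toward a target offset from the object. The gain and setpoint below are invented example values, and the proportional rule itself is an assumption for illustration rather than the disclosed control scheme:

```python
# Proportional offset-following correction; gain and setpoint are invented.
def steering_correction(distance_m, setpoint_m=0.5, gain=2.0):
    """Positive output steers toward the object, negative steers away."""
    return gain * (distance_m - setpoint_m)

print(round(steering_correction(0.6), 3))  # -> 0.2  (too far: steer toward)
print(round(steering_correction(0.4), 3))  # -> -0.2 (too close: steer away)
```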
The embodiment of the method 700 is depicted as a series of steps performed in a particular order. It should be noted that, in other embodiments, some of the steps may be performed in a different order than the order presented in
The computing devices 820 are communicatively coupled to each other via one or more networks 830 and 832. Each of the networks 830 and 832 may include one or more wired or wireless networks (e.g., a 3G network, the Internet, an internal network, a proprietary network, a secured network). The computing devices 820 are capable of communicating with each other and/or any other computing devices via one or more wired or wireless networks. While the particular system 810 in
In the depicted embodiment, the computing device 8203 is communicatively coupled with a peripheral device 840 via the network 832. In the depicted embodiment, the peripheral device 840 is a scanner, such as a barcode scanner, an optical scanner, a computer vision device, and the like. In some embodiments, the network 832 is a wired network (e.g., a direct wired connection between the peripheral device 840 and the computing device 8203), a wireless network (e.g., a Bluetooth connection or a WiFi connection), or a combination of wired and wireless networks (e.g., a Bluetooth connection between the peripheral device 840 and a cradle of the peripheral device 840 and a wired connection between the peripheral device 840 and the computing device 8203). In some embodiments, the peripheral device 840 is itself a computing device (sometimes called a “smart” device). In other embodiments, the peripheral device 840 is not a computing device (sometimes called a “dumb” device).
Depicted in
In the depicted embodiment, the computing device 900 includes a processing element 905, memory 910, a user interface 915, and a communications interface 920. The processing element 905, the memory 910, the user interface 915, and the communications interface 920 are capable of communicating via a communication bus 925 by reading data from and/or writing data to the communication bus 925. The computing device 900 may include other components that are capable of communicating via the communication bus 925. In other embodiments, the computing device 900 does not include the communication bus 925 and the components of the computing device 900 are capable of communicating with each other in some other way.
The processing element 905 (also referred to as one or more processors, processing circuitry, and/or similar terms used herein) is capable of performing operations on some external data source. For example, the processing element may perform operations on data in the memory 910, data received via the user interface 915, and/or data received via the communications interface 920. As will be understood, the processing element 905 may be embodied in a number of different ways. In some embodiments, the processing element 905 includes one or more complex programmable logic devices (CPLDs), microprocessors, multi-core processors, co-processing entities, application-specific instruction-set processors (ASIPs), microcontrollers, controllers, integrated circuits, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), programmable logic arrays (PLAs), hardware accelerators, any other circuitry, or any combination thereof. The term circuitry may refer to an entirely hardware embodiment or a combination of hardware and computer program products. In some embodiments, the processing element 905 is configured for a particular use or configured to execute instructions stored in volatile or non-volatile media or otherwise accessible to the processing element 905. As such, whether configured by hardware or computer program products, or by a combination thereof, the processing element 905 may be capable of performing steps or operations when configured accordingly.
The memory 910 in the computing device 900 is configured to store data, computer-executable instructions, and/or any other information. In some embodiments, the memory 910 includes volatile memory (also referred to as volatile storage, volatile media, volatile memory circuitry, and the like), non-volatile memory (also referred to as non-volatile storage, non-volatile media, non-volatile memory circuitry, and the like), or some combination thereof.
In some embodiments, volatile memory includes one or more of random access memory (RAM), dynamic random access memory (DRAM), static random access memory (SRAM), fast page mode dynamic random access memory (FPM DRAM), extended data-out dynamic random access memory (EDO DRAM), synchronous dynamic random access memory (SDRAM), double data rate synchronous dynamic random access memory (DDR SDRAM), double data rate type two synchronous dynamic random access memory (DDR2 SDRAM), double data rate type three synchronous dynamic random access memory (DDR3 SDRAM), Rambus dynamic random access memory (RDRAM), Twin Transistor RAM (TTRAM), Thyristor RAM (T-RAM), Zero-capacitor (Z-RAM), Rambus in-line memory module (RIMM), dual in-line memory module (DIMM), single in-line memory module (SIMM), video random access memory (VRAM), cache memory (including various levels), flash memory, any other memory that requires power to store information, or any combination thereof.
In some embodiments, non-volatile memory includes one or more of hard disks, floppy disks, flexible disks, solid-state storage (SSS) (e.g., a solid state drive (SSD)), solid state cards (SSC), solid state modules (SSM), enterprise flash drives, magnetic tapes, any other non-transitory magnetic media, compact disc read only memory (CD ROM), compact disc-rewritable (CD-RW), digital versatile disc (DVD), Blu-ray disc (BD), any other non-transitory optical media, read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory (e.g., Serial, NAND, NOR, and/or the like), multimedia memory cards (MMC), secure digital (SD) memory cards, Memory Sticks, conductive-bridging random access memory (CBRAM), phase-change random access memory (PRAM), ferroelectric random-access memory (FeRAM), non-volatile random access memory (NVRAM), magneto-resistive random access memory (MRAM), resistive random-access memory (RRAM), Silicon Oxide-Nitride-Oxide-Silicon memory (SONOS), floating junction gate random access memory (FJG RAM), Millipede memory, racetrack memory, any other memory that does not require power to store information, or any combination thereof.
In some embodiments, memory 910 is capable of storing one or more of databases, database instances, database management systems, data, applications, programs, program modules, scripts, source code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, or any other information. The term database, database instance, database management system, and/or similar terms used herein may refer to a collection of records or data that is stored in a computer-readable storage medium using one or more database models, such as a hierarchical database model, network model, relational model, entity relationship model, object model, document model, semantic model, graph model, or any other model.
The user interface 915 of the computing device 900 is in communication with one or more input or output devices that are capable of receiving inputs into the computing device 900 and/or outputting outputs from the computing device 900. Embodiments of input devices include a keyboard, a mouse, a touchscreen display, a touch sensitive pad, a motion input device, a movement input device, an audio input, a pointing device input, a joystick input, a keypad input, the peripheral device 840, a foot switch, and the like. Embodiments of output devices include an audio output device, a video output, a display device, a motion output device, a movement output device, a printing device, and the like. In some embodiments, the user interface 915 includes hardware that is configured to communicate with one or more input devices and/or output devices via wired and/or wireless connections.
The communications interface 920 is capable of communicating with various computing devices and/or networks. In some embodiments, the communications interface 920 is capable of communicating data, content, and/or any other information that can be transmitted, received, operated on, processed, displayed, stored, and the like. Communication via the communications interface 920 may be executed using a wired data transmission protocol, such as fiber distributed data interface (FDDI), digital subscriber line (DSL), Ethernet, asynchronous transfer mode (ATM), frame relay, data over cable service interface specification (DOCSIS), or any other wired transmission protocol. Similarly, communication via the communications interface 920 may be executed using a wireless data transmission protocol, such as general packet radio service (GPRS), Universal Mobile Telecommunications System (UMTS), Code Division Multiple Access 2000 (CDMA2000), CDMA2000 1x (1xRTT), Wideband Code Division Multiple Access (WCDMA), Global System for Mobile Communications (GSM), Enhanced Data rates for GSM Evolution (EDGE), Time Division-Synchronous Code Division Multiple Access (TD-SCDMA), Long Term Evolution (LTE), Evolved Universal Terrestrial Radio Access Network (E-UTRAN), Evolution-Data Optimized (EVDO), High Speed Packet Access (HSPA), High-Speed Downlink Packet Access (HSDPA), IEEE 802.11 (WiFi), WiFi Direct, 802.16 (WiMAX), ultra-wideband (UWB), infrared (IR) protocols, near field communication (NFC) protocols, Wibree, Bluetooth protocols, wireless universal serial bus (USB) protocols, or any other wireless protocol.
As will be appreciated by those skilled in the art, one or more components of the computing device 900 may be located remotely from other components of the computing device 900, such as in a distributed system. Furthermore, one or more of the components may be combined, and additional components performing the functions described herein may be included in the computing device 900. Thus, the computing device 900 can be adapted to accommodate a variety of needs and circumstances. The depicted and described architectures are provided for exemplary purposes only and do not limit the various embodiments described herein.
Embodiments described herein may be implemented in various ways, including as computer program products that comprise articles of manufacture. A computer program product may include a non-transitory computer-readable storage medium storing applications, programs, program modules, scripts, source code, program code, object code, byte code, compiled code, interpreted code, machine code, executable instructions, and/or the like (also referred to herein as executable instructions, instructions for execution, computer program products, program code, and/or similar terms used herein interchangeably). Such non-transitory computer-readable storage media include all computer-readable media (including volatile and non-volatile media).
As should be appreciated, the various embodiments described herein may also be implemented as methods, apparatus, systems, computing devices, and the like. As such, embodiments described herein may take the form of an apparatus, system, computing device, and the like executing instructions stored on a computer-readable storage medium to perform certain steps or operations. Thus, embodiments described herein may be implemented entirely in hardware, entirely in a computer program product, or in an embodiment that comprises a combination of computer program products and hardware performing certain steps or operations.
Embodiments described herein may be described with reference to block diagrams and flowchart illustrations. Thus, it should be understood that blocks of the block diagrams and flowchart illustrations may be implemented in the form of a computer program product, in an entirely hardware embodiment, in a combination of hardware and computer program products, or in apparatus, systems, computing devices, and the like carrying out instructions, operations, or steps. Such instructions, operations, or steps may be stored on a computer-readable storage medium for execution by a processing element in a computing device. For example, retrieval, loading, and execution of code may be performed sequentially such that one instruction is retrieved, loaded, and executed at a time. In some exemplary embodiments, retrieval, loading, and/or execution may be performed in parallel such that multiple instructions are retrieved, loaded, and/or executed together. Thus, such embodiments can produce specifically configured machines performing the steps or operations specified in the block diagrams and flowchart illustrations. Accordingly, the block diagrams and flowchart illustrations support various combinations of embodiments for performing the specified instructions, operations, or steps.
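The sequential-versus-parallel distinction above can be sketched in a few lines. This sketch is illustrative only and forms no part of the disclosed embodiments; Python and its standard `concurrent.futures` module are assumptions chosen for brevity, and `step` is a hypothetical stand-in for one retrieved-and-executed operation:

```python
from concurrent.futures import ThreadPoolExecutor

def step(x):
    # Hypothetical stand-in for a single instruction or operation.
    return x * x

inputs = [1, 2, 3, 4]

# Sequential execution: one operation is retrieved, loaded, and
# executed at a time, in order.
sequential_results = [step(x) for x in inputs]

# Parallel execution: multiple operations are submitted together;
# pool.map still returns results in input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel_results = list(pool.map(step, inputs))
```

Either scheduling produces the same results here; the difference is only in how many operations are in flight at once, which is the point the paragraph above makes about specifically configured machines.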
For purposes of this disclosure, terminology such as “upper,” “lower,” “vertical,” “horizontal,” “inwardly,” “outwardly,” “inner,” “outer,” “front,” “rear,” and the like, should be construed as descriptive and not limiting the scope of the claimed subject matter. Further, the use of “including,” “comprising,” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. Unless limited otherwise, the terms “connected,” “coupled,” and “mounted” and variations thereof herein are used broadly and encompass direct and indirect connections, couplings, and mountings. Unless stated otherwise, the terms “substantially,” “approximately,” and the like are used to mean within 5% of a target value.
The principles, representative embodiments, and modes of operation of the present disclosure have been described in the foregoing description. However, aspects of the present disclosure which are intended to be protected are not to be construed as limited to the particular embodiments disclosed. Further, the embodiments described herein are to be regarded as illustrative rather than restrictive. It will be appreciated that variations and changes may be made by others, and equivalents employed, without departing from the spirit of the present disclosure. Accordingly, it is expressly intended that all such variations, changes, and equivalents fall within the spirit and scope of the present disclosure, as claimed.
Filing Document | Filing Date | Country | Kind
---|---|---|---
PCT/US2018/021698 | 3/9/2018 | WO | 00
Number | Date | Country
---|---|---
62469579 | Mar 2017 | US