This application claims the benefit of U.S. Provisional Application No. 62/883,629, filed Aug. 6, 2019, the disclosure of which is incorporated, in its entirety, by this reference.
The accompanying drawings illustrate a number of exemplary embodiments and are a part of the specification. Together with the following description, these drawings demonstrate and explain various principles of the present disclosure.
Throughout the drawings, identical reference characters and descriptions indicate similar, but not necessarily identical, elements. While the exemplary embodiments described herein are susceptible to various modifications and alternative forms, specific embodiments have been shown by way of example in the drawings and will be described in detail herein. However, the exemplary embodiments described herein are not intended to be limited to the particular forms disclosed. Rather, the present disclosure covers all modifications, equivalents, and alternatives falling within the scope of the appended claims.
The present disclosure is generally directed to apparatuses, systems, and methods for robotic datacenter monitoring. Datacenters may include and/or represent sites for housing numerous computing devices that store, process, and/or transmit data (e.g., digital data). The computing devices housed in datacenters may benefit from certain types of monitoring capable of uncovering unexpected needs and/or failures. In some examples, such monitoring may lead to the discovery of certain maintenance, replacement, and/or upgrading needs among the computing devices and/or their surrounding environments. Additionally or alternatively, such monitoring may lead to the discovery and/or detection of unexpected failures among the computing devices and/or their surrounding environments.
As will be described in greater detail below, by monitoring datacenters for such unexpected needs and/or failures, the various apparatuses, systems, and methods disclosed herein may be able to discover certain maintenance, replacement, and/or upgrading needs or certain device failures and/or concerns in advance or with minimal downtime. In one example, an unexpected temperature increase or electrical load increase may indicate that one or more computing devices have failed or may soon fail. In this example, the various apparatuses, systems, and methods disclosed herein may sense such an increase and then determine that one or more of those computing devices have failed or may soon fail based at least in part on that increase.
In another example, certain environmental constraints, such as temperature range and/or humidity range, may affect and/or improve computing operations and/or performance in datacenters. In this example, the various apparatuses, systems, and methods disclosed herein may sense a change in temperature and/or humidity and then perform one or more actions (e.g., notify an administrator and/or modify the temperature or humidity) in response to the sensed change.
The following will provide, with reference to
In some embodiments, robotic monitoring system 100 may include and/or be implemented with a subset (e.g., less than all) of the features, components, and/or subsystems illustrated in
Additionally or alternatively, although illustrated separately in
In some examples, mobility subsystem 102 may include and/or represent certain components that facilitate moving, driving, and/or steering robotic monitoring system 100 in and/or around a datacenter. Examples of such components include, without limitation, motors (such as direct current motors, alternating current motors, vibration motors, brushless motors, switched reluctance motors, synchronous motors, rotary motors, servo motors, coreless motors, stepper motors, and/or universal motors), axles, gears, drivetrains, wheels, treads, steering mechanisms, circuitry, electrical components, processing devices, memory devices, circuit boards, power sources, wiring, batteries, communication buses, combinations or variations of one or more of the same, and/or any other suitable components. In one example, one or more of these components may move, turn, and/or rotate to drive or implement locomotion for robotic monitoring system 100.
In some examples, mobility subsystem 102 may include and/or represent a computation assembly (including, e.g., at least one processor and associated computational elements, memory, and/or wireless or wired communication interfaces), a drivetrain (including, e.g., at least one motor and/or wheels), a navigation sensing assembly (including, e.g., a proximity sensor, an accelerometer, a gyroscope, and/or a location sensor), power systems (including, e.g., a power source, a power transmission element, a power supply element, and/or a charging element), and/or an emergency stop feature (e.g., a brake).
In some examples, sensors 104(1)-(N) may facilitate and/or perform various sensing, detection, and/or identification functions for robotic monitoring system 100. Examples of sensors 104(1)-(N) include, without limitation, active or passive radio-frequency identification sensors, real-time location systems, vision-based barcode scanners, ultra-wideband sensors, video cameras, computer or machine vision equipment, infrared cameras, audio microphones or sensors, pressure sensors, liquid sensors, three-dimensional ("3D") LiDAR sensors, air velocity sensors (3D speed and/or direction), high-resolution machine vision cameras, temperature sensors, humidity sensors, leak detectors, proximity sensors, time-of-flight depth sensors, single-beam or sweeping laser rangefinders, heat sensors, motion sensors, gyroscopes, combinations or variations of one or more of the same, and/or any other suitable sensors.
In some examples, payload subsystem 106 and/or user and payload interface subsystem 112 may include and/or represent certain components that support peripherals and/or sensing elements, such as sensors 104(1)-(N), on robotic monitoring system 100. Examples of such components include, without limitation, circuitry, electrical components, processing devices, circuit boards, user interfaces, input ports, input devices, wiring, communication buses, combinations or variations of one or more of the same, and/or any other suitable components. In one example, payload subsystem 106 and/or user and payload interface subsystem 112 may include a mast that supports peripherals and sensing elements and/or connects the same to robotic monitoring system 100. Such peripherals and/or sensing elements may be designed for datacenter and/or point-of-presence site (POP-site) applications.
In some examples, payload subsystem 106 and/or user and payload interface subsystem 112 may include video-calling hardware infrastructure that enables a remote user to participate in a video call with a local user at and/or via robotic monitoring system 100. Such a video call may enable the remote user to view and/or evaluate different regions of the datacenter and/or to communicate with the local user at or near robotic monitoring system 100 in the datacenter. In one embodiment, the mast may also support one or more flash elements and/or light sources positioned to illuminate certain features and/or targets within the datacenter and/or to improve image captures.
In some examples, computation and navigation subsystem 108 may include and/or represent components that facilitate and/or perform calculations, decision-making, navigation, issue detection, data storage or collection, output generation, transmission controls, security controls, and/or periphery or sensory controls. Examples of such components include, without limitation, circuitry, electrical components, processing devices, memory devices, circuit boards, wiring, communication buses, combinations or variations of one or more of the same, and/or any other suitable components. In one example, computation and navigation subsystem 108 may direct and/or control the functionality of one or more of the other features, components, and/or subsystems (e.g., mobility subsystem 102, transmission subsystem 110, rack dolly subsystem 114, robotic arm 116, etc.) illustrated in
In certain embodiments, one or more of modules 202 in
As illustrated in
As illustrated in
Returning to
In some examples, rack dolly subsystem 114 and/or robotic arm 116 may include and/or represent components that facilitate moving, replacing, and/or relocating hardware and/or devices in the datacenter. Examples of such components include, without limitation, actuators, motors, pins, rods, levers, shafts, arms, knobs, circuitry, electrical components, processing devices, memory devices, circuit boards, wiring, communication buses, combinations or variations of one or more of the same, and/or any other suitable components. In one example, rack dolly subsystem 114 and/or robotic arm 116 may grasp, hold, lift, and/or release hardware and/or devices in the datacenter.
As illustrated in
In some examples, robotic monitoring system 100 may represent and/or provide a platform designed for modularity. For example, mobility subsystem 102, computation and navigation subsystem 108, user and payload interface subsystem 112, and/or payload subsystem 106 may represent different modules capable of being assembled as and/or installed on robotic monitoring system 100. In this example, one or more of these modules may be omitted, excluded, and/or removed from robotic monitoring system 100 while the other modules remain intact as part of robotic monitoring system 100. Moreover, additional modules (not necessarily illustrated in
In some examples, exemplary robotic monitoring system 100 in
In some examples, network 304 may include and/or represent various network devices that form and/or establish communication paths and/or segments. For example, network 304 may include and/or represent one or more segment communication paths or channels. Although not necessarily illustrated in this way in
In some examples, and as will be described in greater detail below, one or more of modules 202 may cause datacenter monitoring system 300 to (1) deploy robotic monitoring systems 100(1)-(N) within a datacenter such that robotic monitoring systems 100(1)-(N) collect information about the datacenter via one or more of sensors 104(1)-(N) as robotic monitoring systems 100(1)-(N) move through the datacenter and transmit the information about the datacenter to data integration system 302, (2) analyze the information about the datacenter at data integration system 302, (3) identify at least one suspicious issue that needs attention within the datacenter based at least in part on the analysis of the information, and then (4) perform at least one action directed to addressing the suspicious issue in response to identifying the at least one suspicious issue.
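By way of illustration and not limitation, the following sketch outlines, in Python, one hypothetical way that steps (1) through (4) above could be orchestrated. The names used (e.g., Reading, DataIntegrationSystem, collect_readings) and the threshold values are assumptions introduced solely for this example and do not represent any particular implementation of modules 202.

```python
# Minimal sketch of the deploy/collect/analyze/act loop described above.
# All names and thresholds are hypothetical assumptions for illustration.
from dataclasses import dataclass, field


@dataclass
class Reading:
    robot_id: str
    location: tuple          # (x, y) position within the datacenter
    kind: str                # e.g., "temperature", "humidity", "leak"
    value: float


@dataclass
class DataIntegrationSystem:
    readings: list = field(default_factory=list)

    def ingest(self, new_readings):
        # Step (1): receive information transmitted by the robots.
        self.readings.extend(new_readings)

    def analyze(self):
        # Steps (2)-(3): flag suspicious issues; here, a simple threshold check.
        issues = []
        for r in self.readings:
            if r.kind == "temperature" and r.value > 35.0:   # assumed limit (C)
                issues.append(("overheating", r))
            if r.kind == "leak" and r.value > 0:
                issues.append(("leak", r))
        return issues

    def act(self, issues):
        # Step (4): perform an action directed to addressing each issue.
        for label, reading in issues:
            print(f"ALERT: {label} at {reading.location} reported by {reading.robot_id}")


def patrol(robots, integration_system):
    # Step (1): each deployed robot collects and transmits readings as it moves.
    for robot in robots:
        integration_system.ingest(robot.collect_readings())
    # Steps (2)-(4): analyze the aggregated data and address any issues found.
    integration_system.act(integration_system.analyze())
```

In this sketch, the patrol function performs one collection-and-analysis cycle; in practice, the logic of steps (2) and (3) may be considerably more sophisticated than a simple threshold comparison.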
Data integration system 302 generally represents any type or form of physical computing device or system capable of reading computer-executable instructions, integrating information collected across various robotic monitoring systems, and/or presenting the integrated information for consumption. Examples of data integration system 302 include, without limitation, servers, client devices, laptops, tablets, desktops, cellular phones, Personal Digital Assistants (PDAs), multimedia players, embedded systems, wearable devices, gaming consoles, network devices or interfaces, variations or combinations of one or more of the same, and/or any other suitable data integration systems.
Network 304 generally represents any medium or architecture capable of facilitating communication or data transfer. In some examples, network 304 may include other devices not illustrated in
As illustrated in
As illustrated in
In some examples, mobile data-collection robots 430(1)-(5) may sense data and/or information about datacenter 404 in a variety of different ways. For example, mobile data-collection robots 430(1)-(5) may read radio-frequency identification tags mounted to datacenter components 410(1)-(4) within datacenter 404 via one or more of sensors 104(1)-(N). In another example, mobile data-collection robots 430(1)-(5) may read certain types of barcodes mounted to datacenter components 410(1)-(4) within datacenter 404 via one or more of sensors 104(1)-(N). By doing so, mobile data-collection robots 430(1)-(5) may obtain and/or receive data and/or information conveyed and/or relayed by the radio-frequency identification tags and/or barcodes.
In a further example, mobile data-collection robots 430(1)-(5) may capture and/or record video and/or photographic images via one or more of sensors 104(1)-(N). In this example, mobile data-collection robots 430(1)-(5) may store these video and/or photographic images and/or process the same via computer or machine vision technology.
In some examples, mobile data-collection robots 430(1)-(5) may report, deliver, and/or transmit the data and/or information sensed within datacenter 404 to data integration system 302. Additionally or alternatively, mobile data-collection robots 430(1)-(5) may process and/or format all or portions of the data and/or information sensed within datacenter 404 prior to performing such transmissions. For example, mobile data-collection robots 430(1)-(5) may generate heat maps, spatial maps, and/or security-alert maps based at least in part on the data and/or information prior to transmitting the same to data integration system 302. In one example, the heat maps may represent and/or be based on temperatures and/or temperature variances detected at datacenter 404. In another example, the heat maps may represent and/or be based on wireless communication signal variances, such as WiFi or Long-Term Evolution (LTE) signal strengths and/or coverage, detected at datacenter 404.
In one example, data integration system 302 may gather, aggregate, and/or integrate the data and/or information as sensed across mobile data-collection robots 430(1)-(5). In this example, data integration system 302 may process and/or format all or portions of the data and/or information sensed by mobile data-collection robots 430(1)-(5). For example, data integration system 302 may generate heat maps, spatial maps, and/or security-alert maps based at least in part on the data and/or information received from mobile data-collection robots 430(1)-(5).
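As a non-limiting sketch of how such a heat map might be derived from aggregated temperature readings, the following example bins point readings into grid cells and averages them; the grid cell size, data layout, and threshold are assumptions made purely for illustration.

```python
# Sketch: aggregate point temperature readings into a grid-based heat map.
# The field layout and the 1-meter grid cell size are illustrative assumptions.
from collections import defaultdict


def build_heat_map(readings, cell_size=1.0):
    """readings: iterable of (x, y, temperature) tuples collected by the robots."""
    cells = defaultdict(list)
    for x, y, temp in readings:
        cell = (int(x // cell_size), int(y // cell_size))
        cells[cell].append(temp)
    # Average the readings that fall into each grid cell.
    return {cell: sum(temps) / len(temps) for cell, temps in cells.items()}


heat_map = build_heat_map([(0.2, 0.4, 24.5), (0.8, 0.1, 26.0), (3.1, 2.2, 31.5)])
hot_cells = {cell: t for cell, t in heat_map.items() if t > 30.0}  # assumed threshold
```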
In some examples, data integration system 302 may present and/or display at least some of the data and/or information to an administrator of datacenter 404 (via, e.g., a report and/or user interface). Additionally or alternatively, data integration system 302 may provide an administrator operating another computing device with remote access to at least some of the data and/or information.
In some examples, data integration system 302 and/or mobile data-collection robots 430(1)-(5) may notify an administrator of datacenter 404 about certain security, performance, and/or environmental issues based at least in part on the data and/or information. In one example, data integration system 302 may propagate and/or distribute the data and/or information sensed by mobile data-collection robots 430(1)-(5) to other computing devices associated with the same organization, service provider, and/or customer as the area of datacenter 404 at which the data and/or information was sensed.
In some examples, mobile data-collection robots 430(1)-(5) and/or data integration system 302 may perform certain actions in response to any suspicious issues and/or concerns detected within an area of datacenter 404. For example, one of mobile data-collection robots 430(1)-(5) and/or data integration system 302 may detect and/or discover an unsuitable temperature and/or humidity within a certain area of datacenter 404 based at least in part on information sensed in that area. In this example, one of mobile data-collection robots 430(1)-(5) and/or data integration system 302 may notify the responsible temperature and/or humidity controller of the unsuitable temperature and/or humidity. Additionally or alternatively, one of mobile data-collection robots 430(1)-(5) and/or data integration system 302 may direct and/or instruct the responsible temperature and/or humidity controller to modify the temperature and/or humidity within that area of datacenter 404 to correct and/or adjust the unsuitable temperature and/or humidity.
As another example, one of mobile data-collection robots 430(1)-(5) and/or data integration system 302 may detect and/or discover flooding and/or an unexpected leak within a certain area of datacenter 404 based at least in part on information sensed in that area. In this example, one of mobile data-collection robots 430(1)-(5) and/or data integration system 302 may notify the responsible fluid controller of the flooding and/or unexpected leak. Additionally or alternatively, one of mobile data-collection robots 430(1)-(5) and/or data integration system 302 may direct and/or instruct the responsible fluid controller to shut down and/or close the flow of fluid (e.g., water) to correct and/or fix the flooding or unexpected leak.
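The control flow described in the two preceding examples may be summarized by the following hypothetical sketch, in which sensed values are checked against assumed environmental limits and a hypothetical temperature/humidity or fluid controller interface is notified or instructed; the limit values and controller method names are illustrative assumptions rather than disclosed requirements.

```python
# Sketch: react to unsuitable temperature/humidity or a detected leak.
# Controller interfaces, method names, and limits below are hypothetical.
ASSUMED_LIMITS = {"temperature": (18.0, 27.0), "humidity": (40.0, 60.0)}


def handle_environment(area, readings, hvac_controller, fluid_controller, notify):
    temp = readings.get("temperature")
    humidity = readings.get("humidity")
    leak = readings.get("leak_detected", False)

    lo_t, hi_t = ASSUMED_LIMITS["temperature"]
    if temp is not None and not (lo_t <= temp <= hi_t):
        notify(f"Unsuitable temperature {temp} C in {area}")
        hvac_controller.set_target_temperature(area, (lo_t + hi_t) / 2)

    lo_h, hi_h = ASSUMED_LIMITS["humidity"]
    if humidity is not None and not (lo_h <= humidity <= hi_h):
        notify(f"Unsuitable humidity {humidity}% in {area}")
        hvac_controller.set_target_humidity(area, (lo_h + hi_h) / 2)

    if leak:
        notify(f"Leak detected in {area}")
        fluid_controller.shut_off(area)   # close the flow of fluid to the area
```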
In some examples, the data and/or information sensed by mobile data-collection robots 430(1)-(5) may touch and/or traverse various computing layers across datacenter 404. For example, the data and/or information sensed by mobile data-collection robots 430(1)-(5) may be integrated into the existing computing infrastructure within datacenter 404 and/or at another site associated with the corresponding organization, service provider, and/or customer. In one example, mobile data-collection robots 430(1)-(5) may collect data and/or information about datacenter 404 and then transfer the same to a backend device (e.g., data integration system 302). In this example, another device (not necessarily illustrated in
In some embodiments, one or more of radio-frequency identification tags 1222(1)-(10) may include and/or be coupled to active or passive temperature-sensing equipment. In one embodiment, radio-frequency identification tags 1222(1)-(10) may be configured and/or set to produce data representative of surface temperatures along datacenter components 410(1) and 410(2). Additionally or alternatively, radio-frequency identification tags 1222(1)-(10) may be configured and/or set to produce data representative of device temperatures along datacenter components 410(1) and 410(2).
In some embodiments, one or more of radio-frequency identification tags 1222(1)-(10) may be programmed and/or configured to provide identification information specific to a certain device incorporated in datacenter components 410(1) or 410(2). For example, radio-frequency identification tag 1222(1) may be programmed and/or configured with information specific to a server rack 1100 in
As a specific example, robotic monitoring system 100(1) may navigate through aisle 420(1) to read information from one or more of radio-frequency identification tags 1222(1)-(5) mounted to datacenter components 410(1). In this example, robotic monitoring system 100(1) may also navigate through aisle 420(2) to read information from one or more of radio-frequency identification tags 1222(6)-(10) mounted to datacenter components 410(2). In one embodiment, the information read from radio-frequency identification tags 1222(1)-(10) may indicate and/or identify current and/or historical temperatures measured at their respective sites and/or positions. In another embodiment, the information read from radio-frequency identification tags 1222(1)-(10) may indicate and/or identify current and/or historical temperatures of one or more electrical and/or computing components installed in server racks along aisles 420(1) and 420(2).
In an additional embodiment, the information read from radio-frequency identification tags 1222(1)-(10) may indicate and/or identify specific assets and/or resources installed and/or running in datacenter components 410(1) or 410(2) within datacenter 404. In one example, robotic monitoring system 100(1) may map and/or associate those assets and/or resources to specific locations and/or positions along datacenter components 410(1) or 410(2) within datacenter 404. In this example, robotic monitoring system 100(1) may transmit at least some of the information read from radio-frequency identification tags 1222(1)-(10) to data integration system 302. By doing so, robotic monitoring system 100(1) may facilitate tracking those assets and/or resources within datacenter 404.
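By way of illustration, a minimal sketch of associating radio-frequency identification tag reads with asset locations is provided below; the tag payload fields and helper names are hypothetical and chosen only for this example.

```python
# Sketch: map RFID tag reads to asset locations for tracking.
# Field names (asset_id, rack, position) are illustrative assumptions.
def build_asset_map(tag_reads):
    """tag_reads: iterable of dicts read while navigating an aisle, e.g.
    {"asset_id": "SRV-0042", "rack": "410(1)-03", "position": (12.5, 3.0)}"""
    asset_map = {}
    for read in tag_reads:
        asset_map[read["asset_id"]] = {
            "rack": read["rack"],
            "position": read["position"],
        }
    return asset_map


def report_assets(asset_map, transmit):
    # Transmit the asset-to-location mapping to the data integration system.
    for asset_id, info in asset_map.items():
        transmit({"asset_id": asset_id, **info})
```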
As another example, robotic monitoring system 100(1) may navigate through aisle 420(1) or 420(2) to capture video and/or image data representative of the corresponding environment via high-resolution cameras. In this example, robotic monitoring system 100(1) may feed that video and/or image data to a computer or machine vision application for processing. In various embodiments, robotic monitoring system 100(1) may implement and/or apply one or more artificial intelligence and/or machine learning models.
In some examples, robotic monitoring system 100(1) may implement one or more machine learning algorithms and/or models to facilitate the spatial mapping of datacenter 404 and/or the detection of potential security, performance, and/or environmental concerns. For example, robotic monitoring system 100(1) may be programmed and/or configured with a fully and/or partially constructed machine learning model (such as a convolutional neural network and/or a recurrent neural network). In one example, robotic monitoring system 100(1) may include and/or incorporate a storage device that stores the machine learning model. The machine learning model may be trained and/or constructed with training data that includes various samples of spatial mapping imagery and/or issue detection.
Some of these samples may represent and/or be indicative of certain image and/or video captures. These samples may constitute positive data for the purpose of training the machine learning model with respect to certain surroundings and/or features within datacenter 404. Other samples may represent and/or be indicative of other surroundings and/or features within datacenter 404. These other samples may constitute negative data for the purpose of training the machine learning model with respect to those certain surroundings and/or features within datacenter 404.
In some examples, one or more of these samples may be supplied and/or provided from other similar datacenters for the purpose of training the machine learning model for use in datacenter 404. Additionally or alternatively, one or more of these samples may be supplied and/or developed by robotic monitoring system 100(1) operating in datacenter 404. For example, robotic monitoring system 100(1) may calibrate and/or train the machine learning model implemented on robotic monitoring system 100(1) to recognize certain surroundings or features and/or to spatially map datacenter 404.
Upon training and/or calibrating the machine learning model, robotic monitoring system 100(1) may be able to classify and/or identify certain features captured and/or shown in subsequent video and/or images. For example, robotic monitoring system 100(1) may detect, via the machine learning model, a pattern indicative of certain surroundings and/or features within those videos and/or images. In this example, robotic monitoring system 100(1) and/or data integration system 302 may then use the detection of such surroundings and/or features to spatially map datacenter 404 and/or perform localization on the same.
As a specific example, the machine learning model may represent a convolutional neural network that includes various layers, such as one or more convolution layers, activation layers, pooling layers, and fully connected layers. In this example, robotic monitoring system 100(1) may pass video and/or image data through the convolutional neural network to classify and/or identify certain surroundings and/or features represented in the video and/or image data.
In the convolutional neural network, the video and/or image data may first encounter the convolution layer. At the convolution layer, the video and/or image data may be convolved using a filter and/or kernel. In particular, the convolution layer may cause computation and navigation subsystem 108 to slide a matrix function window over and/or across the video and/or image data. Computation and navigation subsystem 108 may then record the resulting data convolved by the filter and/or kernel. In one example, one or more nodes included in the filter and/or kernel may be weighted by a certain magnitude and/or value.
After completion of the convolution layer, the convolved representation of the video and/or image data may encounter the activation layer. At the activation layer, the convolved data in the video and/or image data may be subjected to a non-linear activation function. In one example, the activation layer may cause computation and navigation subsystem 108 to apply the non-linear activation function to the convolved data in the video and/or image data. By doing so, computation and navigation subsystem 108 may be able to identify and/or learn certain non-linear patterns, correlations, and/or relationships between different regions of the convolved data in the video and/or image data.
In some examples, computation and navigation subsystem 108 may apply one or more of these layers included in the convolutional neural network to the video and/or image data multiple times. As the video and/or image data completes all the layers, the convolutional neural network may render a classification for the video and/or image data. In one example, the classification may indicate that a certain feature captured in the video and/or image data is indicative of a known feature, device, and/or structure.
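As a concrete but purely illustrative example of a convolutional neural network with the layer types described above, the following PyTorch sketch stacks convolution, activation, pooling, and fully connected layers; the layer sizes, input resolution, and two-class output are assumptions made for this example and are not parameters required by this disclosure.

```python
# Illustrative sketch of a small convolutional neural network with the layer
# types described above (convolution, activation, pooling, fully connected).
# Layer sizes, input resolution, and class count are assumed for the example.
import torch
import torch.nn as nn


class FeatureClassifier(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # convolution layer
            nn.ReLU(),                                    # non-linear activation layer
            nn.MaxPool2d(2),                              # pooling layer
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # layers may be applied repeatedly
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # fully connected layer

    def forward(self, x):
        x = self.features(x)                 # assumes 3x64x64 input frames
        x = torch.flatten(x, start_dim=1)
        return self.classifier(x)


model = FeatureClassifier()
logits = model(torch.randn(1, 3, 64, 64))    # one 64x64 RGB frame from the camera feed
predicted_class = logits.argmax(dim=1)       # classification rendered by the network
```

Under these assumptions, a frame captured by one of sensors 104(1)-(N) could be resized to the assumed input resolution and passed through the network to obtain a classification of the surroundings and/or features it depicts.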
In some examples, robotic monitoring systems 100(1)-(N) may implement cross-check security features to authenticate the identities of personnel within datacenter 404. For example, robotic monitoring system 100(1) may encounter personnel wandering the aisles of datacenter 404. In this example, robotic monitoring system 100(1) may obtain identification credentials (e.g., name, employee number, department, job title, etc.) from a badge and/or radio-frequency identification tag worn by the personnel via one or more of sensors 104(1)-(N).
Continuing with this example, robotic monitoring system 100(1) may obtain image data (e.g., video and/or still photography) of the personnel detected within datacenter 404. In one example, robotic monitoring system 100(1) may receive and/or access existing photographic images of the personnel from an employee identification database. Additionally or alternatively, computation and navigation subsystem 108 may include a facial recognition interface that obtains image data that is captured of the personnel during the encounter. In this example, computation and navigation subsystem 108 may determine any suspected identities of the personnel based at least in part on the image data captured during the encounter.
In one example, computation and navigation subsystem 108 may include a security interface that compares the identification credentials obtained from the personnel to the suspected identities of the personnel. In this example, the security interface may determine whether the identification credentials from the personnel match and/or correspond to the suspected identities of the personnel. On the one hand, if the identification credentials match the suspected identity of the person encountered in datacenter 404, robotic monitoring system 100(1) may effectively confirm that the person is represented correctly and/or accurately by his or her identification credentials, thereby authenticating his or her identity. On the other hand, if the identification credentials do not match the suspected identity of the person encountered in datacenter 404, robotic monitoring system 100(1) may effectively confirm that the person is potentially misrepresenting himself or herself by the identification credentials worn while wandering datacenter 404. This potential misrepresentation may constitute and/or amount to a security concern that needs attention from an administrator.
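The cross-check just described may be summarized by the following hypothetical sketch, in which badge credentials are compared against the identity suspected from facial recognition; the interface and field names are assumptions made solely for illustration.

```python
# Sketch: compare badge credentials with the identity suspected from facial
# recognition. The interface and field names are hypothetical.
def authenticate_person(badge_credentials, facial_recognition, image_data,
                        report_security_concern):
    suspected_identity = facial_recognition.identify(image_data)   # e.g., an employee number
    if suspected_identity is not None and \
            badge_credentials.get("employee_number") == suspected_identity:
        return True    # identity confirmed by both the badge and facial recognition
    # Mismatch: possible misrepresentation, escalate to an administrator.
    report_security_concern({
        "badge": badge_credentials,
        "suspected_identity": suspected_identity,
    })
    return False
```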
In some examples, robotic monitoring systems 100(1)-(N) and/or data integration system 302 may identify and/or determine high foot-traffic areas within datacenter 404. In one example, one or more of robotic monitoring systems 100(1)-(N) may be deployed to those high foot-traffic areas at less busy times (e.g., once the level of foot traffic decreases) for the purpose of sanitizing those areas with ultraviolet light and/or acoustic vibration generators. By doing so, one or more of robotic monitoring systems 100(1)-(N) may be able to mitigate the risk of viral spreading within those areas.
In some examples, field-replaceable units 1102(1)-(3) may each constitute and/or represent a modular device that includes one or more ports and/or interfaces for carrying and/or forwarding network traffic. Examples of field-replaceable units 1102(1)-(3) include, without limitation, PICs, FPCs, SIBs, linecards, control boards, routing engines, communication ports, fan trays, connector interface panels, servers, network devices or interfaces, routers, optical modules, service modules, rackmount computers, portions of one or more of the same, combinations or variations of one or more of the same, and/or any other suitable FRUs.
As illustrated in
At step 1320 in
At step 1330 in
At step 1340 in
As described above in connection with
In some examples, the mobility subsystem and the computation and navigation subsystem of the robotic system may be a core unit of the robotic system. The mobility subsystem and the computation and navigation subsystem may include a computation assembly (including, e.g., at least one processor and associated computational elements, memory, and/or a communication element, such as a wireless or a wired communication element, etc.), a drivetrain (including, e.g., at least one motor, and/or wheels, etc.), a navigation sensing assembly (including, e.g., a proximity sensor, an accelerometer, a gyroscope, and/or a location sensor, etc.), power systems (including, e.g., a power source, such as a battery, a power transmission element, a power supply element, and/or a charging element, etc.), and/or an emergency stop element (e.g., a brake).
In some examples, the user and payload interface subsystem and the payload subsystem may include a peripherals and sensing mast. This mast may be configured to support peripherals and sensing elements, such as for monitoring the datacenter. For example, the peripherals and sensing elements may be designed for datacenter and POP-site applications. Video-calling hardware infrastructure may also be included for a remote user to participate in a video call at the robotic system, to view the datacenter, and/or to communicate with a local user at or near the robotic system in the datacenter. The peripherals and sensing elements may also include one or more radio-frequency identification readers, such as to track assets (e.g., computing devices, infrastructure elements, etc.), to read information from radio-frequency identification badges, and/or to monitor temperature at radio-frequency identification tags positioned in the datacenter. Such radio-frequency identification tags are discussed further below. The peripherals and sensing elements may also include one or more cameras, such as high-definition cameras, for machine vision applications and/or for remote visual monitoring of the datacenter. Flash elements, such as custom flash bars, may be positioned on the mast to provide a light source to improve image captures.
In some examples, radio-frequency identification tags may be used to identify computing assets (e.g., servers, memory, processors, networking devices, etc.) and/or supporting infrastructure (e.g., racks, conduit, lighting, etc.). In some examples, temperature-sensing radio-frequency identification tags may be used to produce data corresponding to the temperature of an environment (e.g., air), surface, or device adjacent to the radio-frequency identification tag. For example, the radio-frequency identification tags and/or the robotic monitoring system may be configured to read hot aisle air temperature and/or cold aisle air temperature.
In some embodiments, a difference between intake air temperature and exhaust air temperature on servers may be measured. In addition, temperature-sensing radio-frequency identification tags may be employed to measure surface temperatures, such as on a busway to enable early detection of potential failures like arc flash failures. The robotic monitoring system may be configured to read identification data and/or temperature data from the radio-frequency identification tags. In the case of temperature-sensing, the radio-frequency identification tags may be positioned on or adjacent to devices or surfaces susceptible to overheating. Additionally or alternatively, the radio-frequency identification tags may provide an indication of part wear or failure in the form of heat. When an unexpected high temperature is sensed by a passing robotic monitoring device, a communication may be sent to maintenance personnel to check the area, device, or surface associated with the radio-frequency identification tag for potential maintenance or replacement.
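As a simple illustration of the intake/exhaust comparison described above, the following sketch flags servers whose temperature difference exceeds an assumed limit and notifies maintenance personnel; the tag field names and the example limit are assumptions rather than specified values.

```python
# Sketch: flag servers whose exhaust-minus-intake temperature difference is
# unexpectedly high. Tag field names and the limit are illustrative assumptions.
def check_delta_t(tag_readings, notify_maintenance, max_delta_c=20.0):
    """tag_readings: dict mapping server_id -> {"intake": C, "exhaust": C}."""
    for server_id, temps in tag_readings.items():
        delta = temps["exhaust"] - temps["intake"]
        if delta > max_delta_c:
            notify_maintenance(
                f"Server {server_id}: intake/exhaust delta of {delta:.1f} C "
                f"exceeds {max_delta_c} C; check for potential maintenance or replacement."
            )
```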
In some examples, active (e.g., electrically powered) radio-frequency identification tags may be employed in the datacenter and configured to provide information to the robotic monitoring system. For example, active radio-frequency identification tags may be positioned on or near machines that have moving parts, such as large intake and exhaust fans on cooling/heating equipment, to provide analytics and feedback regarding operation and/or potential failures of these machines. In addition, active radio-frequency identification tags may be able to actively broadcast information to the robotic monitoring system at a longer range than passive radio-frequency identification tags.
In some examples, the payload interface may be a base unit designed for modularity. The payload interface may include a “breadboard” mechanical design and/or an electrical interface having electrical outputs and communications interfaces (e.g., power, ethernet, universal serial bus (“USB”), a serial port, a video connection port, etc.). A mechanical interface may include an array of holes for mechanically connecting devices or objects to the payload interface and/or for the robotic monitoring system to carry the devices or objects. The devices or objects carried by the payload interface may, in some cases, include a computing device that necessitates a connection to the robotic monitoring system by the electrical interface.
Various specifications of the robotic monitoring system may be possible. In some examples, values for each of the specifications may be selected by one skilled in the art, depending on an expected application for the robotic monitoring system. Thus, the values of the specifications outlined below are intended as an example of a particular way in which the robotic monitoring system may be configured.
By way of example and not limitation, the robotic monitoring system may fit within an 18-inch by 22-inch cross-sectional area, such as to fit so-called POP and datacenter applications. In some embodiments, the base weight may be approximately 46 kg, and the mast portion of the robotic monitoring system may have a weight of approximately 14 kg. A top speed of the example robotic monitoring system may be about 2 m/s (e.g., with software limits in place to reduce the speed for safety and/or effectiveness) with an average operating speed of about 0.5 m/s.
In some embodiments, the robotic monitoring system may be configured to achieve autonomous navigation in known, mapped-out spaces. In some examples, the robotic monitoring system may be powered by an onboard 480 watt-hour battery, which may provide about 8 hours of runtime per full charge. The robotic monitoring system may be configured and/or programmed to return to a docking station, such as for storage and/or recharging of the power source.
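For context, the example power figures above are internally consistent: a 480 watt-hour battery sustaining roughly 8 hours of runtime implies an average draw of about 60 watts, as the short calculation below shows. Both figures are the example values given above rather than requirements.

```python
# Sanity check on the example power figures given above.
battery_capacity_wh = 480.0      # example onboard battery capacity (watt-hours)
runtime_hours = 8.0              # example runtime per full charge
average_draw_w = battery_capacity_wh / runtime_hours
print(average_draw_w)            # -> 60.0 watts of average power draw
```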
In some embodiments, the robotic monitoring system may be equipped for video calling, such as for a remote user to view a captured image at the robotic monitoring system's location and/or to display an image of the user at the robotic monitoring system, such as to communicate with a local user near the robotic monitoring system. For example, the robotic monitoring system may include at least one video camera, at least one display screen, at least one microphone, and/or at least one audio output device. The robotic monitoring system may also include computer vision systems and/or radio-frequency identification tracking elements, such as for asset tracking. In addition, the robotic monitoring system may include environmental sensing systems, such as to sense temperature, humidity, air pressure, etc.
In some examples, a datacenter may include temperature sensing elements on busways, hot aisle temperature profiling, and/or air temperature sensor arrays. Additionally or alternatively, the robotic monitoring system may include security features. For example, improved surveillance payloads (e.g., cameras movable along multiple axes, infrared cameras, etc.) may be included. In another example, the robotic monitoring system may include a leak detection system (e.g., a liquid-sensing system) to provide alerts in case of flooding or other liquid (e.g., water) leaks. By way of example and not limitation, humidity-sensing or moisture-sensing radio-frequency identification tags may be positioned in the datacenter under or near potential liquid sources (e.g., water pipes, coolant pipes, etc.). In some examples, the moisture-sensing (or other) radio-frequency identification tags may be positioned in locations that are out of a line-of-sight from aisles in the datacenter. The robotic monitoring system may read these radio-frequency identification tags when passing through a corresponding geographical area and may receive information regarding potential leaks.
In some examples, the robotic monitoring system may be capable of collecting a variety of data types. For example, the robotic monitoring system may include subsystems for collecting temperature data, generating heat maps, recording air flow data, monitoring air pressure, etc. In some examples, the robotic monitoring system may include elements configured for server rack movement. For example, a rack dolly system may be shaped, sized, and configured to lift a server rack and move the server rack to another location in a datacenter. The rack dolly system may include at least one lift mechanism and at least one roller element to lift the server racks and move the server racks to another location. The rack dolly system may improve safety and efficiency when moving racks relative to conventional (e.g., manual) methods. The rack dolly system may be used for deployments (e.g., installation), decommissions (e.g., removal), and shuffling of server racks within a datacenter.
In some examples, additional robotics concepts employed by the robotic monitoring system may include manipulation collaboration. For example, the robotic monitoring system may include and/or be used in conjunction with artificial intelligence and machine learning, such as to develop fundamental control algorithms for robust grasping and/or to develop computer vision improvements and protocols, etc. A framework for scalable systems (e.g., kinematic retargeting, sensor auto-recalibration, etc.) may be included in the robotic monitoring system. Such concepts may be applicable to infrastructure robotics efforts (e.g., to the robotic monitoring system for datacenters as disclosed herein).
In some examples, additional robotics concepts, such as hardware manipulation collaboration, may be implemented with the robotic monitoring system. For example, manipulation applications in manufacturing may be applicable to the robotic monitoring system. Hardware engineering and quality testing using robotic arms (e.g., network connectors) may be facilitated and/or controlled by the robotic monitoring system. Accordingly, during the production of datacenter infrastructure, the design and/or configuration may take into consideration robotic manipulation by the robotic monitoring system.
In some examples, the robotic monitoring system may also be configured for spatial computing mapping and localization. For example, spatial computing may be used to improve certain infrastructures. Three-dimensional (“3D”) mapping and localization may, in some examples, significantly improve the safety and/or reliability of robotic monitoring systems deployed in a datacenter. In addition, spatial computing mapping and localization may decrease the cost of sensor systems employed by the robotic monitoring systems, such as by providing mapping and localization data for robotic monitoring systems deployed in the datacenter. Robust and/or reliable data collection may be provided for experimentation with algorithms and/or other approaches. Such concepts may leverage mobile robots for client-side testing that addresses client-specific needs.
In some examples, spatial computing mapping and localization collaboration of the robotic monitoring system may be used in a number of applications, such as to map an area, to use computer vision to identify certain physical features in an area, and/or to provide augmented-reality mapping and direction systems.
In some examples, software specifications employed by or with the robotic monitoring system may include an application layer, a transport layer, a network layer, and/or a physical layer. For example, the application layer may include a graphical remote control user interface, future tools, etc. The transport layer may include software tools, web RTC, messenger, etc. The network layer may include software for connectivity, internal backend, etc. The physical layer may include software for wireless (e.g., WiFi, BLUETOOTH, etc.) connectivity, a modular sensor suite, etc.
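By way of illustration only, the layered software organization described above could be represented as a simple configuration structure such as the following; the specific entries are drawn from the description above and are not a prescribed configuration.

```python
# Illustrative representation of the software layers described above.
SOFTWARE_STACK = {
    "application": ["graphical remote control user interface", "future tools"],
    "transport":   ["software tools", "WebRTC", "messenger"],
    "network":     ["connectivity services", "internal backend"],
    "physical":    ["WiFi/Bluetooth connectivity", "modular sensor suite"],
}

for layer, components in SOFTWARE_STACK.items():
    print(f"{layer}: {', '.join(components)}")
```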
In some examples, the robotic monitoring system may integrate with existing infrastructure. For example, the robotic monitoring system may collect data and/or transfer the collected data to a backend system where the data is accessed and/or processed. In this example, data-driven decisions may be made based at least in part on the data analysis. Such decisions may include and/or necessitate gathering additional data by the robotic monitoring system.
In some examples, the robotic monitoring system may have a number of system capabilities, such as for navigation, environmental sensing, telecommunications, asset tracking, and/or manipulation. By way of example and not limitation, the robotic monitoring system may include navigation mechanisms such as LIDAR-based SLAM systems, vision-based docking systems, and/or cloud-based map storage. In environmental sensing, the robotic monitoring system may include humidity sensing, temperature sensing, pressure sensing, leak detection, etc. In telecommunications, the robotic monitoring system may include video calling, audio calling, auto pick-up, etc. In asset tracking, the robotic monitoring system may include a radio-frequency identification reader, a vision-based barcode scanner, asset infrastructure integrations, etc. In manipulation, the robotic monitoring system may include guided pose-to-pose object grasping.
In some examples, the system capabilities described above may be used in a variety of combinations with one another. For example, the LIDAR-based SLAM systems may be used for guided pose-to-pose object grasping, the temperature sensing may be accomplished using a radio-frequency identification tag reader, the video and audio calling may be used together, cloud-based map storage may be utilized in connection with auto pick-up and/or asset infrastructure integrations, and vision-based docking systems may be used in conjunction with a vision-based barcode scanner. Additional overlapping uses and systems may be employed by the robotic monitoring systems.
Example 1: A robotic monitoring system comprising (1) a mobility subsystem for moving the robotic monitoring system through a datacenter, (2) at least one sensor for sensing information about the datacenter as the robotic monitoring system moves through the datacenter, (3) a payload subsystem for mounting the at least one sensor to the robotic monitoring system, and/or (4) a computation and navigation subsystem for recording the information about the datacenter and controlling the mobility subsystem.
Example 2: The robotic monitoring system of Example 1, wherein the at least one sensor comprises at least one of (1) a radio-frequency identification sensor, (2) a video camera, (3) an infrared camera, (4) an audio microphone, (5) a pressure sensor, or (6) a liquid sensor.
Example 3: The robotic monitoring system of any of Examples 1 and 2, wherein the at least one sensor comprises a radio-frequency identification sensor configured to sense temperature information from one or more radio-frequency identification tags mounted in the datacenter.
Example 4: The robotic monitoring system of any of Examples 1-3, wherein the computation and navigation subsystem comprises a heat map generator configured to generate, based at least in part on temperatures identified within the temperature information, a heat map corresponding to at least a portion of the datacenter.
Example 5: The robotic monitoring system of any of Examples 1-4, wherein the at least one sensor comprises a radio-frequency identification sensor configured to sense asset-tracking information from one or more radio-frequency identification tags mounted in the datacenter.
Example 6: The robotic monitoring system of any of Examples 1-5, further comprising a transmission subsystem for transmitting the information about the datacenter to a data integration system configured to integrate sets of information about the datacenter as gathered by the robotic monitoring system and at least one additional robotic monitoring system while moving through the datacenter.
Example 7: The robotic monitoring system of any of Examples 1-6, wherein the payload subsystem is further configured for mounting, to the robotic monitoring system, at least one of (1) a light source, (2) an audio speaker, or (3) a display device.
Example 8: The robotic monitoring system of any of Examples 1-7, further comprising a user and payload interface subsystem that includes a mechanical interface for mounting an object to the robotic monitoring system.
Example 9: The robotic monitoring system of any of Examples 1-8, further comprising a user and payload interface subsystem that includes an electrical interface for providing at least one of electrical communications or electrical power to a device mounted to the robotic monitoring system.
Example 10: The robotic monitoring system of any of Examples 1-9, further comprising a rack dolly subsystem for moving at least one server rack from one location to another location within the datacenter.
Example 11: The robotic monitoring system of any of Examples 1-10, further comprising a robotic arm for modifying at least one hardware component located within the datacenter.
Example 12: The robotic monitoring system of any of Examples 1-11, wherein (1) the at least one sensor is further configured to obtain identification credentials from personnel detected within the datacenter and (2) the computation and navigation subsystem comprises (A) a facial recognition interface for (I) obtaining image data representative of the personnel detected within the datacenter and (II) determining suspected identities of the personnel detected within the datacenter based at least in part on the image data and (B) a security interface for (I) comparing the identification credentials obtained from the personnel to the suspected identities of the personnel and (II) determining, based at least in part on the comparison, whether the identification credentials from the personnel correspond to the suspected identities of the personnel.
Example 13: The robotic monitoring system of any of Examples 1-12, wherein the computation and navigation subsystem is further configured to (1) obtain the information about the datacenter from the at least one sensor and (2) detect at least one security event within the datacenter based at least in part on the information about the datacenter, and further comprising a transmission subsystem for transmitting a notification about the security event to one or more personnel at the datacenter.
Example 14: A datacenter monitoring system comprising (1) mobile data-collection robots deployed within a datacenter, wherein the mobile data-collection robots include (A) a mobility subsystem for moving the mobile data-collection robots through the datacenter, (B) at least one sensor for sensing information about the datacenter as the mobile data-collection robots move through the datacenter, (C) a payload subsystem for mounting the at least one sensor to the mobile data-collection robots, and (D) a computation and navigation subsystem for recording the information about the datacenter and controlling the mobility subsystem, and (2) a data integration system communicatively coupled to the mobile data-collection robots, wherein the data integration system is configured to integrate the information about the datacenter as collected by the mobile data-collection robots while moving through the datacenter.
Example 15: The datacenter monitoring system of Example 14, wherein the at least one sensor comprises at least one of (1) a radio-frequency identification sensor, (2) a video camera, (3) an infrared camera, (4) an audio microphone, (5) a pressure sensor, or (6) a liquid sensor.
Example 16: The datacenter monitoring system of any of Examples 14 and 15, wherein the at least one sensor comprises a radio-frequency identification sensor configured to sense temperature information from one or more radio-frequency identification tags mounted in the datacenter.
Example 17: The datacenter monitoring system of any of Examples 14-16, wherein the computation and navigation subsystem comprises a heat map generator configured to generate, based at least in part on temperatures identified within the temperature information, a heat map corresponding to at least a portion of the datacenter.
Example 18: The datacenter monitoring system of any of Examples 14-17, wherein the at least one sensor comprises a radio-frequency identification sensor configured to sense asset-tracking information from one or more radio-frequency identification tags mounted in the datacenter.
Example 19: The datacenter monitoring system of any of Examples 14-18, wherein the mobile data-collection robots further include a transmission subsystem for transmitting the information about the datacenter to the data integration system.
Example 20: A method comprising (1) deploying mobile data-collection robots within a datacenter such that the mobile data-collection robots (A) collect information about the datacenter via at least one sensor as the mobile data-collection robots move through the datacenter and (B) transmit the information about the datacenter to a data integration system, (2) analyzing the information about the datacenter at the data integration system, (3) identifying at least one suspicious issue that needs attention within the datacenter based at least in part on the analysis of the information, and then in response to identifying the at least one suspicious issue, (4) performing at least one action directed to addressing the at least one suspicious issue.
As detailed above, the computing devices and systems described and/or illustrated herein broadly represent any type or form of computing device or system capable of executing computer-readable instructions, such as those contained within the modules described herein. In their most basic configuration, these computing device(s) may each include at least one memory device and at least one physical processor.
In some examples, the term “memory device” generally refers to any type or form of volatile or non-volatile storage device or medium capable of storing data and/or computer-readable instructions. In one example, a memory device may store, load, and/or maintain one or more of the modules described herein. Examples of memory devices include, without limitation, Random Access Memory (RAM), Read Only Memory (ROM), flash memory, Hard Disk Drives (HDDs), Solid-State Drives (SSDs), optical disk drives, caches, variations or combinations of one or more of the same, or any other suitable storage memory.
In some examples, the term “physical processor” generally refers to any type or form of hardware-implemented processing unit capable of interpreting and/or executing computer-readable instructions. In one example, a physical processor may access and/or modify one or more modules stored in the above-described memory device. Examples of physical processors include, without limitation, microprocessors, microcontrollers, Central Processing Units (CPUs), Field-Programmable Gate Arrays (FPGAs) that implement softcore processors, Application-Specific Integrated Circuits (ASICs), portions of one or more of the same, variations or combinations of one or more of the same, or any other suitable physical processor.
Although illustrated as separate elements, the modules described and/or illustrated herein may represent portions of a single module or application. In addition, in certain embodiments one or more of these modules may represent one or more software applications or programs that, when executed by a computing device, may cause the computing device to perform one or more tasks. For example, one or more of the modules described and/or illustrated herein may represent modules stored and configured to run on one or more of the computing devices or systems described and/or illustrated herein. One or more of these modules may also represent all or portions of one or more special-purpose computers configured to perform one or more tasks.
In addition, one or more of the modules described herein may transform data, physical devices, and/or representations of physical devices from one form to another. One or more of the modules recited herein may transform a processor, volatile memory, non-volatile memory, and/or any other portion of a physical computing device from one form to another by executing on the computing device, storing data on the computing device, and/or otherwise interacting with the computing device.
In some embodiments, the term “computer-readable medium” generally refers to any form of device, carrier, or medium capable of storing or carrying computer-readable instructions. Examples of computer-readable media include, without limitation, transmission-type media, such as carrier waves, and non-transitory-type media, such as magnetic-storage media (e.g., hard disk drives, tape drives, and floppy disks), optical-storage media (e.g., Compact Disks (CDs), Digital Video Disks (DVDs), and BLU-RAY disks), electronic-storage media (e.g., solid-state drives and flash media), and other distribution systems.
The process parameters and sequence of the steps described and/or illustrated herein are given by way of example only and can be varied as desired. For example, while the steps illustrated and/or described herein may be shown or discussed in a particular order, these steps do not necessarily need to be performed in the order illustrated or discussed. The various exemplary methods described and/or illustrated herein may also omit one or more of the steps described or illustrated herein or include additional steps in addition to those disclosed.
The preceding description has been provided to enable others skilled in the art to best utilize various aspects of the exemplary embodiments disclosed herein. This exemplary description is not intended to be exhaustive or to be limited to any precise form disclosed. Many modifications and variations are possible without departing from the spirit and scope of the present disclosure. The embodiments disclosed herein should be considered in all respects illustrative and not restrictive. Reference should be made to the appended claims and their equivalents in determining the scope of the present disclosure.
Unless otherwise noted, the terms “connected to” and “coupled to” (and their derivatives), as used in the specification and claims, are to be construed as permitting both direct and indirect (i.e., via other elements or components) connection. In addition, the terms “a” or “an,” as used in the specification and claims, are to be construed as meaning “at least one of.” Finally, for ease of use, the terms “including” and “having” (and their derivatives), as used in the specification and claims, are interchangeable with and have the same meaning as the word “comprising.”
Related U.S. Application Data:
Parent: U.S. Provisional Application No. 62/883,629, filed Aug. 6, 2019 (US)
Child: U.S. Application No. 16/986,652 (US)