HEAD AND NECK ASSEMBLY OF A HUMANOID ROBOT

Information

  • Patent Application
  • Publication Number
    20250187202
  • Date Filed
    January 22, 2025
  • Date Published
    June 12, 2025
Abstract
A humanoid robot includes an upper region that includes a head and neck assembly having a neck portion and a head portion coupled to the neck portion. Said head portion includes: a frontal shell having a rear edge, a rear shell having a frontal edge, and an electronics assembly. The electronics assembly includes various components and devices used in the operation of the humanoid robot.
Description
TECHNICAL FIELD

This disclosure relates to a head of a robot, specifically a head of a humanoid robot. The head of the humanoid robot includes a plurality of components configured to provide the robot with the ability to communicate with nearby humans using a display that is protected by a frontal shell.


BACKGROUND

The current labor market within the United States is confronting an unprecedented labor shortage, characterized by over 10 million unfilled positions. A significant proportion of these vacancies pertain to occupations that are deemed unsafe, undesirable, or involve hazardous working conditions. This persistent and escalating shortage of available labor has created an urgent imperative for the development and deployment of advanced robotic systems capable of performing tasks that are unattractive or pose risks to human workers. To effectively address this widening gap in the workforce, it has become critical to design and engineer robots that can operate with high efficiency and reliability within human-centric environments. These environments often demand capabilities such as physical dexterity, sustained endurance, precise manipulation, and the ability to navigate complex spaces designed for humans.


Advanced general-purpose humanoid robots have emerged as a promising solution to meet these challenges. These robots are meticulously engineered to replicate the human form and emulate human functionality, typically featuring bipedal locomotion with two legs, bilateral manipulation abilities with two arms, and a display to facilitate interaction with human users. The anthropomorphic design enables these robots to seamlessly integrate into environments originally designed for humans, thereby minimizing the need for extensive modifications to existing infrastructures. As these robots endeavor to mimic the human body, it becomes essential to equip them with a head design that not only meets functional requirements but also enhances aesthetic appeal and durability. The head is a critical component for human-robot interaction, serving as the primary interface through which the robot communicates and engages with nearby humans. A well-designed head can significantly improve the robot's ability to convey information, express intentions, and respond to human cues, thereby fostering a more intuitive and natural interaction experience.


To meet these requirements, the present disclosure introduces an innovative head design that incorporates a versatile display system. This display is capable of adapting its visual output to suit a wide range of operational tasks by rendering icons, graphics, expressive animations, and informative text. The adaptability of the display allows the robot to present contextually relevant information and provide visual feedback, all of which enhance the robot's ability to interact effectively with human users. By making the robot's appearance more relatable and intuitive, the display fosters improved engagement and facilitates smoother human-robot collaboration.


Considering the sensitive and fragile nature of display technologies, and acknowledging the often challenging and harsh environments in which humanoid robots are deployed, it is advantageous to position the display behind a protective shield. This strategic placement serves multiple purposes. Firstly, the shield safeguards the display from potential contaminants such as dust, moisture, chemicals, and particulate matter that could adversely affect its performance and longevity. Secondly, the shield provides protection against physical impacts, vibrations, and mechanical stresses that may occur during operation, especially in industrial or outdoor settings. By mitigating the risks of damage to the display, the shield contributes to the overall robustness and reliability of the robot. Moreover, the integration of the display behind a shield contributes to a sleek and futuristic aesthetic, enhancing the robot's visual appeal.


In summary, the disclosed head design addresses the critical need for a durable, adaptable, and aesthetically pleasing interface for a general-purpose humanoid robot. By combining a versatile display with the frontal shell, the design ensures that the robot can effectively communicate and interact with humans while withstanding the rigors of diverse operational environments. This innovation not only enhances the functionality and user experience but also extends the operational lifespan of the robot, thereby providing a more sustainable and cost-effective solution for addressing the current labor market challenges.


SUMMARY

A need exists for a humanoid robot with an upper region including: (i) a torso, (ii) a pair of arm assemblies coupled to the torso, and (iii) a head and neck assembly coupled to the torso and having a neck portion and a head portion coupled to the neck portion. Said head portion includes: a frontal shell having a rear edge, a rear shell having a frontal edge, and an illumination assembly. The illumination assembly is configured to illuminate a region that: (i) extends between an extent of the rear edge of the frontal shell and an extent of the frontal edge of the rear shell, (ii) is positioned adjacent to the extent of the rear edge of the frontal shell, and (iii) is positioned adjacent to the extent of the frontal edge of the rear shell. The humanoid robot also includes: (i) a central region coupled to the upper region, and (ii) a lower region coupled to the central region and spaced apart from the upper region, the lower region including a pair of legs.


There is also a need for a humanoid robot with an upper region including: (i) a torso, (ii) a pair of arm assemblies coupled to the torso, and (iii) a head and neck assembly coupled to the torso and having a neck portion and a head portion coupled to the neck portion. Said head portion includes a frontal shell having a curvilinear periphery, and an outer surface having a nasal region and an orbital region that is not recessed in comparison to said nasal region. The head portion also includes an illumination assembly configured to emit light in a location that is adjacent to the periphery of the frontal shell. Finally, the humanoid robot also includes: (i) a central region coupled to the upper region, and (ii) a lower region coupled to the central region and spaced apart from the upper region, the lower region including a pair of legs.





BRIEF DESCRIPTION OF THE DRAWINGS

The drawing figures depict one or more implementations in accord with the present teachings, by way of example only, not by way of limitation. In the figures, like reference numerals refer to the same or similar elements shown across various other figures.



FIG. 1 is a front perspective view of a humanoid robot in an upright, standing position P1 and including: (i) an upper region having the following parts: (a) a head and neck assembly, (b) a torso, (c) left and right shoulders, and (d) left and right arm assemblies each including: (e) a humerus, (f) a forearm, (g) a wrist, and (h) a hand; (ii) a lower region having left and right leg assemblies each including: (a) a thigh, (b) a knee, (c) a shin, (d) an ankle, and (e) a foot; and (iii) a central region connecting the upper region and the lower region to one another and configured to allow movement of the upper and lower regions relative to one another;



FIG. 2 is a perspective view of the head and neck assembly of the humanoid robot of FIG. 1 showing the head and neck assembly including: (i) a head portion having: (a) a head housing assembly with a frontal shell and a rear shell, (b) a head electronics assembly with an illumination assembly having at least one light emitter configured to emit light between an extent of the frontal shell and rear shell, and (ii) a neck portion having: (a) a neck housing assembly with a cover, and (b) a neck electronics assembly;



FIG. 3 is a top view of the head and neck assembly of FIG. 1 showing portions of the head electronics assembly including three cameras located within a head space formed by the head housing assembly and each camera oriented to face in a different direction from one another to provide the humanoid robot with a viewing range of at least 180 degrees around the humanoid robot;



FIG. 4 is a front view of the head and neck assembly of FIG. 3 showing the head housing assembly including a rear shell and a frontal shell coupled to the rear shell, and showing a plurality of light emitters of the illumination assembly located at a junction between the frontal shell and the rear shell and the cameras located behind the frontal shell showing: (i) a first camera facing in a forward direction through the frontal shell, (ii) a second camera facing in a first lateral direction through the frontal shell, and (iii) a third camera facing in an opposed second lateral direction through the frontal shell;



FIG. 5A is a side view of the head and neck assembly of FIG. 3 showing the first and second cameras within the head space and facing directions orthogonal to one another;



FIG. 5B is a side view of the head and neck assembly of FIG. 3 showing various anatomical regions of the head;



FIG. 6 is a rear view of the head and neck assembly of FIG. 3;



FIG. 7A is a partial perspective view of the head and neck assembly of FIG. 3 showing a first set of information displayed on and/or through the frontal shell;



FIG. 7B is a front view of the head of the robot showing an icon indicating a status on and/or through the frontal shell;



FIG. 7C is a front view of the head of the robot showing another icon indicating an alert or system failure event on and/or through the frontal shell;



FIG. 7D is a front view of the head of the robot showing an icon indicating a particular mode of the robot on and/or through the frontal shell;



FIG. 8 is a perspective view of a second embodiment of a humanoid robot in an upright standing position P1 with outer covers of the robot removed to expose various actuators and components included in the robot;



FIG. 9 is a front view of the head and neck assembly of FIG. 8 showing the electronics assembly including: (i) a display located behind the frontal shell, (ii) a plurality of light emitters of the illumination assembly that are positioned to emit light on the lateral sides of the head and adjacent to a rear edge of the frontal shell, and (iii) a sensor assembly having at least one of the cameras mounted to an electronics support included in the head housing assembly and located above the display;



FIG. 10A is a side view of the head and neck assembly of FIG. 9;



FIG. 10B is a side view of the head and neck assembly of FIG. 9 showing various anatomical regions of the head and a field of view of the camera included in the sensor assembly;



FIG. 11 is an exploded assembly view of the head and neck assembly of FIG. 9;



FIG. 12 is a top view of the head and neck assembly of FIG. 9, showing the sensor assembly including three cameras having overlapping fields of view oriented in a forward direction of the robot;



FIG. 13 is a perspective view of the display included in the head and neck assembly of FIG. 9;



FIG. 14 is a side view of the display included in the head and neck assembly of FIG. 9;



FIG. 15 is a top view of the display included in the head and neck assembly of FIG. 9;



FIG. 16 is a rear perspective view of a second embodiment of a head and neck assembly of the humanoid robot of FIG. 1;



FIG. 17 is a front perspective view of the head and neck assembly of FIG. 16, wherein the head assembly further includes an upper shell with an upper recessed sensor zone;



FIG. 18 is a front view of the head and neck assembly of FIG. 16;



FIG. 19 is a side view of the head and neck assembly of FIG. 16;



FIG. 20 is a top view of the head and neck assembly of FIG. 16 showing various fields of view of cameras included in the electronics assembly including three cameras facing in a forward direction of the robot and a single camera facing in a rearward direction of the robot;



FIG. 21 is a side view of the head and neck assembly of FIG. 16 showing various anatomical regions of the head and the field of view of the cameras located relative to the anatomical regions of the head;



FIG. 22 is a perspective view of a humanoid robot including a third embodiment of a head and neck assembly of the humanoid robot, wherein the head and neck assembly includes a deformable mesh covering internal structures and components of the neck assembly;



FIG. 23 is a lower perspective view of the third embodiment of the head and neck assembly of FIG. 22, wherein the head and neck assembly includes: (i) a head portion having: (a) a head housing assembly with a frontal shell, a rear shell, and an upper shell with an upper recessed sensor zone, (b) a head electronics assembly with an illumination assembly having at least one light emitter configured to emit light between an extent of the frontal shell and rear shell, and (ii) a neck portion having: (a) a neck housing assembly with a deformable cover and a lower recessed sensor zone, and (b) a neck electronics assembly;



FIG. 24 is a bottom perspective view of the head and neck assembly of FIG. 22;



FIG. 25 is a front view of the head and neck assembly of FIG. 22;



FIG. 26 is a side view of the head and neck assembly of FIG. 22;



FIG. 27 is a side view of the head and neck assembly of FIG. 22 showing various anatomical regions of the head;



FIG. 28 is a cross section of the head and neck assembly of FIG. 22 showing various compartments in the head housing portions of the electronics assembly;



FIG. 29 is another cross section of the head and neck assembly of FIG. 22;



FIG. 30 is an exploded assembly view of the head and neck assembly of FIG. 22;



FIG. 31 is a perspective view of the frontal shell of the head and neck assembly of FIG. 22, wherein the frontal shell includes: (i) a main body, (ii) a plurality of wings that extend upward from the main body, and (iii) an outer periphery with a rear edge having a plurality of recesses formed therein;



FIG. 32 is a side view of the frontal shell of the head and neck assembly of FIG. 22 showing various dimensions of the frontal shell;



FIG. 33 is a cross-sectional view of the frontal shell of FIG. 22 showing three different radii of curvature;



FIG. 34 is a cross-sectional view of the frontal shell of FIG. 22 showing three different radii of curvature;



FIG. 35 is a front view of the frontal shell of the head and neck assembly of FIG. 22;



FIG. 36 is a cross-sectional view of the frontal shell of FIG. 22 showing three different radii of curvature;



FIG. 37 is a cross-sectional view of the frontal shell of FIG. 22 showing three different radii of curvature;



FIG. 38 is a front view of the head and neck assembly of FIG. 22 with a cover of the neck assembly removed;



FIG. 39 is a cross section of the head and neck assembly of FIG. 38 taken along line 39-39 showing the electronics assembly including a sensor assembly having two cameras arranged to face in a forward direction of the robot;



FIG. 40 is a perspective view of the head and neck assembly of FIG. 22 with the head housing assembly removed to show the electronics assembly;



FIG. 41 is a front view of the head and neck assembly of FIG. 22 with the head housing assembly removed to show the electronics assembly;



FIG. 42 is a perspective view of the cameras included in the electronics assembly of the head and neck assembly of FIG. 22 showing the orientation of upper cameras relative to lower cameras;



FIG. 43 is a side view of the cameras of FIG. 42 showing distances and arc angles between the upper cameras and the lower cameras;



FIG. 44 is a side view of the head and neck assembly of FIG. 22 showing orientations and fields of view of the upper and lower cameras included in the electronics assembly;



FIG. 45 is an enlarged view of a portion of the head and neck assembly of FIG. 44 showing one of the illumination assemblies;



FIG. 46 is a cross section taken along line 46-46 in FIG. 38 showing the illumination assemblies mounted to lateral sides of the head to reside between the frontal shell and the rear shell;



FIG. 47 is an enlarged view of a portion of the head and neck assembly of FIG. 46 showing one of the illumination assemblies in detail;



FIG. 48 is a perspective view of the illumination assemblies;



FIG. 49 is a perspective view of a pair of illumination assemblies;



FIG. 50 is a perspective view of a third embodiment of a head and neck assembly, wherein the frontal shell is omitted from the housing assembly, the electronics frame further includes a plurality of support ribs, and the electronics assembly further includes a plurality of side displays coupled to the electronics frame;



FIG. 51 is a perspective view of a fourth embodiment of a head and neck assembly, wherein the illumination assembly includes a plurality of light emitters that are positioned adjacent to a majority of the rear edge of said frontal shield;



FIG. 52 is a perspective view of a fifth embodiment of a head and neck assembly, wherein the upper recessed sensor zone of the fifth embodiment has been omitted;



FIG. 53 is a side view of a sixth embodiment of a head and neck assembly wherein the rear shell has been omitted and the frontal shell encases a majority of the head, the head and neck assembly including a plurality of cameras contained within the head and facing outwardly through the frontal shell;



FIG. 54 is a top view of the head and neck assembly of FIG. 53 showing the cameras oriented relative to the head and showing a field of view of each camera;



FIG. 55 is a perspective view of a seventh embodiment of a head and neck assembly including rectangular-shaped illumination assemblies oriented perpendicular to a rear facing edge of the frontal shell and a perimeter band extending around the frontal shell between the rear facing edge of the frontal shell and the rear shell and having a perimeter illumination assembly contained therein;



FIG. 56 is a rear perspective view of the head and neck assembly of FIG. 55;



FIG. 57 is a perspective view of an eighth embodiment of a head and neck assembly including a frontal shell having a substantially constant width and extending from a chin region of the head to a crown region of the head, and showing a pair of illumination assemblies on each lateral side of the frontal shell;



FIG. 58 is a side view of a ninth embodiment of a head and neck assembly including a side electronics panel having a pair of illumination assemblies and a display;



FIG. 59 is a side view of a tenth embodiment of a head and neck assembly including a frontal shell extending over a majority of the head and a display located in a parotid region and/or a buccal region of the head;



FIG. 60 is a perspective view of the head and neck assembly of FIG. 59;



FIG. 61 is a side view of an eleventh embodiment of a head and neck assembly including an illumination assembly in a parotid and/or buccal region of the head;



FIG. 62 is a rear perspective view of the head and neck assembly of FIG. 61;



FIG. 63 is a perspective view of a twelfth embodiment of a head and neck assembly including a frontal shell, a rear shell having a side display, and a flexible top shell coupled to a rear end of the frontal shell and an upper end of the rear shell;



FIG. 64 is a side view of the head and neck assembly of FIG. 63;



FIG. 65A is a perspective view of a thirteenth embodiment of a head and neck assembly including an upper electronics assembly and a lower electronics assembly;



FIG. 65B is a front view of the head and neck assembly of FIG. 65A showing the upper and lower electronic assemblies including cameras;



FIG. 65C is a side view of the head and neck assembly of FIG. 65A showing the cameras of the upper electronic assembly facing in a forward direction and showing the cameras of the lower electronic assembly facing in a downward direction;



FIG. 66A is an enlarged view of a portion of a torso region of the humanoid robot of FIG. 1, wherein said torso includes a plurality of light emitters;



FIG. 66B is an enlarged view of a portion of a right thigh region of the humanoid robot of FIG. 1, wherein said thigh region includes at least one light emitter;



FIG. 66C is an enlarged view of a portion of a left thigh region of the humanoid robot of FIG. 1, wherein said thigh region includes at least one light emitter;



FIG. 67 is an enlarged view of a portion of the head and neck region of the humanoid robot of FIG. 6, wherein said neck includes at least one light emitter;



FIG. 68A is an enlarged view of a portion of a shoulder region of the humanoid robot of FIG. 22, wherein said shoulder region includes at least one light emitter;



FIG. 68B is an enlarged view of a portion of a wrist region of the humanoid robot of FIG. 22, wherein said wrist region includes at least one light emitter; and



FIG. 68C is an enlarged view of a portion of a knee region of the humanoid robot of FIG. 22, wherein said knee region includes at least one light emitter.





DETAILED DESCRIPTION

In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant teachings. However, it should be apparent to those skilled in the art that the present teachings may be practiced without such details. In other instances, well-known methods, procedures, components, and/or circuitry have been described at a relatively high level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure.


While this disclosure includes several embodiments in many different forms, there are shown in the drawings, and will herein be described in detail, embodiments with the understanding that the present disclosure is to be considered as an exemplification of the principles of the disclosed methods and systems, and is not intended to limit the broad aspects of the disclosed concepts to the embodiments illustrated. As will be realized, the disclosed methods and systems are capable of other and different configurations, and several details are capable of being modified, all without departing from the scope of the disclosed methods and systems. For example, one or more of the following embodiments, in part or whole, may be combined consistent with the disclosed methods and systems. As such, one or more steps from the flow charts or components in the Figures may be selectively omitted and/or combined consistent with the disclosed methods and systems. Additionally, one or more steps from the flow charts or the method of assembling the head and neck assembly may be performed in a different order. Accordingly, the drawings, flow charts, and detailed description are to be regarded as illustrative in nature, not restrictive or limiting.


A. Introduction

Unlike conventional robots and as described in greater detail below, the disclosed humanoid robot 100 includes an upper region 200 having a head and neck assembly 202. The head and neck assembly 202 is coupled to a torso 204 and has an overall shape that generally resembles a human head. As such, the head and neck assembly 202 includes a head portion (e.g., 202, 2202, etc.) that does not include large flat surfaces (e.g., opposed sides of a head) and is not in the shape of: (a) a cube, (b) a hexagonal prism, or (c) a pentagonal prism. Instead, almost all the surfaces of the head are curvilinear or have substantial curvilinear aspects or segments. However, as shown in the Figures, some embodiments of the head do include a recess with a small flat sensor cover or lens, which is recessed in a top portion of the head and is designed to decrease sensor signal distortion that may be caused if the sensor signals are required to travel through a curvilinear shell, cover, shield, or lens. Additionally, while the overall head shape is designed to be human-like, the disclosed head lacks pronounced human facial structures (cheeks, eye light emitter housings, a mouth, or other moving structures).


The frontal region of the head is covered by a large freeform frontal shell, frontal head shell, or frontal shield (e.g., 228, 2228, etc.), wherein the curvature of the frontal shell varies horizontally and laterally across the head. The freeform nature of the frontal shell causes it to be separate and distinct from the display(s) that is positioned behind the shield. This positional relationship allows the frontal shell to protect the display and electronics contained in the head from damage, which provides a substantial benefit over conventional robot heads that lack this feature. For example, certain tasks (e.g., moving and cutting sheet metal) that the robot may perform on the factory floor may damage or break a display that is not protected behind a shield. As shown in the Figures, the frontal shell does not extend over the entirety of the upper head shell or behind an ear region, nor does it extend into the rear region. However, the frontal shell extends to the chin region and, in some embodiments, includes a substantial opening or recess formed along the upper extent of the shield. The opening or recess formed along the upper edge allows for the inclusion of a small flat sensor cover or secondary lens. Due to the formation of this opening or recess, the shield includes two wing-shaped structures that extend upwards from a main body and surround lateral extents of the sensors positioned behind the small flat sensor cover.


Unlike conventional robot heads, the disclosed head includes a display (e.g., 300, 2300, etc.) that is preferably curved in a single direction, or at least one direction, and is positioned at an angle relative to the coronal plane and a horizontal reference plane. The curved nature of the display allows for the inclusion of a larger display with a larger surface area within the head, which increases the amount of information that can be displayed on the display. The larger display provides a benefit over conventional robot heads that lack this feature because those conventional robots must either forgo displaying as much information (without altering the size of the displayed information) or increase the size of their heads (which causes a number of other issues, including increased material costs and assembly costs). Additionally, being able to display more information on the disclosed display is beneficial because the disclosed robot does not include any other internal displays. Further, including only a single display within the robot is beneficial because it: (i) reduces space needed for displays, (ii) reduces battery usage of displays, and (iii) at least reduces, and typically eliminates, the inclusion of fragile components within the robot. The display may be configured to display robot status, sensor data, and/or other relevant information to nearby human beings. However, the display is not configured to display human-like facial features (eyes, nose, mouth, etc.) or expressions, but instead is designed to use generic blocks or shapes.
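As a rough, non-limiting illustration of why curving the display in a single direction increases usable display width within the same frontal envelope, the following worked example uses hypothetical values for the radius of curvature $r$ and subtended angle $\theta$; neither value is specified by this disclosure:

$$s = r\,\theta = 150\ \text{mm} \times \frac{\pi}{3} \approx 157\ \text{mm}, \qquad c = 2r\sin\!\frac{\theta}{2} = 2 \times 150\ \text{mm} \times \sin 30^{\circ} = 150\ \text{mm},$$

where $s$ is the arc length of a display curved with radius $r = 150$ mm over $\theta = 60^{\circ}$, and $c$ is the chord that a flat panel would span across the same opening. Under these assumed values, the singly curved panel provides roughly 5% more display width (and, at equal height, correspondingly more area) behind the same frontal shell.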


Unlike conventional robot heads, in some embodiments, the disclosed head may include two separate sensor assemblies (e.g., 2272, 3272, etc.). The first sensor assembly may be positioned within the upper shell or the robot's forehead region, while the second sensor assembly may be positioned within the neck assembly or the robot's chin region. The position of the first sensor assembly: (i) enables a larger display to be utilized within the head, and (ii) allows the robot to see into a bin that is placed on a high shelf or rack. Including the second sensor assembly enables the robot to look downward (e.g., to see what it is carrying or to look into a storage bin) without using the first sensor assembly. These are significantly beneficial features over conventional robots that lack a second sensor assembly because those conventional robots must bend and articulate their necks to a greater degree to obtain the data captured by the second sensor assembly. Also, neither sensor assembly in the disclosed head is positioned where a human's eyes would typically be located, above the crown of the head, or on either side of the robot's head. It should be understood that: (i) both the first and second sensor assemblies may be omitted, (ii) the first or upper sensor assembly may be omitted while the second or lower sensor assembly is retained, or (iii) the second or lower sensor assembly may be omitted while the first or upper sensor assembly is retained.
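For illustration only, the short Python sketch below quantifies the benefit described above: it computes how far the neck would have to pitch downward for a forward-facing upper camera to see a target below the head, versus a lower camera that is already tilted downward. The camera mounting pitch, vertical field of view, and target offsets are hypothetical values, not parameters of the disclosed robot.

```python
import math


def required_neck_pitch_deg(target_dx_m: float, target_dz_m: float,
                            camera_pitch_deg: float, vfov_deg: float) -> float:
    """Neck pitch (degrees, downward positive) needed for a head-mounted camera
    to see a target located target_dx_m ahead of and target_dz_m above the
    camera (negative = below). Returns 0.0 if the target already falls inside
    the camera's vertical field of view."""
    # Angle from the camera to the target, measured downward from horizontal.
    target_angle = math.degrees(math.atan2(-target_dz_m, target_dx_m))
    # Lowest ray covered by the camera, given its mounting pitch and half-FOV.
    lowest_covered = camera_pitch_deg + vfov_deg / 2.0
    return max(0.0, target_angle - lowest_covered)


if __name__ == "__main__":
    # Hypothetical geometry: a storage bin 0.5 m ahead of and 0.8 m below the head.
    dx, dz = 0.5, -0.8

    # Upper sensor assembly: forward-facing (0 deg pitch), 70 deg vertical FOV.
    print("upper camera needs",
          round(required_neck_pitch_deg(dx, dz, camera_pitch_deg=0.0, vfov_deg=70.0), 1),
          "deg of neck pitch")

    # Lower sensor assembly: already tilted 60 deg downward, same FOV.
    print("lower camera needs",
          round(required_neck_pitch_deg(dx, dz, camera_pitch_deg=60.0, vfov_deg=70.0), 1),
          "deg of neck pitch")
```

With these assumed values, the upper camera alone would require roughly 23 degrees of additional neck articulation to view the bin, while the downward-tilted lower camera requires none, which is the point the paragraph above makes qualitatively.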


The electronics assembly of the disclosed head portion may include an illumination assembly having at least one light emitter, and preferably a plurality of light emitting assemblies (e.g., 264, 2264, etc.) positioned adjacent to a rear edge of the frontal shell. The light emitters enable the robot 100 to communicate with humans without using the display that is disposed behind the frontal shell, wherein said light emitters act as or are configured to act as an indicator light. Typically, the light emitters (and in this configuration, indicator lights) can communicate information about the humanoid robot 100 to nearby humans by: (i) emitting light having different wavelengths, wherein said emitted light may be perceived by a nearby human as having a different color, and/or (ii) utilizing illumination sequences, durations, and/or brightness. For example, the indicator lights may be used to communicate the working state (e.g., yellow—600 nm), idle state (e.g., green—550 nm), charging state (e.g., blinking or white), error state (e.g., red—665 nm), thinking state (e.g., blue—470 nm), or other general states. This is beneficial because it can limit the information that needs to be displayed on the display and allows a human, robot, or machine to receive information from the robot when the human, robot, or machine is directly to one side of the robot (where the human, robot, or machine could not see the display). Also, the light emitters use less battery power than the display and may be able to relay information more quickly to the human, robot, or machine. Alternatively, the indicator lights can signal an operator to immediately take note of a more complex condition or information that is comprehensively displayed on the display to ensure that the operator properly assesses that complex condition or information for the humanoid robot 100. It should be understood that in other embodiments, the illumination assembly may: (i) emit a light that surrounds the periphery of the frontal shell, (ii) emit a light that surrounds the rear edge of the frontal shell, and/or (iii) include one or more emitters positioned in other robot parts (e.g., torso, knee, leg, arm, hand, etc.).
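For illustration only, the following minimal Python sketch shows one way the state-to-indicator mapping described above could be organized in control software. The RobotState enum, the RGB approximations of the stated wavelengths, and the blink rates are hypothetical and are not part of the disclosed hardware.

```python
from dataclasses import dataclass
from enum import Enum, auto


class RobotState(Enum):
    """Hypothetical operating states signaled by the indicator lights."""
    WORKING = auto()
    IDLE = auto()
    CHARGING = auto()
    ERROR = auto()
    THINKING = auto()


@dataclass(frozen=True)
class IndicatorPattern:
    rgb: tuple[int, int, int]   # approximate color for the stated wavelength
    blink_hz: float             # 0.0 means solid (no blinking)


# Mapping follows the example states in the description above:
# working ~600 nm (yellow), idle ~550 nm (green), charging (blinking white),
# error ~665 nm (red), thinking ~470 nm (blue).
STATE_PATTERNS = {
    RobotState.WORKING:  IndicatorPattern(rgb=(255, 200, 0),   blink_hz=0.0),
    RobotState.IDLE:     IndicatorPattern(rgb=(0, 255, 0),     blink_hz=0.0),
    RobotState.CHARGING: IndicatorPattern(rgb=(255, 255, 255), blink_hz=1.0),
    RobotState.ERROR:    IndicatorPattern(rgb=(255, 0, 0),     blink_hz=0.0),
    RobotState.THINKING: IndicatorPattern(rgb=(0, 80, 255),    blink_hz=0.0),
}


def pattern_for(state: RobotState) -> IndicatorPattern:
    """Return the indicator pattern associated with a given robot state."""
    return STATE_PATTERNS[state]


if __name__ == "__main__":
    p = pattern_for(RobotState.CHARGING)
    print(f"charging indicator: rgb={p.rgb}, blink={p.blink_hz} Hz")
```

A table-driven mapping such as this keeps indicator behavior in one place, so adding a new state or retargeting the pattern to emitters in other robot parts (torso, knee, wrist, etc.) would require only a new table entry.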


B. Robot Architecture

The humanoid robot 100 is designed to have substantial similarities in form factor and anatomy to human beings, including many of the same major appendages that human beings have. The humanoid robot 100 includes an upper region 200, a lower region 400 spaced apart from the upper region 200, and a central region 600 interconnecting the upper region 200 and the lower region 400. The humanoid robot 100 is shown in FIGS. 1 and 2 in an upright, standing position P1 where a pair of feet 412a, 412b of the lower region 400 are standing on a floor or ground surface G such that the lower region 400 supports the upper region 200 and the central region 600 above the floor.


The upper region 200 includes the following parts: (a) a head and neck assembly 202, (b) a torso 204, (c) left and right shoulders 206a, 206b, and (d) left and right arm assemblies 208a, 208b each including: (e) a humerus 210a, 210b, (f) a forearm 212a, 212b, (g) a wrist 214a, 214b, and (h) a hand 216a, 216b. The lower region 400 includes left and right leg assemblies 402a, 402b each including: (a) a thigh 404a, 404b, (b) a knee 406a, 406b, (c) a shin 408a, 408b, (d) an ankle 410a, 410b, and (e) a foot 412a, 412b. The central region 600 is located generally in, or provides, a pelvis region 601 of the humanoid robot 100. Each of the components of the upper region 200 and the lower region 400 noted above includes at least one actuator configured to move the components relative to one another. The central region 600 is also configured to allow movement of the upper and lower regions 200, 400 relative to one another in a three-dimensional manner.


C. Head and Neck Assembly

As shown in FIG. 1, the head and neck assembly 202 of the robot 100 extends from the torso 204 and is designed to: (i) provide the robot 100 with a humanoid shape, (ii) house and protect one or more electronic components (e.g., the display, light emitters, sensors, and/or the head actuators), and (iii) provide information to a nearby human using a display and/or indicator lights. As shown in FIGS. 7-30, the head and neck assembly 202 includes: (i) a head portion 202a, and (ii) a neck portion 202b coupled to a lower end or extent of said head portion 202a.


1. Exterior Head Shape

As shown in FIGS. 1-9, the head portion 202a of the robot 100 has one or more exterior surfaces 325, 329 that provide said head portion 202a with an overall shape that is similar to the shape of a human head. The overall shape of the head portion 202a is defined by the housing assembly 220 that includes: (i) a rear shell, rear head covering, or rear cover 234, and (ii) a frontal shell, frontal shield, frontal head covering, or frontal cover 228. In some embodiments, the head portion 202a is formed with no flat surfaces and is generally egg-shaped when viewed from the front, as shown in FIG. 4, and from the top, as shown in FIG. 3. The head portion 202a of the robot 100 changes continuously in width from top to bottom, wherein the width of the head portion 202a increases from a top or scalp end 235 to a temple region 350 where the head is widest. The temple region 350 generally corresponds to an eye level of a human, or to a location that is about 30-50% of a height 240 of the head portion 202a from the top end 235. The width of the head portion 202a then decreases from the temple region 350 to a lower or chin end 239. In this way, the head portion 202a of the robot 100 is asymmetrical about a first plane 102 passing through a center or centroid C of the head portion 202a equidistant from the top end 235 and the lower end 239. The head portion 202a of the robot 100 is symmetrical about a second plane 104 perpendicular to the first plane 102 and passing through the center or centroid C of the head portion 202a. In other embodiments, the head may be symmetrical about the first plane 102 and asymmetrical about the second plane 104.


As shown in FIG. 4, a rear shell 234 of the housing assembly 220 has a first maximum height 240 and a frontal shell 228 of the housing assembly 220 has a second maximum height 242 less than the first height 240. The rear shell 234 also has a first maximum width 244 and the frontal shell 228 has a second maximum width 246 less than the first width 244. The head portion 202a of the robot 100 has the maximum height 240 that is greater than the maximum width 244. The maximum height 240 and width 244 are both provided by the rear shell 234. The maximum width 246 of the frontal shell 228 is located above the maximum width 244 of the head portion 202a.


As shown in FIG. 5A, a depth of the head of the robot 100 is defined by a combination of both the rear shell 234 and the frontal shell 228. The depth includes a maximum depth 250 at a location that is about equal to the temple region 350 and that extends from a front or facial region 241 of the head to an occipital region 359 of the head. The front end 241 is provided by the frontal shell 228 and the rear end 243 is provided by the rear shell 234. The depth of the head changes continuously from the top end 235 to the lower end 239. The depth increases from the top end 235 to the maximum depth 250 and then decreases from the maximum depth 250 to the lower end 239. The head of the robot 100 is asymmetrical about a third plane 106 extending through the center or centroid C of the head and perpendicular to the first plane 102. The head of the robot 100 is also asymmetrical about a fourth plane 108 extending through the center or centroid C of the head and perpendicular to the second plane 104. In other words, the head of the robot 100 is only symmetrical about the second plane 104. The center C is spaced equal distances from: (i) the top end 235 and the bottom end 239, (ii) the front end 241 and the rear end 243, and (iii) the lateral sides 247, 249 of the head portion 202a. In other embodiments, the head may be symmetrical about the third plane 106 and/or symmetrical about the fourth plane 108. Stated another way, other embodiments of the head may be symmetrical about: (i) all planes 102, 104, 106, and 108, (ii) three of the four planes 102, 104, 106, and 108, (iii) two of the four planes 102, 104, 106, and 108, (iv) one of the four planes 102, 104, 106, and 108, or (v) none of the four planes 102, 104, 106, and 108.


The exterior surfaces 325, 329 of both the rear shell 234 and the frontal shell 228 are designed with a concave shape relative to the center of the head portion 202a. This concave configuration contributes to the overall streamlined and ergonomic form of the robot's head portion 202a. In contrast, the nape region 245, located at the rear of the head below the occipital region 359 and adjacent to the neck portion 202b, features a unique convex exterior surface 349. This convex surface 349 is oriented outward relative to the center C of the head portion 202a, creating a subtle protrusion that mimics the natural curvature found in human anatomy. Notably, the nape region 245 may stand as the sole area of the head portion 202a exhibiting this convex characteristic. This deliberate design choice not only enhances the anthropomorphic appearance of the robot 100 but also potentially serves functional purposes such as housing specific components or facilitating the connection between the head and neck assembly 202. In other embodiments, the exterior surfaces 325, 329 of both the rear shell 234 and the frontal shell 228 may: (i) include or incorporate ridges, channels, or textured patterns to enhance heat dissipation, improve structural rigidity, or serve as mounting points for additional sensors or components, (ii) include a modular component bay that is designed to allow access to internal components, and/or (iii) have larger flat or substantially flat surfaces. Additionally, the nape region 245 and/or other aspects or regions of the rear shell 234 may not be convex and instead may be linear, substantially linear, concave, curvilinear, straight, angled, arc-shaped, wave-shaped, parabolic, elliptical, cylindrical, tapered, segmented linear, multilinear, undulating, hyperbolic, nonlinear, lobed, irregularly curved, bowed, U-shaped, V-shaped, crescent-shaped, radial, spiral, rectilinear, polygonal, triangular curve, circular arc, inflection curve, inclined linear, fractal-like curve, disjointed linear, hyperbolic arc, S-shaped, compound linear, and/or any combination thereof.


When viewed from the front as shown in FIGS. 4 and 9, (i) the lateral sides of the head portion 202a may have a first curvature 341 generally extending about the center or centroid C, (ii) the top extent 235 of the head portion 202a may have a second curvature 343 generally extending about the center or centroid C, and (iii) the bottom end 239 may have a third curvature 345 generally extending about the center or centroid C. In some aspects, the second curvature 343 may be less than, or have a lesser degree than, the first curvature 341. The second curvature 343 may be greater than, or have a greater degree than, the third curvature 345 in some cases. The first curvature 341 may be defined between a buccal region 352 and a crown region 270 of the head 202a. The second curvature 343 may be defined by the crown region 270. The third curvature 345 may be defined between the buccal regions 352 of the head 202a. In some implementations, the head 202a may have a substantially oval shape when viewed from above as shown in FIG. 3, and may taper inwardly slightly in the frontal shell 228 toward a facial region 347 provided by the frontal shell 228 in some cases. This design enables the head portion 202a to have varying degrees of curvature in different regions to optimize visibility, sensor placement, or aesthetics. For example, the curvature of the frontal shell 228 could be more pronounced in the facial region 347 to accommodate a larger display 300, while tapering to a shallower curve towards the rear edge 322 of the frontal shell 228. In other embodiments, the head portion 202a may have angular geometries like hexagonal, octagonal, pentagonal, triangular, square, rectangular, trapezoidal, rhomboidal, parallelogram-shaped, diamond-shaped, oval, elliptical, circular, semicircular, crescent-shaped, star-shaped, heart-shaped, teardrop-shaped, or may have surfaces/edges that are linear, substantially linear, concave, curvilinear, straight, angled, arc-shaped, wave-shaped, parabolic, elliptical, cylindrical, tapered, segmented linear, multilinear, undulating, hyperbolic, nonlinear, lobed, irregularly curved, bowed, U-shaped, V-shaped, crescent-shaped, radial, spiral, rectilinear, polygonal, triangular curve, circular arc, inflection curve, inclined linear, fractal-like curve, disjointed linear, hyperbolic arc, S-shaped, compound linear, and/or any combination thereof.


2. Head Housing Assembly

The head housing assembly 220 of the head and neck assembly 202 is configured to contain and protect other assemblies contained within the head portion 202a. As discussed above, the housing assembly 220 is configured to have a form resembling the general shape of a human head and includes: (i) the frontal shell, frontal shield, frontal head covering, or frontal cover 228, (ii) the rear shell, rear head covering, or rear cover 234, (iii) an intermediate cover, intermediate support, or intermediate member 252, and (iv) an electronics support or frame 254. As discussed below, the intermediate cover 252 and electronics support 254 may be combined into a single structure. Additionally, in other embodiments, the intermediate cover 252 may be omitted and the electronics support 254 may be directly coupled to an extent of the rear shell 234. In further embodiments, the electronics support 254 may be omitted and the intermediate cover 252 may be retained. Also, the rear shell 234 may be omitted or substantially omitted and replaced by a substantially larger frontal shell 228. Moreover, the frontal shell 228 may be omitted or substantially omitted and replaced by a substantially larger rear shell 234. Finally, the intermediate cover 252 and electronics support 254 may be integrally formed as a single component, or said components may be integrally formed with one or more of the shield 228 or shell 234.


The intermediate cover 252 and the rear shell 234 mount to one another and define a first head sub-volume 236 within the housing assembly 220. The first head sub-volume 236 is configured to contain and protect one or more components used in the operation of the robot 100 such as electronics, batteries, computing components, etc. The frontal shell 228 provides a front end of the housing assembly 220 and defines a second sub-volume 238 between the intermediate cover 252 and the frontal shell 228 within the housing assembly 220. The second sub-volume 238 is separated from the first sub-volume 236 via the intermediate cover 252 and is configured to contain and protect one or more components included in the electronics assembly 222 such as a display, light emitters, cameras etc. The frontal shell 228 and/or the intermediate cover 252 can be removed from the rest of the housing assembly 220 to service components within the sub-volumes 236, 238 or to upgrade components in the sub-volumes 236, 238. The modular design allows for individual components to be replaced without requiring replacement of the entire housing.


i. Intermediate Cover


The intermediate cover 252 includes structures that are used to mount components of the electronics assembly 222 to the head portion 202a. The intermediate cover 252 is configured to couple with the rear shell 234 and is located between the first and second sub-volumes 236, 238 to separate the first and second sub-volumes 236, 238. In other words, the intermediate cover 252 is designed to split or divide the first sub-volume 236 from the second sub-volume 238. In other embodiments, the intermediate cover 252 may be omitted, and the first and second sub-volumes 236, 238 may be converted into a single sub-volume. Alternatively, the intermediate cover 252 may be combined or integrally formed with other structures disclosed herein (e.g., electronics support 254, the rear shell 234, and/or the frontal shell 228), whereby the first and second sub-volumes 236, 238 may remain or may be combined into a single sub-volume. Further, it should be understood that other mounting structures, dividers, covers, and/or plates may be included within the head to further sub-divide the housing into additional sub-volumes (e.g., 3-10 sub-volumes).


The intermediate cover 252 has an outer perimeter 256 that is sized to fit within an inset rim 258 of the rear shell 234. In this manner, the outer perimeter 256 is slightly less than the inner perimeter 251 of an extent of the rear shell 234 that is positioned between a ledge 257 and a forward edge 253 of an outer perimeter 259 of the rear shell 234. As such, the outer perimeter 256 of the intermediate cover 252 has a length that is less than the length of the outer perimeter 259 of the rear shell 234. In other embodiments, the outer perimeter 256 of the intermediate cover 252 may not be sized to fit within an inset rim 258 of the rear shell 234. Instead, said outer perimeter 256 of the intermediate cover 252 may be substantially equal to the outer perimeter 259 of the rear shell 234, wherein said intermediate cover 252 may be coupled to, positioned adjacent to, and/or abutting said forward edge 253 of the rear shell 234. In other words, a rear extent of the intermediate cover 252 may be configured to abut the forwardmost surface of said forward edge 253 of the rear shell 234. In other embodiments, the outer perimeter 256 of the intermediate cover 252 may only extend along an extent that is less than substantially all, or even less than a majority (e.g., along only a minority), of the inner perimeter 251 of an extent of the rear shell 234 that is positioned between a ledge 257 and a forward edge 253 of the outer perimeter 259.


The outer perimeter 256 of the intermediate cover 252 may be slightly less than the outer perimeter 260 of the frontal shell 228. As such, the outer perimeter 256 of the intermediate cover 252 may have a length that is less than the length of the outer perimeter 260 of the frontal shell 228. The difference in length between the two perimeters may facilitate ease of assembly by allowing the intermediate cover 252 to be more easily positioned within the frontal shell 228 without interference. In some aspects, the difference in perimeter dimensions may account for variations in material expansion, tolerances in manufacturing processes, or specific scaling requirements to ensure a secure fit between the components. In some aspects, the outer perimeter 256 of the intermediate cover 252 may be substantially equal to the outer perimeter 260 of the frontal shell 228. In other embodiments, the outer perimeter 256 of the intermediate cover 252 may extend along only a portion of the rear edge 322 of the frontal shell 228, such as less than 75%, less than 50%, or less than 25% of the rear edge 322. The selective extension of the outer perimeter 256 may be implemented to accommodate specific functional or design considerations, such as providing access to internal components, creating ventilation pathways, or facilitating maintenance procedures. For example, an intermediate cover 252 that extends along only 50% of the rear edge 322 may allow for partial disassembly without the need to remove the entire cover, thus enhancing serviceability. The intermediate cover 252 may also have an outer perimeter 256 that is larger than the outer perimeter 260 of the frontal shell 228 in some implementations. This configuration may be utilized in cases where the intermediate cover 252 is intended to provide additional protective overhang or where the intermediate cover includes integrated features such as gaskets, flanges, or attachment points that extend beyond the frontal shell 228. For instance, an enlarged intermediate cover 252 may help prevent ingress of dust or moisture, especially in environments where the device is exposed to harsh conditions. Additionally, the intermediate cover 252 may have an irregular or non-circular outer perimeter 256 that corresponds to specific internal component layouts or mounting requirements. The shape of the perimeter may be dictated by the need to accommodate uniquely shaped components, such as lights, sensors, wiring harnesses, or power modules. In certain aspects, the intermediate cover 252 may comprise multiple separate sections with individual outer perimeters that collectively form the overall outer perimeter 256. This modular approach may allow for more flexible assembly and disassembly processes, where only certain sections of the intermediate cover need to be removed to access specific internal components. Furthermore, the use of multiple sections may enable customization of the cover to suit different use cases or to accommodate future upgrades.


The intermediate cover 252 further includes a plurality of light emitter housings 262a, 262b, 262c, 262d spaced around the outer perimeter 256 of the intermediate cover 252. Specifically, each of the light emitter housings 262a, 262b, 262c, 262d may be configured to house a respective light emitting assembly 264a, 264b, 264c, 264d of the illumination assembly 223. As best shown in FIGS. 9 and 11, the light emitter housings 262a, 262b, 262c, 262d have five primary walls that are comprised of: (i) two end walls 376, 378, wherein said end walls 376, 378 are angled (e.g., at an obtuse angle) relative to a frontal surface 252F of the intermediate cover 252, (ii) a top wall 380, (iii) a bottom wall 382, wherein said bottom wall 382 is angled relative to the frontal surface 252F of the intermediate cover 252, and (iv) an interior wall 384, wherein said interior wall 384 is angled (e.g., at an obtuse angle) relative to the frontal surface 252F of the intermediate cover 252. This wall assembly primarily forms a trapezoidal body comprised of the opposed end walls 376, 378, the top wall 380 with an exterior edge and an internal edge 380a, and a sloped interior wall 384 that extends inward and toward the center of the head from the internal edge 380a. The angled configuration of the two end walls 376, 378, the bottom wall 382, and the interior wall 384 is designed to: (i) force light out of the housing 262a, 262b, 262c, 262d, and (ii) ensure that the emitted light radiates through a diffuser or lens 386 in a manner that neither lets the light scatter too broadly nor overly restricts the scattering of the light.


In other embodiments, the walls may not be angled relative to the frontal surface 252F of the intermediate cover 252, the top wall 380 may be angled relative to the frontal surface 252F of the intermediate cover 252, and/or the angles between the walls and the frontal surface 252F of the intermediate cover 252 may be an acute angle. And in further embodiments, the size and shape of the light emitter housings 262a, 262b, 262c, 262d may vary. For example, said housing may be in the shape of a cube, cuboid, sphere, cylinder, cone, pyramid, tetrahedron, prism, torus, ellipsoid, octahedron, dodecahedron, icosahedron, hemisphere, triangular prism, pentagonal prism, hexagonal prism, or any combination thereof. The housing may be customized to accommodate different types of light emitting assemblies 264 or to achieve particular illumination effects. In some cases, the light emitter housings 262a, 262b, 262c, 262d may incorporate lenses, reflective or diffusive surfaces to shape the light output. The housings 262a, 262b, 262c, 262d may also include features to facilitate heat dissipation from the light emitting assemblies 264 in certain embodiments. For example, the housings 262a, 262b, 262c, 262d could include fins, channels, or other structures to increase surface area for heat transfer. The housings 262a, 262b, 262c, 262d may also be made of thermally conductive materials like aluminum or copper alloys to enhance heat dissipation. The housings 262a, 262b, 262c, 262d may incorporate electromagnetic shielding materials or structures to prevent interference between the light emitting assemblies 264 and other electronic components in the robot head. This could include conductive coatings, metal mesh layers, or other EMI shielding techniques integrated into the housing design. Further, the housings may be omitted and the illumination assembly 223 may be a part of or integrally formed with any aspect of the head housing assembly 220.


ii. Electronics Support


The electronics support 254 is mounted to a generally central area of the intermediate cover 252 and is configured to position a display 300 included in the electronics assembly 222 between the intermediate cover 252 and the frontal shell 228 within the second head sub-volume 238. The electronics support 254 includes a base coupling 266 configured to mount to the intermediate cover 252 and a display coupling 268 configured to mount the display 300 to the electronics support 254. The base coupling 266 is located above and rearward of the display coupling 268. The display coupling 268 positions the display 300 in spaced apart relation to the frontal shell 228 as shown in FIG. 15. As shown in FIGS. 26 and 27, the intermediate cover 252 is configured to lie behind the electronics assembly 222 to locate at least a portion of the electronics assembly 222 between the intermediate cover 252 and the frontal shell 228.


The base coupling 266 has a width that is less than that of the intermediate cover 252 to locate the electronics support 254 within the perimeter 256 of the intermediate cover 252. The base coupling 266 includes left and right support brackets 266a, 266b configured to support portions of the electronics assembly 222 above the display 300. In the illustrative embodiment shown in FIGS. 9-12, a sensor assembly 301 including three upper cameras 302a1, 302a2, 302a3 is mounted to both of the brackets 266a, 266b and oriented to face in a forward direction of the robot 100.


iii. Frontal Shell


The frontal shell 228 is configured to cover at least the intermediate cover 252 and the electronics assembly 222 as shown in FIGS. 1-16. The frontal shell 228 forms a forwardmost, exterior surface of the head portion 202a and cooperates with the intermediate cover 252 to define the second sub-volume 238 in the housing assembly 220. The frontal shell 228 may be made from a transparent material so that the display 300 mounted on the electronics support 254 may be viewed therethrough. In other embodiments, the frontal shell 228 may be tinted or opaque. The intermediate cover 252 can be colored (tinted or opaque) similarly so that the frontal shell 228 has a similar appearance to the intermediate cover 252 and so that the lights displayed on the display 300 are the only items visible or conspicuous through the frontal shell 228. The frontal shell may also include micro-optical elements such as Fresnel lenses or diffractive optical elements in specific areas to enhance the performance of internal sensors or create novel lighting effects without additional hardware.


The frontal shell 228 may be coated, etched, or formed with a plurality of layers (e.g., examples of which are disclosed within U.S. Pat. Nos. 8,770,749; 9,134,547; 9,383,594; 9,575,335; and 9,910,297, all of which are incorporated herein by reference) in a manner that improves durability, increases sensor accuracy, filters one or more specific wavelengths, reduces glare, enhances appearance, reduces fog, makes the frontal shell 228 easier to clean, or protects it from cleaning products. Examples of such optical coatings include anti-reflection coatings, mirror coatings, hard coatings, anti-static coatings, and anti-fog coatings, some of which are described within U.S. patent application Ser. Nos. 16/896,016, 16/698,775, 16/417,311, 16/126,983, 15/359,317, and 15/515,966, each of which is incorporated herein by reference. Further, the material composition, shape, number of layers, and composition of said layers may differ from one part of the frontal shell 228 to another. In other words, these properties may vary across the frontal shell 228. It should be understood that this disclosure is not limited to the information disclosed within those applications, but instead includes any compositions, shapes, numbers of layers, or layer compositions that are known in the art or are obvious in light of what is known in the art.


The frontal shell 228 or an extent of the frontal shell 228 may have a substantially uniform thickness, which may be equal to or greater than 1 mm and preferably greater than 2 mm. Additionally, the frontal shell 228 may be optically correct and may not be a corrective lens. As such, the frontal shell 228 has a dioptric power of less than 0.25 diopters, preferably less than 0.12 diopters, and most preferably less than 0.06 diopters. The frontal shell 228 may have a reverse/negative pantoscopic tilt, a forward/positive pantoscopic tilt, or no pantoscopic tilt. The frontal shell 228 may be made from or may include polycarbonate (PC), acrylic (PMMA), Trivex, nylon, Gorilla Glass (aluminosilicate glass), thermoplastic polyurethane (TPU), high-grade glass, CR-39, polyethylene terephthalate (PET), polystyrene, fused silica (quartz glass), borosilicate glass, polyurethane, cellulose acetate, polyvinyl chloride (PVC), cellulose acetate butyrate (CAB), polyvinyl butyral (PVB), optical-grade resin, sapphire glass, polyetherimide (PEI), Lexan, thermoset plastics, other anti-scratch coated plastics, or any other similar material that is known in the art.
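As a worked illustration of the dioptric-power requirement, the sketch below estimates the power of a thin, uniformly thick curved shell using the thick-lens form of the lensmaker's equation; the refractive index, radius, and thickness are assumed example values (roughly polycarbonate with a 2 mm wall), not dimensions taken from this disclosure.

```python
# Minimal sketch: estimate the dioptric power of a uniformly thick, curved
# shell with the thick-lens lensmaker's equation. All numeric inputs are
# assumed example values, not values from this disclosure.

def shell_power_diopters(n: float, r1_m: float, r2_m: float, t_m: float) -> float:
    """Optical power (diopters) of a shell with outer radius r1, inner radius
    r2, and center thickness t (all in meters), made of material with index n."""
    return (n - 1.0) * (1.0 / r1_m - 1.0 / r2_m + (n - 1.0) * t_m / (n * r1_m * r2_m))

n = 1.586                       # assumed: refractive index of polycarbonate
r_outer = 0.130                 # assumed: 130 mm outer radius of curvature
thickness = 0.002               # 2 mm uniform wall
r_inner = r_outer - thickness   # concentric inner surface
print(f"{shell_power_diopters(n, r_outer, r_inner, thickness):.3f} D")  # ~ -0.044 D
```

Under these assumptions, the residual power is on the order of 0.04 diopters in magnitude, comfortably within the sub-0.25-diopter range described above.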


In the illustrative embodiment shown in FIGS. 1-16, the frontal shell 228 is mounted to the intermediate cover 252 and/or the rear shell 234 along a shield interface 320 at the outer perimeters 256, 260 of the rear shell 234 and the frontal shell 228. A rear facing edge 322 of the frontal shell 228 is configured to mount with: (i) a frontal edge 253 of the rear shell 234, (ii) a frontal extent of the intermediate cover 252, and/or (iii) both to form the shield interface 320. As shown in this embodiment, the shield interface 320 is not flat or planar. Instead, the shield interface 320 is irregular because the rear edge 322 of the frontal shell 228 is formed to include a plurality of recesses 324. Each recess 324 is sized to receive a respective light emitter housing 262a, 262b, 262c, 262d formed in the intermediate cover 252, each of which houses a respective light emitting assembly 264. In some embodiments, the frontal shell 228 itself can include the light emitter housings 262a, 262b, 262c, 262d, although this may not be desirable as the frontal shell 228 is the component that is most likely to be removed from the housing assembly 220 to service the head portion 202a and the electronics assembly 222.


Except for the recesses 324, the rear facing edge 322 of the frontal shell 228 is substantially planar along the interface 320. The shield interface 320 between the intermediate cover 252 and the frontal shell 228, and the rear-facing edge 322, extends at an angle A2 to the third plane 106 as shown in FIGS. 5 and 10. In the illustrative embodiment, the angle is within a range of about 15 degrees to about 50 degrees, preferably between 20 and 40 degrees, most preferably between 25 and 35 degrees, and may be approximately 30 degrees. This angle relationship provides the frontal shell 228 with a larger depth at a top end thereof to increase a volume of the second sub-volume 238 and provide more room for components of the electronics assembly 222. An upper end of the frontal shell 228 near an upper extent of the head portion 202a is located rearward of the third plane 106 while a lower end of the frontal shell 228 near a chin region 355 is located forward of the third plane 106 as shown in FIG. 5A. There are no recesses formed in the frontal shell 228 in the orbital region 368, the nasal region 357, the oral region 366, or the frontal region 362. The frontal shell 228 extends upward from an extent of the rear shell 234 that is positioned in the chin region 355, over a majority of the facial region 347, and into or beyond a frontal edge of the parietal region 360. The frontal shell 228 has an outer surface occupying at least an orbital region 368 and a nasal region 357. The orbital region 368 of the frontal shell 228 is not recessed in comparison to the nasal region 357 of the frontal shell 228.


The depth change of the frontal shell 228 positions a first light emitting assembly 264a and a third light emitting assembly 264c above and rearward of a second light emitting assembly 264b and a fourth light emitting assembly 264d. Such an arrangement provides a greater viewing area for users to observe at least one light emitting assembly 264 from different orientations relative to the robot 100. In other embodiments, the first and third light emitting assemblies 264a, 264c may not be positioned rearward of the second and/or fourth light emitting assemblies 264b, 264d. Instead, the second and/or fourth light emitting assemblies 264b, 264d may be positioned in the same vertical plane as, or rearward of, the first and third light emitting assemblies 264a, 264c.


The frontal shell 228 may: (i) wrap from the front of the head into the side regions of the head, (ii) extend into the chin area or cover the entire chin area, and/or (iii) have a non-uniform rear edge, which may be formed by a plurality of recesses. The plurality of recesses may be configured to receive an extent of the light emitter housings 262a, 262b, 262c, 262d. In some aspects, the frontal shell 228 may not extend to the crown of the head and/or may not extend rearward past a location where a human's ears would be located. The frontal shell may occupy between 25% and 50% of the head portion 202a and may be curved in at least two directions (e.g., vertically and horizontally). In some embodiments, the frontal shell 228 and the display 300 may be integrated into a single component or may be formed from a plurality of components. The frontal shell 228 may have a different curvature than the display 300. In other embodiments, the frontal shell 228 may extend to the crown of the head or past the ears. The frontal shell 228 may occupy more or less of the head portion 202a in some cases. The curvature and integration of the frontal shell 228 and display 300 may vary in different implementations.


As shown in FIGS. 9 and 10, the frontal shell 228 may include, at least: (i) a first arc length 1120 at a first location or horizontal plane 120, (ii) a second arc length 1122 at a second location or horizontal plane 122, (iii) a third arc length 1124 at a third location or horizontal plane 124, and (iv) a fourth arc length 1126 at a fourth location or horizontal plane 126. Although the arc lengths are shown as straight lines in FIG. 9, it should be understood that the arc lengths follow the curvature of the frontal shell along each respective horizontal plane from the rear edge 322 on each side of the frontal shell 228. The frontal shell 228 is concave relative to the display 300 at each location 120, 122, 124, 126 such that the frontal shell extends at least partially about the display 300. The first arc length 1120 occurs below the display 300. The second arc length 1122 occurs at the center C of the head portion 202a, is aligned with or intersects the display 300, and is greater than the first arc length 1120. The third arc length 1124 occurs above the display 300, and is greater than the second arc length 1122. The fourth arc length 1126 occurs above the third arc length 1124 and the display 300 and is less than the second and third arc lengths 1122, 1124. As shown in FIG. 4, the frontal shell 228 has, at least: (i) a first width W1 at a first location below the first or horizontal plane 120, (ii) a second width W2 at a second location or at the horizontal plane 102, and (iii) a third width W3 at a third location above the first plane 102. The first width W1 is less than both the second and third widths W2, W3, the second width W2 is greater than both the first and third widths W1, W3, and the third width W3 is greater than the first width W1 and less than the second width W2.


In some embodiments, the frontal shell 228 may incorporate a variable transparency feature, allowing it to switch between transparent, translucent, and opaque states. This could be achieved through the use of electrochromic materials or liquid crystal layers embedded within the shell structure. Such a feature may enable dynamic control over the visibility of internal components and displays, enhancing both functionality and aesthetics. The frontal shell may also incorporate embedded flexible electronic circuits, displays, and/or conductive pathways. These could serve multiple purposes, such as acting as antennas for improved wireless communication, providing touch-sensitive areas for user interaction, or enabling localized heating to prevent fogging in challenging environments. The frontal shell may utilize an advanced multi-layer coating system, combining various functional properties. For example, a hydrophobic outer layer may provide water repellency, a middle layer with self-healing properties may repair minor scratches, and an inner layer with electromagnetic shielding capabilities may protect sensitive electronics from interference. Further, the frontal shell may include integrated micro-lens arrays or diffractive optical elements. These could be used to enhance the performance of internal sensors, create specific lighting effects, or even project information onto nearby surfaces without the need for additional hardware. In some aspects, while the head portion has a modular design, the frontal shell could also have a modular design to allow for easy replacement or customization of specific sections. This could include interchangeable panels with different optical properties, sensor arrays, or display technologies, enabling rapid adaptation to various operational requirements or upgrades.


iv. Rear Shell


The rear shell 234 is shaped to resemble the curvature of a rear and sides of the head, or at least portions of a parietal region 360, an occipital region 359, a temporal region 350, an auricular region 356, a zygomatic region 358, a mastoid region 364, a buccal region 352, and a parotid region 354. As shown in FIGS. 5B and 15B, the interface region 320 between the rear shell 234 and the frontal shell 228 passes through the buccal region 352, the parotid region 354, the zygomatic region 358, the temporal region 350, the parietal region 360, and the crown region 270. In other words, the rear shell 234 begins at the parotid region 354, the zygomatic region 358, the temporal region 350, the parietal region 360, and the crown region 270 and forms all portions of the head portion 202a rearward thereof. Likewise, the frontal shell 228 begins at the parotid region 354, the zygomatic region 358, the temporal region 350, the parietal region 360, and the crown region 270 and forms all portions of the head portion 202a forward thereof except for a chin region 355 which is included in the rear shell 234. Also, it should be understood that the frontal shell may only be positioned forward of the auricular region 356.


The rear shell 234 may be configured to cover a rear portion of the electronics assembly 222 and to form a rear end of the head portion 202a. The rear shell 234 may extend downward from a top central position and forward at an angle substantially similar to the rear facing edge 322 of the frontal shell 228. The rear shell 234 may include a forward facing edge 298 configured to mate with the rear facing edge 322 of the frontal shell 228 and/or with the intermediate cover 252. As shown in FIG. 16, the rear shell 234 may have an outer rim 255 and a ledge 257 projecting inwardly from the outer rim 255 and located in the first sub-volume 236. The rim 255 may have a slightly larger outer perimeter 259 than the outer perimeter 256 of the intermediate cover 252 to allow the intermediate cover 252 to fit within the rim 255 and engage the ledge 257.


The ledge 257 and the intermediate cover 252 may have corresponding attachment holes 261a, 261b that receive a fastener to mount the intermediate cover 252 to the rear shell 234. Apertures 261c formed in the frontal shell 228 may also receive a fastener to mount the frontal shell 228 to both the intermediate cover 252 and the rear shell 234. Additionally, the interface region between the rear shell 234 and the frontal shell 228 may include an interlocking mechanism, such as a tongue-and-groove design or a series of small tabs and slots, to provide a more secure connection in addition to or instead of fasteners. This could improve the overall structural integrity of the head assembly. The interface region may also incorporate other types of fasteners or connection mechanisms, such as snap-fit connections, magnetic attachments, bayonet couplers, threaded couplers, friction-fit couplers, quick-release couplers, ball-and-socket joints, twist-lock couplers, dovetail joints, latch mechanisms, spring-loaded couplers, sliding locks, compression couplings, cam-lock couplers, clamp-on attachments, pin-and-hole connections, key-and-slot joints, or any combination thereof. In some aspects, the interface region may utilize multiple types of fasteners or connection mechanisms in different areas to optimize assembly, disassembly, and structural support.


The rear shell 234 may include a chin projection region 263 that extends forward from the rim 255 and defines a lower end of the rear shell 234. The intermediate cover 252 may include a lower mount 265 that engages and rests on the chin projection region 263. In some aspects, the chin projection 263 and the lower mount 265 each include corresponding attachment holes 267a, 267b that receive a fastener to secure the intermediate cover 252 to the rear shell 234 at a lower end thereof. Apertures 267c formed in the frontal shell 228 may also receive a fastener to mount the frontal shell 228 to both the intermediate cover 252 and the rear shell 234. Alternatively, the apertures 267c and fasteners may be replaced using snap-fit connections, magnetic attachments, interlocking geometries, bayonet couplers, threaded couplers, friction-fit couplers, quick-release couplers, ball-and-socket joints, twist-lock couplers, dovetail joints, latch mechanisms, spring-loaded couplers, sliding locks, compression couplings, cam-lock couplers, clamp-on attachments, pin-and-hole connections, key-and-slot joints, other similar couplers, or any other known couplers. In other embodiments, the chin projection region could be designed as a separate, detachable component that can be swapped out for different shapes or sizes.


The rear shell 234 may incorporate a multi-layered structure with varying properties to optimize functionality and performance. The outer layer can provide protection against impacts and environmental exposure, while inner layers help absorb shocks and vibrations to safeguard internal components. The composition of the rear shell 234 may vary across different regions to balance structural support and flexibility, accommodating movement or internal component adjustments. Integrated cable management channels or conduits can improve the routing of wires and connectors, enhancing both aesthetics and ease of maintenance. These channels may include secure covers to prevent accidental damage. The rear shell 234 may also feature modular sections that are removable or adjustable for customization and upgrades, such as access panels for frequently serviced components. Magnetic attachments can be used to secure external accessories, enabling quick reconfiguration without mechanical fasteners. Additionally, interlocking geometries along edges or connection points can improve assembly precision and distribute mechanical loads more evenly, enhancing the structural integrity of the device. Advanced designs may embed sensors or connectors within the shell to detect impacts, provide real-time feedback on structural integrity, or facilitate internal power and data transfer without visible wiring, contributing to a cleaner, streamlined design.


The rear shell 234 may be formed of or include silicone elastomers, thermoplastic polyurethane (TPU), shape-memory polymers (SMPs), polydimethylsiloxane (PDMS), polyurethane, liquid silicone rubber (LSR), urethane rubber, vinyl (PVC) skin, soft thermoplastic elastomers (TPE), elastomeric alloys, acrylonitrile butadiene styrene (ABS) blends, high-density polyethylene (HDPE) blends, conductive polymers, carbon nanotube-infused elastomers, magnetic shape-memory alloys, electroactive polymers (EAPs), styrene-butadiene rubber (SBR), thermoplastic vulcanizates (TPV), polyurea elastomers, medical-grade synthetic skin materials, thermoplastic olefins (TPO), fluoroelastomers, chloroprene rubber, ethylene propylene diene monomer (EPDM) rubber, polyacrylamide hydrogels, polycaprolactone (PCL), photocurable resins, elastomeric composites, phosphorescent elastomers, thermochromic materials, electrostrictive polymers, piezoelectric polymers, superelastic alloys, microcellular foams, hyperelastic materials, viscoelastic gels, nanocomposite elastomers, fabrics, metal, other similar plastics or polymers, any combination of the above, and/or any other similar material known in the art.


The rear shell 234 may be fabricated using various manufacturing techniques, each offering unique benefits based on design requirements, material properties, and production efficiency. These methods include injection molding, dip molding, casting, additive manufacturing methods (e.g., stereolithography, fused deposition modeling (FDM), and selective laser sintering (SLS)), spray coating, lamination or layering, electrospinning, sculpting or precision machining, thermoforming, or other similar manufacturing methods or combinations thereof. In one exemplary hybrid approach, the primary structure of the rear shell 234 could be 3D printed using a high-strength polymer, such as nylon or polycarbonate, to achieve both durability and complex geometries. Overmolding certain regions with softer elastomeric materials, like thermoplastic elastomers (TPE) or silicone rubber, can enhance flexibility, shock absorption, and user comfort in areas subject to impact or repeated stress. This combination ensures a balance between rigidity and pliability, improving both mechanical performance and ergonomic benefits. Furthermore, additional post-processing steps can be incorporated to tailor the rear shell 234 for specific applications. In some cases, conductive coatings may be applied to enable electromagnetic shielding or to support smart functionalities, such as integrating sensors or antennas directly into the shell's surface. Additionally, certain regions of the rear shell, particularly around joints or flex points, could incorporate shape-memory alloys or polymers. These materials could allow for controlled deformation and return to original shape, accommodating movement while maintaining overall structural integrity. Further, the rear shell could feature integrated cable management channels or conduits, integrated cooling channels or heat sinks, or separate, detachable components.


3. Electronics Assembly

The electronics assembly 222 contained in the head portion 202a may include one or more of: (i) a sensor assembly 301, (ii) a display 300, (iii) an illumination assembly 223 that includes at least one, and preferably a plurality of, light emitting assemblies 264, and (iv) other electronics (e.g., IMU, RFID reader, location sensors (e.g., Global Positioning System ("GPS"), GLONASS, Galileo, QZSS, and/or iBeacon), etc.), and/or PCBs for connecting the electronics. As shown in at least FIGS. 2-4 and 8-12, at least some components of the electronics assembly 222 may be mounted to the electronics support 254 and located above the display 300 or to any other structure of the head and neck assembly 202. Mounting the electronics assembly 222 to the electronics support 254 above the display 300 and within the second sub-volume 238 may arrange the individual items in a space-saving manner and leave other areas of the head and neck assembly 202 open for storage of components that, unlike sensors, cameras, and displays, do not need direct access to areas outside of the robot 100. As noted above, the housing assembly 220 is configured to enclose the electronics assembly 222 without interfering with the transmission or reception of signals. For example, the housing assembly 220 does not obscure the line of sight or field of view of the sensors.


i. Display


As best shown in FIGS. 8-13, the display 300 of the electronics assembly 222 may be mounted to the electronics support 254 and positioned behind the frontal shell 228. The display 300 may be operatively connected to at least one processor to generate and display status messages and other information on the display 300. For example, the display 300 may display information: (i) related to the robot's state (e.g., working, error, moving, charging, or idle), (ii) obtained from sensors contained within the head portion 202a, including but not limited to cameras 302, 303, proximity sensors, temperature sensors, and accelerometers, or (iii) received from other processors in communication with the display 300 (e.g., other internal processors housed within the robot 100 or external information transmitted and received by the robot). The display 300 may also be configured to display alerts or warnings, such as low battery notifications, obstacle detection alerts, or maintenance reminders. The information displayed on the display 300 may be presented in various formats, including text-based messages, graphical icons, animations, and dynamic visual indicators. These visual indicators may take the form of color-coded blocks, well-known shapes, logos, or other moving items (e.g., thought bubbles, arrows indicating direction of movement, or animated progress bars). The display 300 may further support interactive features, allowing users to provide inputs via touch or proximity gestures, depending on the implementation. However, the information displayed on the display 300 may be restricted from showing human facial features (e.g., eyes, mouth, nose) to avoid any anthropomorphic representations that might be confusing or misleading to users. Instead, the display is designed to convey functional and operational information in a clear and efficient manner that enhances user interaction without mimicking human expressions.
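As a minimal, hypothetical sketch of the status-reporting logic described above, the snippet below maps a robot state and a few sensor readings to the text and icon rendered on the display; the state names, thresholds, and icon identifiers are illustrative assumptions, not part of this disclosure.

```python
# Hypothetical sketch of status-to-display mapping; state names, thresholds,
# and icon identifiers are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class RobotStatus:
    state: str            # e.g. "working", "error", "moving", "charging", "idle"
    battery_pct: float    # 0-100
    camera_ready: bool

def compose_display(status: RobotStatus) -> dict:
    """Return a simple description of what the display should render."""
    icon = {"working": "gear", "error": "alert_triangle", "moving": "arrow",
            "charging": "bolt", "idle": "pause"}.get(status.state, "info")
    lines = [f"State: {status.state}", f"Battery: {status.battery_pct:.0f}%"]
    if not status.camera_ready:
        lines.append("Camera initializing...")
    if status.battery_pct < 15:
        lines.append("Low battery - please recharge")
    return {"icon": icon, "text": lines}

print(compose_display(RobotStatus("idle", 12.0, True)))
```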


As shown in FIGS. 7A-16, the display 300 may have a substantially rectangular display surface that has a convex curvature conforming to the curvature of the frontal shell 228 of the housing assembly 220. The curvature of the display 300 provides an aesthetically pleasing integration with the overall design of the device and enhances the seamless appearance of the frontal surface. The display 300 may be slightly tilted downward to improve visibility and user interaction. For example, the display 300 may be tilted from a horizontal plane at an angle of about 5.7 to about 8.6 degrees, or about 6.4 to about 7.9 degrees. This tilt configuration improves ergonomic usability by aligning the display with a typical user's line of sight, reducing the need for head or neck adjustment during prolonged use. The downward tilt also helps to mitigate unwanted glare and reflections from ambient light sources, thereby enhancing the clarity and readability of the display content under various lighting conditions. The display 300 may incorporate any known technology or feature to achieve optimal performance and energy efficiency, including but not limited to: liquid crystal display (LCD), light-emitting diode (LED), organic light-emitting diode (OLED), laser phosphor display (LPD), interferometric modulator display (IMOD), quantum dot light-emitting diode (QDLED), micro-light-emitting diode (mLED), active-matrix organic light-emitting diode (AMOLED), surface-conduction electron-emitter display (SED), field emission display (FED), plasma display, electronic paper or electrophoretic display (EPD), MicroLED, quantum dot display, LED-backlit liquid crystal display, white liquid crystal display (WLCD), organic liquid crystal display (OLCD), transparent OLED, passive-matrix OLED (PMOLED), capacitive touch display, resistive touch display, e-ink display, other bistable display technologies, monochrome displays, color displays, or any combination thereof. The display may also include advanced features such as high dynamic range (HDR), anti-reflective coatings, wide color gamut (WCG), variable refresh rates, adaptive brightness control, and touch sensitivity enhancements. The selection of display technology can be tailored to specific use cases, such as low-power consumption for battery-operated devices or high-resolution imaging for applications requiring detailed visual output. Furthermore, the display 300 may support additional functionalities, such as multi-touch input, gesture recognition, and/or haptic feedback. The display may be configured to operate in different modes, including a low-power mode for extended battery life or a high-brightness mode for outdoor visibility.


In addition, the display 300 may be segmented into multiple independently controllable zones. This would allow for selective activation of display areas, potentially conserving power or enabling more complex information presentation strategies. For example, only the relevant portions of the display 300 could be activated based on the robot's current task or status. Further, the display 300 could utilize adaptive brightness and contrast adjustment based on ambient lighting conditions, ensuring optimal visibility across a wide range of environments. This feature could be particularly useful for robots operating in variable lighting conditions. Additionally or alternatively, the display 300 may incorporate augmented reality (AR) elements, overlaying digital information onto the real-world view seen through transparent portions of the display. This could enhance the robot's ability to provide context-aware information or instructions. The display 300 disclosed herein meets the standards described in FDA CFR Title 21 part 1040.10, titled Performance Standards for Light-Emitting Products, and ANSI LIA Z136.1, titled Safe Use of Lasers, at the time of filing this application, both of which are incorporated herein by reference. In other embodiments, the robot 100 may include a projection system in addition to or instead of the integrated display. This could allow for displaying information on nearby surfaces or creating holographic-like interfaces in the space in front of the robot 100.
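One way the zoned activation and adaptive brightness described above could be realized is sketched below; the lux breakpoints, zone names, and linear interpolation are assumptions chosen for illustration only.

```python
# Sketch of adaptive brightness and selective zone activation; the lux
# breakpoints and zone names are assumptions, not values from this disclosure.

def adaptive_brightness(ambient_lux: float,
                        low=(10.0, 0.2), high=(1000.0, 1.0)) -> float:
    """Linearly interpolate display brightness (0..1) from ambient light."""
    lo_lux, lo_b = low
    hi_lux, hi_b = high
    if ambient_lux <= lo_lux:
        return lo_b
    if ambient_lux >= hi_lux:
        return hi_b
    frac = (ambient_lux - lo_lux) / (hi_lux - lo_lux)
    return lo_b + frac * (hi_b - lo_b)

def active_zones(task: str):
    """Enable only the display zones relevant to the current task."""
    zones = {"navigation": ["center"],
             "diagnostics": ["center", "left", "right"],
             "idle": ["center"]}
    return zones.get(task, ["center"])

print(adaptive_brightness(250.0), active_zones("diagnostics"))
```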


It should be understood that this application contemplates the use of at least one display and potentially a plurality of displays (e.g., between 2 and 5). Additionally, this application also contemplates utilizing displays that have different sizes. To this end, the display may extend between any of the lines shown in FIG. 13. For example, the display may extend from the third line from the bottom to the third line from the top. Additionally, each of the lines on the display 300 in FIG. 13 can represent different zones included in the display 300 and used to convey different images or other visual representations across the display 300. As an example, sides of the display 300 can be used to display a different image or visual representation compared to a front of the display 300. Alternative display sizes may be used: (i) to reduce the surface area of fragile elements within the robot, (ii) because the robot is not designed to work near humans, (iii) because additional area within the head is needed for sensors or other electronics, or (iv) for any other reason known to one of skill in the art. The disclosed display may be embedded in or occupy the entire frontal shell 228, between 100% and 75% of the frontal shell, between 75% and 50% of the frontal shell, between 50% and 25% of the frontal shell, or less than 25% of the frontal shell. In some examples, the display may utilize the full frontal shell 228.


As illustrated in FIG. 13, the display 300 is designed to curve along the contours of the humanoid robot's facial structure, aligning with various regions of the face to provide a seamless visual interface. Specifically, the display 300 conforms to and presents information through all or portions of key or important facial regions, including the orbital region 368, which surrounds the eyes, the frontal region 362 encompassing the forehead, the temporal region 350 located near the temples, the zygomatic region 358 corresponding to the cheekbones, the nasal region 357 covering the nose, the infraorbital region 363 beneath the eyes, the buccal region 352 associated with the cheeks, and the oral region 366 surrounding the mouth. This curvature enhances the robot's ability to display expressive content or important visual cues directly on its face, improving user interaction and engagement. Conversely, the display 300 does not extend to any regions located rearward of the interface 320. These excluded regions include the mental region 355, corresponding to the chin, the auricular region 356 encompassing the ears, the crown region 270 at the top of the head, the parietal region 360 located on the sides and upper back of the head, the occipital region 359 at the lower back of the head, and the mastoid region 364 behind the ears. By limiting the display to the frontal facial areas, the design ensures that visual outputs remain within the primary field of view of human observers, optimizing the robot's communicative abilities. In some embodiments, the display 300 may be configured to present information, indications, or dynamic visual representations across a broader surface area. For example, the display may cover: (i) the entire frontal shell 228 of the robot's head, providing a comprehensive visual interface; (ii) a majority of the frontal shell 228, focusing on the most expressive regions; (iii) the entire facial region 241, ensuring full-face display coverage; or (iv) a majority of the facial region 241, balancing display utility with structural design considerations. Alternatively, in certain embodiments, the display 300 may be entirely omitted from the facial region or repositioned to other parts of the robot's body. For instance, the display could be integrated into the robot's torso or another suitable location, depending on the specific application requirements or design preferences. This flexibility in display placement allows for customization based on the intended use case, whether it be for humanoid interaction, information dissemination, or other functionalities.



FIGS. 7A-7D show various statuses and corresponding indications or visual representations that are contemplated by this disclosure. It should be noted that the present disclosure is not limited to these statuses and corresponding indications or visual representations and that these are merely examples that can be displayed by the display 300. FIG. 7A shows a set of information displayed by the display 300 including device status (e.g., camera status and robot start-up status), robot part status (e.g., a specified joint status), and a battery level status. The various statuses disclosed herein can update as the status changes. For example, during start-up of the camera, the display 300 may display text, an icon, or another visual representation indicating that the camera is initializing and is not ready for use. The status can change to display an icon or another visual representation indicating that the camera is active when start-up is complete. FIG. 7B shows a robot status icon. Illustratively, the status icon depicted is a pause icon. The pause icon can be displayed when the robot 100 is currently not completing any tasks or modes and is ready for instructions. Other icons corresponding to additional robot statuses that can be completed by the robot can also be displayed. FIG. 7C shows another icon indicating an alert or system failure event. The icon includes a triangle with an exclamation mark inside to indicate the alert or system failure event. Such an icon may be displayed when a part of the robot 100 has failed, such as an actuator, a camera, or another device included in the robot 100. Identifying information in the form of text or another icon can also be displayed with the alert or failure icon to specify which part of the robot 100 has failed or needs attention. FIG. 7D shows an icon indicating a particular mode of the robot. Illustratively, the mode depicted indicates a follow mode in which the robot 100 is engaged in following a user, another device, or another robot. The icon includes a generally human shaped icon with a border around the generally human shaped icon. Other icons corresponding to additional modes of operation of the robot can also be displayed.


ii. Head Illumination Assembly


The head illumination assembly 223 includes at least one, and preferably a plurality of, light emitting assemblies 264 located on lateral sides of the head portion 202a. In certain configurations, the illumination assembly may be designed to visually indicate robot statuses to users viewing the robot 100 from the side. As shown in FIGS. 5B, 10B, 21, and 27, a first light emitting assembly 264a is located in a temporal region 350 of the robot's head portion 202a and a second light emitting assembly 264b is located in a buccal region 352 of the robot's head portion 202a. Third and fourth light emitting assemblies 264c, 264d are located symmetrically on the opposite side of the robot's head, and so are also located in the temporal region 350 and the buccal region 352, respectively. In some embodiments, the light emitting assemblies 264 can be located all or partially in a parotid region 354 of the robot's head portion 202a, an auricular region 356 of the robot's head portion 202a, a zygomatic region 358 of the robot's head portion 202a, a parietal region 360 of the robot's head portion 202a, a frontal region 362 of the robot's head portion 202a, or a mastoid region 364 of the robot's head portion 202a so long as the light emitters 264 are positioned on a lateral side of the robot's head portion 202a so as to be visible to a person standing next to the robot 100. These positions of the light emitting assemblies 264 allow users to view the light emitted from said light emitting assemblies 264 from the side while the robot 100 is working on a task in an assembly line, for example, and while the display 300 is facing the assembly line and may not be entirely visible by the user. Further, the light emitting assemblies 264 may face away from the display 300 so as not to obstruct the information displayed by the display 300 and face away from other sensors so as not to interfere with the sensors. Other regions of the head where the light emitting assemblies 264 are not typically found include a chin or mental region 355, orbital region 368, nasal region 357, crown region 270, and occipital region 359.


The light emitting assemblies 264 in the head portion 202a may be configured to display a status of the robot 100, or a part thereof, to users. As such, the light emitting assemblies 264 may be able to alter their color (e.g., visible and non-visible), intensity, duration of when they are on/off, etc. In one embodiment, the light emitting assemblies 264 may display a first color (i.e., green—550 nm) when the robot is engaged in a task, such as assembling a part on an assembly line. The light emitting assemblies 264 can display a second color (i.e., yellow—600 nm) when the robot 100 is not assigned to a task to indicate to users that the robot 100 is available for a task. The light emitting assemblies 264 can display a third color (i.e., red—665 nm) when the robot 100 is low on battery life and should be recharged. Additionally, the light emitting assemblies 264 may display a variety of other colors, patterns, or may utilize display sequences to convey different statuses or alerts. For example, the assemblies can display a flashing blue light (470 nm) to indicate that the robot 100 is in a standby mode and awaiting further instructions. A pulsing white light can be used to indicate that the robot 100 is undergoing a system update or performing a self-diagnostic check. Alternatively, the light emitting assemblies 264 can display a purple light to indicate that the robot 100 is in a training mode, learning a new task or recalibrating its sensors. Further, the light emitting assemblies 264 can blink repeatedly to indicate that the robot 100 has lost communication with a host server or external device or is attempting to pair with or searching for a device or server to connect to. Finally, the light emitting assemblies 264 may emit non-visible light (e.g., infrared, ultraviolet) that enables the robot 100 to communicate with other robots or systems equipped with appropriate sensors.
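A hypothetical mapping consistent with the color and pattern examples above is sketched below; the wavelengths follow the examples in the text, while the RGB values, pattern names, and the color chosen for the lost-communication state are assumptions for illustration.

```python
# Hypothetical mapping of robot statuses to lateral light-emitter output;
# RGB values and pattern names are illustrative assumptions.

STATUS_LIGHTS = {
    "working":     {"wavelength_nm": 550,  "rgb": (0, 255, 0),     "pattern": "solid"},
    "available":   {"wavelength_nm": 600,  "rgb": (255, 200, 0),   "pattern": "solid"},
    "low_battery": {"wavelength_nm": 665,  "rgb": (255, 0, 0),     "pattern": "solid"},
    "standby":     {"wavelength_nm": 470,  "rgb": (0, 0, 255),     "pattern": "flash"},
    "self_check":  {"wavelength_nm": None, "rgb": (255, 255, 255), "pattern": "pulse"},
    "training":    {"wavelength_nm": None, "rgb": (160, 0, 200),   "pattern": "solid"},
    "link_lost":   {"wavelength_nm": None, "rgb": (255, 255, 0),   "pattern": "blink"},  # assumed color
}

def light_command(status: str) -> dict:
    """Return the color/pattern the light emitting assemblies should show."""
    return STATUS_LIGHTS.get(status, {"rgb": (0, 0, 0), "pattern": "off"})

print(light_command("low_battery"))
```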


In some embodiments, the light emitting assemblies 264 may also use dynamic lighting patterns, such as slow pulses, fast blinks, or color gradients, to communicate additional information. For instance, a gradual transition from green to yellow may indicate that the robot 100 is completing a task and will soon be available. A rapidly blinking red light may signal a critical error or an emergency stop condition, prompting immediate attention from users. The light emitting assemblies 264 and/or display 300 can also be used to indicate when a component in the head portion 202a and/or neck portion 202b, such as an actuator or sensor, is malfunctioning and should be serviced. For example, a specific color or pattern may correspond to different types of malfunctions. A steady orange light could indicate a minor issue that requires maintenance but does not immediately impact the robot's performance, while a flashing red-and-white pattern could signal a major fault that requires immediate servicing. In another example, the light emitting assemblies 264 can display a particular color that corresponds with the information displayed on the display 300. If the robot 100 is running low on battery life, the light emitting assemblies 264 can display a red color while the display 300 displays a message and/or icon that indicates that the battery is low.
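The pulse, blink, and gradient behaviors described above could be generated as simple time-varying functions, as in the following sketch; the periods and colors are illustrative assumptions.

```python
# Sketch of the dynamic lighting patterns described above (pulse, blink,
# and color gradient). Periods and colors are illustrative assumptions.
import math

def pulse(t: float, period: float = 2.0) -> float:
    """Slow sinusoidal pulse: brightness 0..1 over time t (seconds)."""
    return 0.5 * (1.0 + math.sin(2.0 * math.pi * t / period))

def blink(t: float, period: float = 0.25) -> float:
    """Fast on/off blink, e.g. for a critical error."""
    return 1.0 if (t % period) < (period / 2.0) else 0.0

def gradient(frac: float, start=(0, 255, 0), end=(255, 200, 0)) -> tuple:
    """Blend from start to end color as a task nears completion (frac 0..1)."""
    return tuple(round(s + frac * (e - s)) for s, e in zip(start, end))

print(pulse(0.5), blink(0.1), gradient(0.75))
```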


The light emitting assemblies 264 may also be synchronized with audible alerts or haptic feedback mechanisms to ensure that users are promptly notified of the robot's status, even in environments where visual indicators may be less noticeable. For instance, a flashing light paired with a beeping sound can signal an urgent issue, while a soft chime can accompany a color change indicating that the robot has completed its task and is ready for the next assignment. Light emitting assemblies 264b, 264d are positioned adjacent to an oral region 366 of the head and can be operated independently of the light emitting assemblies 264a, 264c which are located above light emitting assemblies 264b, 264d and adjacent to an orbital region 368 of the head portion 202a. Alternatively or additionally, said light emitting assemblies 264 may be able to project patterns or simple icons onto nearby surfaces.
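A minimal sketch of coordinating the light emitters with an audible cue is shown below; the severity names and the injected set_lights/play_sound callables are hypothetical stand-ins for the robot's actual LED and speaker interfaces, which are not specified in this disclosure.

```python
# Sketch of synchronizing the light emitters with an audible cue; the
# hardware interfaces are hypothetical stand-ins passed in as callables.

def notify(severity: str, set_lights, play_sound):
    """Fire a coordinated light + sound notification.

    set_lights and play_sound are injected callables so the sketch stays
    hardware-agnostic (e.g. wrappers around an LED driver and a speaker,
    which are assumptions here)."""
    if severity == "urgent":
        set_lights(rgb=(255, 0, 0), pattern="flash")
        play_sound("beep", repeat=3)
    elif severity == "task_complete":
        set_lights(rgb=(0, 255, 0), pattern="solid")
        play_sound("chime", repeat=1)

notify("urgent",
       set_lights=lambda **kw: print("lights:", kw),
       play_sound=lambda name, repeat: print("sound:", name, "x", repeat))
```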


Each of the light emitting assemblies 264 in the head portion 202a includes: (i) a light source or light emitter 902, and (ii) a diffuser lens 386 extending between end walls 376, 378, a top wall 380, a bottom wall 382, and an interior wall 384. The light source 902 and the diffuser lens 386 form a unit that is inserted together into each respective light emitter housing 262 to couple the light emitting assemblies 264 to the head portion 202a. The light source or emitter 902 can include any known light emitter, including any one or more of the following: laser, LCD, LED (e.g., COB LED), OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED-backlit LCD, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, fiber optic light guides that distribute light from a central source to multiple output points on the head surface, monochrome, color, or any combination of the above, or any other known technology or light feature. It should be understood that in other embodiments, the above disclosed light sources or emitters 902 and/or additional light emitters 902 may be formed in any desirable configuration or used with any other material, structure, or component to form the desirable light emitting assemblies 264. Examples of said light emitting assemblies 264 that may be formed include fiber optic cables, electroluminescent (EL) wire, laser diodes, neon tubes, cold cathode fluorescent lamps (CCFL), plasma tubes, phosphorescent strips, UV LED strips, infrared LED arrays, light guide panels (LGP), and edge-lit light panels. The light source or light emitter 902 may be made from a single emitter or a plurality of emitters (e.g., between 2 and 1000). Said light source or light emitter 902 may be driven by an internal or external driver within another aspect of the electronics assembly 222.


Each of the light emitters 902 is positioned in a void 904 of each respective light emitter housing 262 (i.e., toward the display 300) and the diffuser lens 386 is positioned in front of the light emitter 902 to reside between a frontal extent of the light emitter 902 and an outermost edge of each respective light emitter housing 262 and/or an outermost edge 323 or surface 325 of the head 202a/shield 228. In some embodiments, the diffuser lens 386 can be omitted from the assemblies 264. Each light emitter 902 is located forward of and adjacent to the front facing edge 298 of the rear shell 234 and rearward of and adjacent to a portion of the rear facing edge 322 of the frontal shell 228 defining each recess 324. A rearmost edge of each light emitter 902 is located rearward of the entire rear facing edge 322 of the frontal shell 228. The voids 904 are located between the frontal shell 228 and the rear shell 234. In other embodiments, the light emitters 902 may not be formed in the voids 904 and instead said voids 904 may act as a reflector for light that is emitted from said light emitter or source 902. In other words, the light emitter 902 may be positioned in the first head sub-volume 236.


The recesses 324 formed in a rear edge of the frontal shell 228 each define a gap or channel 327 between the frontal shell 228 and the rear shell 234, and light emitted from the illumination assembly 223 is visible in each gap 327. An extent of the head portion 202a is provided by the peripheral projections 262. The extent is recessed relative to both: (i) a first location 333 on the outer surface 325 of the frontal shell 228 that is positioned adjacent to the gap 327, and (ii) a second location 325 on the outer surface 329 of the rear shell 234 that is positioned adjacent to said gap 327. The rear edge 322 of the frontal shell 228 does not abut the frontal edge 298 of the rear shell 234 at a location corresponding to the gaps 327. In other words, an extent of the portion of the light emitter housings 262a, 262b, 262c, 262d may be recessed relative to the outer surfaces 325, 329 of the frontal shell 228 and rear shell 234. This positional relationship may cause an extent of the head portion 202a to be positioned: (i) within the frontal shell 228 and/or the rear shell 234, and (ii) at said location to connect the frontal shell 228 to the rear shell 234. As such, the light emitting assemblies 264 may have an arc or curvilinear configuration. Light emitted from the illumination assembly 223 may obscure an extent of the head portion 202a, and may specifically obscure an extent of the head portion 202a that has an outer surface 331 that is recessed relative to the outer surfaces 325, 329 of the frontal and rear shells 228, 234.


When viewing the head portion 202a from the front as shown in FIGS. 4 and 9, it can be seen that the light emitters 264 are spaced apart from the display 300 and the electronics support 254. In some aspects, the light emitters 264 do not reside behind or overlap with the display 300 or the electronics mount 254, although in some embodiments this may occur. The lower light emitting assemblies 264b, 264d may be positioned below the display 300 and the electronics support 254. The upper light emitting assemblies 264a, 264c may flank the display and the electronics support 254 such that a horizontal plane (i.e., plane 122) extending through the upper light emitting assemblies 264a, 264c also passes through the display 300. The plane 122 may also pass through the center C of the head portion 202a, but the upper light emitting assemblies 264a, 264c may be slightly offset upward relative to the plane 122. In some cases, the upper light emitting assemblies 264a, 264c may be located below a top end of the display 300 so as to be positioned below any cameras or sensors mounted to the electronics support 254 above the display 300.


iii. Sensor Assembly


The sensor assembly 301 may include a variety of sensing devices and systems to enhance the humanoid robot 100's perception capabilities and adaptability in various environments. The sensor assembly 301 may include: (i) a vision system 301V with one or more cameras 302, 303, (ii) temperature sensors to detect ambient or object temperatures for safety and operational adjustments, (iii) pressure sensors to measure contact or surface pressures, (iv) force sensors for detecting applied forces during interactions, (v) inductive sensors for proximity and metal object detection, (vi) capacitive sensors to sense touch or proximity, (vii) any combination of these sensors, or (viii) other known sensors including ultrasonic, acoustic, or gas sensors for additional environmental monitoring.


In some implementations, the vision system 301V may include a set of upper cameras 302, wherein said set of upper cameras 302 may include three upper cameras 302a, 302b, 302c that are positioned above the display 300 and coupled to the electronics mount 254 as shown in FIGS. 3 and 4. A first camera 302a included in the plurality of upper cameras 302 may be directed in a forward direction relative to the robot 100, providing a primary field of vision for object recognition and navigation. Second and third cameras 302b, 302c included in the plurality of upper cameras 302 may be directed in opposed lateral directions relative to the robot 100 substantially perpendicular to the first camera 302a, ensuring broader peripheral coverage and spatial awareness.


In some embodiments, more than one upper camera 302 can face in the forward direction as shown in FIGS. 8-12. For example, three cameras 302a1, 302a2, 302a3 can be coupled to the electronics mount 254 and arranged to face in the forward direction of the robot 100. Such a configuration may provide depth perception for the robot utilizing overlapping fields of view as suggested in FIG. 12. Any number of cameras can be located above the display 300 and arranged in the forward direction of the robot 100, allowing for a customizable field of view based on specific task requirements. For instance, multiple forward-facing cameras may be configured to function as a stereo vision system, calculating depth and distance using disparity between images captured by each camera.
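The depth-from-disparity relationship behind the stereo arrangement described above can be sketched as follows; the focal length and baseline are assumed example values rather than dimensions from this disclosure.

```python
# Sketch of depth-from-disparity for two forward-facing cameras with
# overlapping fields of view; the focal length and baseline are assumed
# example values, not dimensions from this disclosure.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 700.0,
                         baseline_m: float = 0.06) -> float:
    """Classic pinhole stereo relation: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A feature shifted 30 px between the left and right images:
print(f"Estimated depth: {depth_from_disparity(30.0):.2f} m")  # ~1.4 m
```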


As shown in the figures, upper cameras 302a, 302b, 302c in the head 202a may have a forward FoVF or lateral facing field of view FoVL, FoVR or cone of vision of about 57.6 degrees to about 86.4 degrees, or about 71.1 degrees to about 79.2 degrees. For example, the field of view FoVF, FoVL, FoVR or cone of vision of the upper cameras 302a, 302b, 302c may be about 72 degrees. As shown in FIGS. 2-3, the fields of view of the upper cameras 302a, 302b, 302c may not overlap or may have partially overlapping fields of view directly in front of the robot 100. In alternative embodiments, the fields of view or cones of vision of the upper cameras 302a, 302b, 302c may have a minor overlap or may have a substantial overlap to improve depth perception and eliminate blind spots. Additionally, as shown in FIG. 3, the first camera 302a may have a front field of view FoVF that has a line of sight LoSF that is perpendicular to the frontal plane PF. As shown in FIG. 5B, the line of sight LoSF is not centered in the front field of view FoVF due to optics that are used in connection with the camera 302a. The second and third cameras 302b, 302c also have fields of view FoVL, FoVR with lines of sight LoSL, LoSR that are substantially perpendicular to the line of sight LoSF.
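To make the overlap discussion concrete, the sketch below checks whether a forward camera and two laterally directed cameras (lines of sight at 0 degrees and +/-90 degrees, each with the roughly 72 degree cone noted above) leave angular gaps in the horizontal plane; the specific angles follow the example values in the text, while the helper functions themselves are illustrative assumptions.

```python
# Sketch: horizontal coverage of three cameras whose lines of sight point
# forward (0 deg) and to each side (+/-90 deg), each with ~72 deg FoV.
# Angles follow the example values in the text; the gap check is generic.

def coverage_intervals(lines_of_sight_deg, fov_deg):
    half = fov_deg / 2.0
    return sorted((los - half, los + half) for los in lines_of_sight_deg)

def gaps(intervals):
    out = []
    for (a_lo, a_hi), (b_lo, b_hi) in zip(intervals, intervals[1:]):
        if b_lo > a_hi:
            out.append((a_hi, b_lo))
    return out

ivals = coverage_intervals([-90.0, 0.0, 90.0], 72.0)
print(gaps(ivals))  # [(-54.0, -36.0), (36.0, 54.0)] -> 18 deg gaps on each side
```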


To further enhance the versatility of the vision system 301V, the cameras 302 may feature adjustable fields of view achieved through various mechanisms. These mechanisms may include motorized zoom lenses capable of dynamically adjusting the focal length to focus on distant or close objects as needed, and wide-angle lenses combined with software-based digital zoom and cropping to provide both broad coverage and detailed inspection capabilities. In scenarios where extreme wide-angle views are required, fisheye lenses with integrated distortion correction algorithms may be employed to deliver a seamless image output.



FIGS. 8-12 show an alternative embodiment of a sensor assembly 301, where the three cameras that were shown in FIGS. 1-7 are replaced with three forward facing cameras 302a1, 302a2, and 302a3. These three cameras have fields of vision FoVL, FoVM, and FoVR, respectively, wherein these fields of vision overlap one another to aid in providing a 3D reconstruction of the robot's environment. The lines of sight LoSL, LoSM, and LoSR for the cameras may be parallel with one another and angled relative to the frontal plane PF, wherein said frontal plane PF is parallel with the coronal plane of the robot 100. Said angle between the lines of sight LoSL, LoSM, LoSR and the frontal plane PF may be between 91 degrees and 150 degrees, and is preferably 110 degrees.


In other embodiments, the sensor assembly 301 of the humanoid robot 100 may include additional or different cameras or sets of cameras 302. For example, said cameras 302 may alternatively or additionally include a set of lower cameras positioned below the display, oriented downward to monitor the area in front of the robot's feet. Such lower cameras may be beneficial for obstacle detection in close proximity to the robot, enabling it to navigate uneven terrain or avoid small objects on the ground. Additionally, the cameras 302 may include rear-facing camera(s) to monitor the area behind the robot 100, which may improve safety during backward movements or when the robot 100 operates in dynamic environments with multiple moving objects. In some aspects, side-mounted cameras on each side of the head may provide a full 360-degree field of vision, ensuring the robot can detect lateral movements and peripheral activities. In certain implementations, the cameras 302 may be mounted on adjustable or retractable arms or may be detachable to enable them to be reconfigured or repositioned based on specific operational requirements. These adjustable mounts may include motorized mechanisms to dynamically adjust the angle and position of each camera, allowing for on-the-fly adaptation to different tasks and environments. For example, during inspection tasks, the cameras may be reoriented to focus on specific areas of interest, while during navigation, they may return to a default position to maximize the robot's field of vision.


Additionally, the vision system 301V may incorporate advanced imaging techniques, such as multi-frame noise reduction algorithms, to improve image clarity in low-light conditions. Advanced autofocus systems, including phase-detection and contrast-detection autofocus, may be integrated to provide rapid and precise focus adjustments. For enhanced environmental adaptability, the vision system 301V may be equipped with polarization filters to reduce glare and improve visibility in reflective or water-covered environments. For reliability and redundancy, the vision system 301V may feature modular cameras 302 that can be hot-swapped, allowing for seamless replacement without disrupting the robot's operations. These units may also include self-cleaning mechanisms like hydrophobic coatings, ultrasonic vibration systems to dislodge dust, or small wipers to maintain lens clarity. Additionally, automated diagnostics systems could be integrated to monitor the health and performance of each camera 302, alerting the robot 100 to potential issues and enabling proactive maintenance. To enhance resilience in harsh environments, the cameras 302 may be housed in rugged enclosures with shock-absorbing mounts, protecting them from physical impacts and vibrations.


Although upper cameras 302 are shown as illustrative examples, other types of sensors may be utilized and mounted to the internal frame in a similar manner to achieve optimal directional alignment for various detection, sensing, or signal reception tasks. For example, the sensor assembly 301 or the vision system 301V may incorporate time-of-flight (ToF) sensors, structured light projectors paired with infrared cameras, or stereo cameras with variable baselines to enhance depth perception and generate accurate three-dimensional spatial maps. Additionally, radar and ultrasonic sensors may be integrated to provide redundant distance measurements, which can be particularly valuable in low-visibility conditions or dynamic environments. In certain embodiments, lidar sensors may be employed for precise long-range distance measurements, while thermal imaging cameras can detect heat signatures and monitor temperature variations. Multi-spectral or hyperspectral imaging systems may further improve object recognition by identifying materials based on their unique spectral characteristics, thereby enhancing the robot's ability to navigate and interpret complex environments.


Additionally or alternatively, the assembly 301, vision system 301V, and/or upper cameras 302a, 302b, 302c may include: (i) scan camera(s) for detailed inspection, (ii) monochrome camera(s) for improved low-light performance, (iii) color camera(s) for standard imaging, (iv) CMOS camera(s) for high-speed imaging, (v) CCD sensor(s) or camera(s) that include CCD sensor(s) for high-quality imaging, (vi) camera(s) or sensor(s) that have rolling shutter or global shutter for various imaging requirements, (vii) other types of 2D digital camera(s) for traditional imaging, (viii) other types of 3D digital camera(s) for depth mapping, (ix) camera(s) or sensor(s) that are capable of stereo vision, structured light, and laser triangulation for enhanced 3D imaging, (x) sonar camera(s) or ultrasonic camera(s) for proximity sensing, (xi) infrared sensor(s) and/or infrared camera(s) for low-light and heat detection, (xii) radar sensor(s) for distance measurement, (xiii) LiDAR for precise mapping, (xiv) other structured light sensors, camera(s), or technologies for advanced imaging, (xv) dot projecting camera(s) or sensor(s) for depth sensing, or (xvi) any combination of the above or any other known camera or sensor. In one embodiment, the camera 302 may have a megapixel resolution of between 0.4 MP and 20 MP, may record video at 5.6 FPS to 286 FPS, may have a CMOS sensor, may have a pixel size ranging from 2.4 μm to 6.9 μm, may utilize STARVIS rolling shutter technology, can operate in 55-degree Celsius ambient air temperatures, and may have any other properties, technologies, or features that are discussed within U.S. Pat. Nos. 11,402,726, 11,599,009, 11,333,954, or 11,600,010, all of which are incorporated herein by reference. It should be understood that the cameras are typically configured as video cameras but may have an alternative configuration, such as an image camera or multi-functional camera capable of capturing both still images and video footage.


4. Neck Portion

As shown in at least FIG. 3, the neck portion 202b includes a deformable cover and member 230 that is designed to extend from an upper portion of the torso 204 to a lower portion of the head portion 202a. In particular, the deformable cover and member 230 is configured to wrap around at least an edge portion of the rear shell 234. In doing so, the deformable cover and member 230 obscures the actuators and other electronics contained in the neck.


5. Distances, Angles, Radii, and Arcs

The following tables include non-limiting examples of distances, angles, radii, and arcs. Additionally, while the entire figure set is not to scale, it should be understood that the components contained within each Figure are generally to scale, and as such comparisons, ratios, and/or other information can be derived from the Figures and even supplemented by the information contained in the below tables.
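By way of a hedged, non-limiting illustration of how the tabulated ranges below might be consumed programmatically (for example, during an automated inspection or design-rule check), the following sketch encodes a few rows of TABLE 1 and classifies a measured dimension against its broad and preferred bounds; the inspection use case and function names are assumptions, and only a subset of the table is transcribed.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DimensionRange:
    lower: float        # broad lower bound (mm or degrees)
    upper: float        # broad upper bound
    pref_lower: float   # preferred lower bound
    pref_upper: float   # preferred upper bound

# A few rows of TABLE 1 (distances in mm), transcribed for illustration only.
TABLE_1 = {
    "D1": DimensionRange(61.7, 92.6, 69.5, 84.9),
    "D2": DimensionRange(113.6, 170.4, 127.8, 156.2),
    "D10": DimensionRange(34.2, 51.2, 38.4, 47.0),
}

def classify_measurement(name: str, value: float) -> str:
    """Return 'preferred', 'acceptable', or 'out of range' for a measurement."""
    r = TABLE_1[name]
    if r.pref_lower <= value <= r.pref_upper:
        return "preferred"
    if r.lower <= value <= r.upper:
        return "acceptable"
    return "out of range"

if __name__ == "__main__":
    print(classify_measurement("D1", 75.0))   # preferred
    print(classify_measurement("D10", 50.0))  # acceptable
    print(classify_measurement("D2", 200.0))  # out of range
```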













TABLE 1

Distance (mm)   Lower Bound   Upper Bound   Preferred Lower Bound   Preferred Upper Bound
D1              61.7          92.6          69.5                    84.9
D2              113.6         170.4         127.8                   156.2
D3              80.6          120.9         90.7                    110.8
D4              177.2         265.8         199.4                   243.7
D5              61.4          92.1          69.1                    84.4
D6              81.5          122.3         91.7                    112.1
D7              80.5          120.8         90.6                    110.7
D8              76.1          114.2         85.6                    104.7
D9              80.6          120.9         90.7                    110.9
D10             34.2          51.2          38.4                    47.0
D11             61.7          92.6          69.4                    84.8
D12             137.6         206.4         154.8                   189.2
D13             139.1         208.6         156.5                   191.2
D14             20.1          30.2          22.6                    27.7
D15             67.2          100.8         75.6                    92.4
D16             48.0          72.0          54.0                    66.0
D17             28.8          43.2          32.4                    39.6
D18             136.3         204.5         153.3                   187.4
D19             13.6          20.4          15.3                    18.7
D20             137.0         205.5         154.1                   188.3
D21             20.8          31.2          23.4                    28.6
D22             93.3          140.0         105.0                   128.3
D23             173.6         260.4         195.3                   238.7
D24             166.2         249.4         187.0                   228.6


TABLE 2

Angle (Degrees)   Lower Bound   Upper Bound   Preferred Lower Bound   Preferred Upper Bound
A1                5.7           8.6           6.4                     7.9
A2                55.1          82.6          62.0                    75.8
A3                54.8          82.2          61.6                    75.3


TABLE 3

Radius (mm)   Lower Bound   Upper Bound   Preferred Lower Bound   Preferred Upper Bound
R1            99.7          149.6         112.2                   137.1
R2            46.0          69.0          51.7                    63.2
R3            63.7          95.5          71.6                    87.6
R4            66.6          99.9          74.9                    91.6
R5            25.7          38.5          28.9                    35.3
R6            62.7          94.0          70.5                    86.2
R7            122.4         183.6         137.7                   168.3
R8            101.8         152.8         114.6                   140.0
R9            75.2          112.8         84.6                    103.4
R10           127.7         191.6         143.7                   175.6
R11           79.3          118.9         89.2                    109.0
R12           69.0          103.5         77.6                    94.8


TABLE 4

Arc (Degrees)   Lower Bound   Upper Bound   Preferred Lower Bound   Preferred Upper Bound
Arc1            119.7         179.5         134.6                   164.5
Arc2            115.1         172.6         129.5                   158.2
Arc3            162.5         243.8         182.8                   223.5
Arc4            110.8         166.2         124.6                   152.3
Arc5            117.1         175.6         131.7                   161.0
Arc6            65.8          98.7          74.1                    90.5
Arc7            39.6          59.5          44.6                    54.5
Arc8            102.4         153.6         115.2                   140.8
Arc9            88.0          131.9         99.0                    120.9


D. Alternative Embodiments of the Head and Neck Assembly


FIGS. 16-65C show thirteen alternative embodiments of the head and neck assembly that may be used interchangeably in connection with the different embodiments of the humanoid robots shown in FIGS. 1, 17, and 22. Each of the alternative embodiments has slightly different structures and configurations, but each embodiment is similar in structure, features, and generally tapered shape. As discussed below, the features and/or components of each embodiment may be interchanged with, added to, or removed from the features and/or components of the other embodiments. For example, a lower recess 3310 that is formed in the chin or mental region 3355 of the third embodiment (e.g., FIGS. 22-49) may be removed, similar to how the lower recess was removed from the fourth, fifth, and sixth embodiments (e.g., FIGS. 50-52). Additionally, the deformable material and the actuators used in the neck assembly of other embodiments, which allow the head to move, may be added to the head and neck assembly 202 of the first embodiment. It should be understood that these are only examples and any feature and/or component may be interchanged with, added to, or removed from any other embodiment disclosed herein.


1. Second Embodiment

Similar to the head and neck assembly 202 described above in connection with FIGS. 1-15B, FIGS. 16-21 illustrate a second embodiment of a head and neck assembly 2202. For the sake of brevity, the above disclosure in connection with the head and neck assembly 202 will not be repeated below, but it should be understood that like numerals represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 2300. Further, it should be understood that the functionality and operation of the head and neck assembly 2202 is similar or identical to the features and functionality disclosed with respect to the head and neck assembly 202. It should be understood that any one or more features of the head and neck assembly 202 may be used in combination with those disclosed with respect to the head and neck assembly 2202, and that any one or more features of the head and neck assembly 2202 may be used in combination with those disclosed with respect to the head and neck assembly 202. The primary differences between head and neck assembly 202 and head and neck assembly 2202 include: (i) the housing assembly 2220 further includes front and rear sensor recesses 2272, 2277 and an upper shell 2270, (ii) a frontal shell 2228 having a different shape compared to frontal shell 228, and (iii) additional electronic devices included in an electronics assembly 2222 compared to electronic assembly 222.


Similar to the first embodiment and as shown in FIGS. 16-21, the housing assembly 2220 is configured to hold the display 2300 and at least one sensor, such as a plurality of upper cameras 2302, of the electronics assembly 2222 mounted on an electronics support 2254 and aligned with the sensor recess 2272. The vision system 2301V, and as such the plurality of upper cameras 2302, includes: (i) a first camera with a front left field of view FoVL and a line of sight LoSL, (ii) a second camera with a front middle field of view FoVM and a line of sight LoSM, and (iii) a third camera with a front right field of view FoVR and a line of sight LoSR. The lines of sight are parallel with one another and perpendicular to the frontal plane PF in a first direction, while being slightly angled relative to the frontal plane PF in a second direction. Finally, the vision system 2301V, and as such the plurality of upper cameras 2302, also includes a rear camera that has a field of view FoVRR and a line of sight LoSRR, wherein said line of sight LoSRR is perpendicular to the rear plane PR in both directions.


As shown in these Figures, the frontal shell 2228 is configured to cover a front portion of the electronics assembly 2222 to locate at least a portion of the electronics assembly 2222 between a front cover 2252 and the frontal shell 2228. However, unlike the first embodiment, the housing assembly 2220 further includes the upper shell 2270. The inclusion of this upper shell 2270 was primarily caused by the modification of the housing assembly 2220 to include front and rear sensor recesses 2272, 2277 that are positioned between the front and rear of the housing assembly 2220. The frontal sensor recess 2272 provides an inset region of the head portion 2202a, which beneficially minimizes potential distortion that may be caused by the curvilinear design of the frontal shell in the first embodiment. Thus, the vision system 2301V can forgo using complex algorithms to remove or attempt to remove the distortion caused by said curvilinear shield.


The frontal sensor recess 2272 is positioned above the display and is designed to provide an inset region relative to the frontal shell 2228, thereby enabling secure integration and optimal functionality of the embedded sensors. Said frontal sensor recess 2272 includes a shelf 2284, overhang 2285, planar sensor cover 2286, and side walls 2287 and 2288, as illustrated in FIGS. 17 and 18. The sensor recess 2272: (i) is positioned above a substantial extent of the frontal shell 2228 and below the upper shell 2270, and (ii) forms a cavity that provides an unobstructed field of view for its upper cameras 2302 or other sensors. The shelf 2284 extends forward and downward from the planar sensor cover 2286 at a predefined angle, wherein said angle has been determined to avoid interference with the operational range of the sensors. This design of the shelf 2284 causes it to provide a lower bound of the above described cavity. It should be understood that in other embodiments the shelf 2284 may not be angled and instead may be perpendicular, or substantially perpendicular, to the planar sensor cover 2286.


The overhang 2285, formed by the forward projection of the upper shell 2270 beyond the planar sensor cover 2286, serves both functional and protective roles and provides an upper bound of the above described cavity. Said overhang 2285 may act as a shield against environmental factors such as direct sunlight, precipitation, or debris, thereby enhancing the durability and reliability of the sensors (e.g., cameras). Additionally, the contoured surface connecting the shelf 2284 to the overhang ensures a seamless transition, reducing sharp edges or discontinuities that could compromise structural integrity or aesthetic appearance.


The recess 2272 is designed to ensure that the embedded sensors (e.g., cameras) remain optimally aligned within the recess 2272 and with the planar sensor cover 2286. The planar sensor cover 2286 serves as an optical window for the sensors (e.g., cameras) and is engineered to minimize distortion or interference. For example, the cover 2286 may be constructed from optical-grade transparent material if it houses visual sensors, ensuring minimal attenuation or aberration. The side walls 2287 and 2288, which enclose the recess 2272 and therefore provide the lateral bounds for the above described cavity, provide structural rigidity and additional protection against lateral impacts. The curvilinear border 2278, which extends from the edge of the shelf 2284 to the upper shell 2270, enhances the visual and functional integration of the recess into the housing assembly 2220. This border not only defines the spatial limits of the recess but also contributes to the overall structural cohesiveness, distributing mechanical stresses evenly across the shell. Furthermore, the continuous surface of the upper shell 2270, extending seamlessly from its rear edge to the sensor recess, ensures aerodynamic and structural consistency, making the design suitable for high-speed applications.


The structural relationship between the sensor recess 2272 and the frontal shell 2228 is designed to achieve both physical robustness and functional harmony. The continuous surface of the upper shell 2270, extending from its rear edge to the sensor recess, creates a unified framework that enhances the structural rigidity of the assembly. This integration ensures that mechanical stresses, such as those resulting from impacts or vibrations, are distributed across the entire housing assembly, minimizing localized deformation or failure. Additionally, the shelf 2284 is configured to interlock with the frontal shell 2228 through complementary geometries, such as grooves or notches, ensuring precise alignment and a secure fit. This interlocking mechanism enhances the structural bond between the two components, preventing dislodgement during operation. The overhang 2285 further strengthens this relationship by providing a protective canopy that shields the junction between the recess and the frontal shell, mitigating environmental exposure and mechanical wear.


The sensor recess 2272, along with its associated components, is typically fabricated using high-strength, lightweight materials that meet the dual requirements of mechanical durability and environmental resilience. These materials may include advanced polymers, such as polycarbonate or acrylonitrile butadiene styrene (ABS), or lightweight metal alloys like aluminum or titanium, depending on the application. For applications demanding enhanced electromagnetic shielding, the sensor recess 2272 and surrounding structures may be coated with conductive materials or integrated with Faraday cage-like features to minimize interference. The manufacturing process may involve precision injection molding for polymer components or CNC machining for metallic elements to achieve the tight tolerances necessary for aligning the recess with the sensor cover 2286 and ensuring compatibility with the broader housing assembly 2220. Surface treatments such as anodizing, powder coating, or weather-resistant coatings may be applied to enhance longevity and resistance to wear and environmental exposure. In cases where optical sensors are employed, the planar sensor cover 2286 may be fabricated from optical-grade materials with anti-reflective coatings, ensuring clear transmission and minimal distortion of incoming signals.


The sensor cover 2286 is positioned between the overhang 2285 and the shelf 2284 and is substantially parallel with lenses of the cameras 2302 located behind the sensor cover 2286. The sensor cover 2286 is made of a material that does not obscure a signal detected by the sensor(s). For example, the sensor cover 2286 may be a planar cover made of a transparent material that allows the upper cameras 2302 to receive images, preferably undistorted images. Additionally and/or alternatively, the sensor cover 2286 may have openings formed therein for receiving an extent of a sensor (e.g., camera lens). In further embodiments, the upper sensor recess 2272 may be omitted and the frontal shell 2228 may include openings formed therein for receiving an extent of a sensor (e.g., camera lens). The sensor cover 2286 can include one or more sensor openings 2304 set back or recessed from the front of the housing assembly 2220. The sensor openings 2304 are positioned to correspond with the upper camera(s) 2302 of the electronics assembly 2222. The sensor openings 2304 are partially protected by an overhang of the upper shell 2270 that protrudes over the sensor openings 2304.


To further ensure that the frontal shell 2228 does not distort the field of view of the vision system 2301V, said frontal shell 2228 includes a main body 2228a, two wing-like projections 2228b, 2228c that extend upward from the main body 2228a, and a central notch 2274 that is positioned between the two wing-like projections 2228b, 2228c. As such, the frontal shell 2228 is designed to at least partially surround (e.g., on three sides) and/or conform to the sensor recess 2272. Surrounding or conforming to the sensor recess 2272 is primarily achieved by allowing the wing-like projections 2228b, 2228c to flank the upper sensor recess 2272 and have extents positioned adjacent to the curvilinear border 2278 of the upper sensor recess 2272. Additionally, viewed from the side, the frontal shell 2228 may have a rearwardly sloping substantially linear edge 2276 with a forward angle (e.g., extending rearward from horizontal) between 90 degrees and 140 degrees, preferably 110 degrees from horizontal, when the robot is in a normal vertical position (i.e., the sagittal and coronal planes of the head are aligned with the sagittal and coronal planes of the robot).


Similar to the first embodiment, the frontal shell 2228 is shaped to resemble the form of the head 2202a providing a substantially continuous surface between the upper shell 2270 and sensor recess 2272 to the rear shell 2234. The curvature of the frontal shell 2228 may vary and have different curvatures (i.e. radii and arcs) at different positions along the frontal shell 2228. The frontal shell 2228 may include light recesses 2324 to conform with the shape of light emitter housings 2262a, 2262b, 2262c, 2262d. As shown in FIGS. 16-21, the frontal shell 2228 may also include a curved upper surface or rim 2292 defining the slot 2274 and having a shape that substantially matches the curvilinear border 2278. The curved upper surface or rim 2292 of the frontal shell 2228 serves multiple purposes in the design of the humanoid robot's head. By defining the slot 2274 and matching the shape of the curvilinear border 2278, it creates a seamless transition between the main body 2228a and the wing-like projections 2228b, 2228c. This design element not only enhances the aesthetic appeal of the robot's head but also contributes to its functionality. The curved rim 2292 helps to channel any potential debris or moisture away from the sensor recess 2272, providing an additional layer of protection for the sensitive components housed within. Its shape also aids in reducing wind resistance and minimizing turbulence around the head during movement, which could potentially interfere with sensor readings or cause unwanted vibrations. Furthermore, the curvature of the rim 2292 complements the overall ergonomic design of the head, creating a more natural and less intimidating appearance for human interaction.


The rear sensor recess 2277 is a notable addition in the second embodiment, providing enhanced functionality to the head design. This recess may act as either a viewport for a rear-facing camera 2305 or as a ventilation system. As a viewport, it allows the robot to have visual awareness of its surroundings behind it, potentially improving navigation and obstacle avoidance capabilities. This feature could be particularly useful in crowded or dynamic environments where threats or objects of interest may approach from behind. When utilized as a vent, the rear sensor recess 2277 can aid in thermal management of the head's internal components. By allowing airflow in and out of the head 2202a, it facilitates cooling of the electronics assembly 2222. This cooling can be passive, relying on natural convection, or active, utilizing one or more internal fans to force air circulation. Effective thermal management is essential for maintaining optimal performance and longevity of the electronic components, especially in scenarios where the robot may be operating for extended periods or in high-temperature environments.
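Where the rear sensor recess 2277 is used as a vent with active cooling, a simple control loop such as the following sketch could modulate fan speed from a measured electronics-bay temperature; the temperature thresholds and the linear duty-cycle mapping are illustrative assumptions rather than disclosed parameters.

```python
def fan_duty_cycle(temp_c: float,
                   idle_below_c: float = 40.0,
                   full_speed_at_c: float = 70.0) -> float:
    """Map an electronics-bay temperature to a fan duty cycle in [0.0, 1.0].

    Below `idle_below_c` the fan stays off (passive convection through the
    recess is assumed sufficient); above `full_speed_at_c` it runs at 100%;
    in between, the duty cycle ramps linearly.
    """
    if temp_c <= idle_below_c:
        return 0.0
    if temp_c >= full_speed_at_c:
        return 1.0
    return (temp_c - idle_below_c) / (full_speed_at_c - idle_below_c)

# Example: at 55 degrees Celsius the fan would run at half speed.
if __name__ == "__main__":
    print(fan_duty_cycle(55.0))  # 0.5
```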


The structural design of the rear sensor recess 2277 mirrors, or substantially mirrors, that of the front sensor recess 2272, featuring a shelf 2294, an overhang 2295, a sensor cover 2296, and a pair of side walls or edges 2297, 2298. This symmetry in design not only provides a cohesive aesthetic but also ensures consistent protection and functionality for both front and rear sensors. The shelf 2294 not only deflects debris and liquids, but may also reduce glare and unwanted light reflections that could interfere with camera functionality or sensor readings. This design element is particularly beneficial in outdoor or brightly lit environments where light interference could compromise the accuracy of visual data collection or other sensor measurements. Additionally, the shelf 2294 may act as a natural heat sink, helping to dissipate thermal energy generated by the sensors or other electronic components housed within the recess. The sensor cover 2296 that is positioned within the rear sensor recess 2277 may feature anti-reflective coatings or polarizing filters to enhance image quality. If used for ventilation, the cover 2296 could be designed with hydrophobic and oleophobic properties to repel water and oil, maintaining clear airflow even in challenging environments. Also, beyond reducing glare and/or aiding in channeling airflow into the housing, the side walls or edges 2297, 2298 may also provide electromagnetic shielding and/or incorporate vibration-dampening materials.


2. Third Embodiment

Similar to the head and neck assemblies 202, 2202 described above in connection with FIGS. 1-21, FIGS. 22-49 illustrate a third embodiment of a head and neck assembly 3202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202 will not be repeated below, but it should be understood that like numerals represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 3300. Further, it should be understood that the functionality and operation of the head and neck assembly 3202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202. It should be understood that any one or more features of the head and neck assemblies 202, 2202 may be used in combination with those disclosed with respect to the head and neck assembly 3202, and that any one or more features of the head and neck assembly 3202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202. The primary differences between head and neck assembly 2202 and head and neck assembly 3202 are that the head 3202a does not include the rear sensor recess 2277, includes a lower sensor cover 3310, includes a deformable neck shell 3230, and includes an electronics assembly 3222 with additional devices and components used in the operation of the robot.


i. Upper Shell


The head 3202a includes an upper shell 3270 having a recessed sensor region 3272 at a front end of the upper shell similarly to the housing assembly 2220 of the second embodiment. However, a rear end of the upper shell 3270 is substantially flush with the rear shell 3234 such that there is no recessed region in this area. In some examples, the upper shell 3270 may further include a recessed area 3280. The recessed area 3280 may be configured to hold a top shell 3282 included in the upper shell 3270 and having a shape conforming to the shape of the recessed area 3280.


ii. Neck Shell


The disclosed head and neck assembly 3202 may include one or more actuators 140, 802 that allow the head to: (i) twist or rotate, and (ii) tilt or change the pitch. Unlike conventional robots, the actuators are hidden underneath a deformable neck shield 3230. Movement of the actuators causes the deformable neck shell 3230 to deform and accommodate such movements. The deformable neck shield 3230 is designed to extend to the jaw line of the head enclosure and into a rear extent of the head enclosure and does not extend into the side regions of the head. This configuration ensures that the neck shield 3230 is sufficiently attached to the head, but minimizes the head's surface area covered by the deformable neck shield 3230. Minimizing the coverage of the deformable neck shield 3230 in the side regions of the head allows for the inclusion of more durable materials in these regions without using overlapping materials. This is beneficial over conventional robot heads because it reduces materials and/or increases the lateral protection for the electronics contained within the head.


The deformable neck shield 3230 may be constructed from a variety of flexible and durable materials to accommodate the dynamic movements of the humanoid robot's head 3202a (e.g., twisting and pitching of the head), while maintaining a sleek and functional appearance (e.g., not pinching or bunching). Suitable materials for the deformable neck shield 3230 may include stretchable fabrics such as spandex, neoprene, or polyester blends, which offer elasticity and resilience. Alternatively, deformable plastics, such as thermoplastic elastomers (TPE), silicone, or polyurethane, may be employed to provide both flexibility and durability. The neck shell 3230 may feature a multi-layered construction, where an inner layer provides comfort and protection to internal components, while an outer layer enhances the aesthetic appeal and protects against environmental factors such as dust, moisture, and UV radiation. In some embodiments, the deformable neck shield 3230 may incorporate reinforced sections or embedded support structures to ensure durability in high-stress areas, such as at the base or connection points. These reinforcements could be achieved through the integration of flexible mesh fabrics, carbon fiber inserts, or Kevlar-like materials to prevent wear and tear over prolonged use. Additionally, the deformable neck shield 3230 may include a memory fabric or shape-retentive polymer that helps it return to its original state after deformation, ensuring consistent performance and appearance.


To further enhance its functionality, the neck shell 3230 may be designed with a segmented or accordion-like structure, allowing for smoother and more controlled movements in all directions without creating folds or creases that could interfere with the robot's appearance or performance. The design may also include strategically placed ventilation holes or breathable sections to prevent heat buildup within the neck area, particularly when the robot operates for extended periods. Variations of the neck shell 3230 may also include customizable surface finishes, such as matte, glossy, or textured coatings, to align with the robot's intended use case or aesthetic requirements. In some cases, the neck shell 3230 may be treated with hydrophobic or anti-static coatings to improve its resistance to environmental contaminants. Additionally, the neck shell 3230 may incorporate sensors, such as strain gauges or pressure sensors, to monitor the stress and strain experienced during head movements, providing feedback to the robot's control system for more precise motor adjustments. In certain implementations, the neck shell 3230 may also be modular or interchangeable, allowing for easy replacement or customization. For example, different neck shell designs could be used depending on the environment in which the robot operates—a rugged, weather-resistant version for outdoor use, or a sleek, soft-touch version for indoor customer interactions. The modular design could also facilitate quick repairs and maintenance, ensuring minimal downtime for the robot.
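As a hedged sketch of how the strain or pressure feedback mentioned above could inform the robot's motor control, the following example scales back a commanded head velocity when the neck shell approaches an assumed deformation limit; the normalized gauge values, threshold, and back-off rule are hypothetical and not part of the disclosure.

```python
def scale_head_velocity(commanded_velocity: float,
                        strain_readings: list[float],
                        strain_limit: float = 0.8) -> float:
    """Reduce a commanded head velocity when the neck shell is near its
    assumed deformation limit.

    `strain_readings` are normalized gauge values in [0, 1]; when the largest
    reading exceeds `strain_limit`, the commanded velocity is scaled down
    proportionally so the deformable shell is not over-stretched.
    """
    peak = max(strain_readings) if strain_readings else 0.0
    if peak <= strain_limit:
        return commanded_velocity
    # Linear back-off: at peak == 1.0 the head motion is stopped entirely.
    remaining = max(0.0, 1.0 - peak) / (1.0 - strain_limit)
    return commanded_velocity * remaining

if __name__ == "__main__":
    print(scale_head_velocity(0.6, [0.3, 0.5, 0.7]))  # 0.6 (under the limit)
    print(scale_head_velocity(0.6, [0.3, 0.9, 0.7]))  # 0.3 (scaled back)
```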


iv. Electronics Support Frame


The frontal shell 3228 and the display 3300 are mounted to an electronics support frame or shielded portion 3288, wherein said electronics support frame 3288 is essentially a combination of the intermediate cover and electronics support of the first embodiment. The shielded portion 3288 may include a substantial surface that extends from a curvilinear border 3290 that surrounds an extent of the upper shell 3270 and sensor recess 3272 to a rim 3292. The shielded portion 3288 may include a display opening 3294 positioned to receive the display 3300. The display 3300 may be rectangular with a curvature. The shielded portion 3288 may be shaped with contours around the display opening 3294 to receive the curved shape of the display 3300 without obstructing the view. The shielded portion 3288 may also have a taper and/or include additional contours between the display opening 3294 and the rim 3292. The rim 3292 may include lighting recesses or light emitter housings 3262a, 3262b, 3262c, 3262d formed within the rim 3292 to receive light emitting assemblies 3264a, 3264b, 3264c, 3264d of the electronics assembly 3222. Although the illustrative embodiment shows the frontal shell 3228 sized to fit within the shielded portion 3288, the frontal shell 3228 may occupy any portion or ratio of the robot's head, may have any configuration, or, in some embodiments, may be omitted entirely.


As shown in at least FIGS. 28-30, the components of the electronics assembly 3222 may be mounted to an internal mounting frame 3806 configured to arrange the individual items in a space-saving manner. The internal mounting frame 3806 includes at least an actuator mount 3808, an upper sensor mount 3810, and a lower sensor mount 3812. The actuator mount 3808 couples to the housing of the head nod actuator assembly 3140 and is configured such that the entire internal mounting frame 3806 and attached components move correspondingly with the position of the actuator. As noted above, the housing 3220 is configured to enclose the electronics assembly 3222 without interfering with the transmission or reception of signals. For example, the housing 3220 does not obscure the line of sight or field of view of the sensors.


v. Sensor Assembly and Lower Recess


The sensor assembly 3700 may include a variety of sensing devices and systems to enhance the humanoid robot 100's perception capabilities and adaptability in various environments. The sensor assembly 3700 may include: (i) a vision system 3301V with one or more cameras 3302a, 3302b, 3304a, 3304b, (ii) temperature sensors to detect ambient or object temperatures for safety and operational adjustments, (iii) pressure sensors to measure contact or surface pressures, (iv) force sensors for detecting applied forces during interactions, (v) inductive sensors for proximity and metal object detection, (vi) capacitive sensors to sense touch or proximity, (vii) any combination of these sensors, or (viii) other known sensors including ultrasonic, acoustic, or gas sensors for additional environmental monitoring. As shown in FIGS. 33, 34, and 56, the sensor assembly 3700, and specifically the vision system 3301V, may include a set of upper cameras 3302a, 3302b and a set of lower cameras 3304a, 3304b coupled to the internal mounting frame 3806 at respective mounting positions. For example, two upper cameras 3302a, 3302b may be positioned above the display/shield 3300 and directed forward, and two lower cameras 3304a, 3304b may be positioned below the display 3300 and at a downward and forward angle. As shown in FIGS. 28 and 29, the upper cameras 3302 and the lower cameras 3304 may be positioned at different angles to provide a fuller field of view.


As shown in the Figures, the lower cameras 3304 may have a generally downward facing field of view, which allows the robot 100 to gain awareness of its immediate surroundings on the ground. This allows for detection of obstacles, uneven terrain, or other potential hazards that could impede the robot's movement or cause instability. Said field of view is about 57 degrees to about 86 degrees, preferably about 71 degrees to about 79 degrees. For example, the maximum field of view of the lower cameras 3304 may be about 72 degrees. Further, when the robot 100 is forward facing in an initial position, the torso 16 interferes with a portion of the field of view of the lower cameras 3304. In this position, the lower cameras 3304 have a downward facing field of view FoVLB, FoVRB in front of the robot 100 of about 33 degrees to about 49 degrees, preferably about 40 degrees to about 45 degrees. For example, the field of view FoVLB, FoVRB of the lower cameras 3304 may be about 41 degrees. Also, as shown in FIG. 28, the line of sight LoSLB, LoSRB of the bottom cameras 3304a, 3304b is angled relative to a frontal plane PF and is angled relative to the line of sight LoSLT, LoSRT of the top cameras 3302a, 3302b. The bottom angle that is formed between the bottom line of sight LoSLB, LoSRB and the frontal plane PF may be any value between 91 degrees and 180 degrees, preferably between 110 degrees and 150 degrees, and most preferably 130 degrees. The LoS angle that is formed between the top and bottom lines of sight LoSLB, LoSRB, LoSLT, LoSRT may be equal to the bottom angle, and thus, the top line of sight LoSLT, LoSRT would be perpendicular to the frontal plane PF. In other embodiments, the top line of sight LoSLT, LoSRT may not be perpendicular to the frontal plane PF; thus, the LoS angle would not be equal to the bottom angle.


As can be understood, the field of view of the lower cameras 3304 may vary due to the relative position of the head 10 and torso 16. For example, when the head 10 is tilted backward, the field of view of the lower cameras 3304 may increase. When the head 10 is tilted forward, the field of view of the lower cameras 3304 may decrease. When the head 10 is twisted to the left or right, the field of view of the lower cameras 3304 changes relative to any interference of the torso 16 or other components of the robot 1. Likewise, the angle between the bottom line of sight LoSLB, LoSRB and the frontal plane PF will change depending on the position of the head. Additionally, it should be understood that the lower cameras disclosed herein may have or include any properties, components, or elements that are described in other sections herein. For example, most, if not all, of the disclosure contained above in connection with sensor assembly 301, vision system 301V, or upper cameras 302a, 302b, 302c applies in equal force here.
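As a non-limiting illustration of the geometry described above, the following sketch estimates how the lower cameras' line-of-sight angle and unobstructed downward field of view might vary with head pitch, under the simplifying assumptions that the frontal plane PF stays fixed to the torso, that head pitch adds directly to the neutral 130-degree line-of-sight angle, and that the torso occludes about 31 degrees of the 72-degree field of view at the neutral position (consistent with the approximately 41-degree example above); the sign convention and function names are assumptions.

```python
def lower_camera_los_angle(neutral_angle_deg: float = 130.0,
                           head_pitch_deg: float = 0.0) -> float:
    """Estimate the angle between the lower cameras' line of sight and the
    torso-fixed frontal plane PF when the head is pitched.

    Sign convention (an assumption for this sketch): positive pitch tilts the
    head backward, rotating the downward-facing line of sight away from the
    ground and increasing the angle to the frontal plane.
    """
    return neutral_angle_deg + head_pitch_deg

def ground_fov_estimate(full_fov_deg: float = 72.0,
                        torso_blocked_deg: float = 31.0,
                        head_pitch_deg: float = 0.0) -> float:
    """Rough estimate of the unobstructed downward field of view in front of
    the robot: the camera's full field of view minus the portion occluded by
    the torso, which is assumed to shrink as the head pitches backward."""
    blocked = max(0.0, torso_blocked_deg - head_pitch_deg)
    return max(0.0, full_fov_deg - blocked)

if __name__ == "__main__":
    print(lower_camera_los_angle())                  # 130.0 at the neutral position
    print(ground_fov_estimate())                     # 41.0, matching the neutral example
    print(ground_fov_estimate(head_pitch_deg=10.0))  # 51.0 with the head tilted back
```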


In addition, the head 3202a further includes a lower recess 3309 formed in a chin region 3355 of the head 3202a, which may include a lower sensor cover 3310 that is positioned below the display 3300 and is angled downward. As shown in the Figures, the lower cameras 3304 are pointed toward said lower sensor cover 3310. The chin region 3355 projects outward away from the neck 3202b to provide a lower surface 3311 facing toward ground G. The lower sensor cover 3310 is coupled to the lower surface and faces toward the ground. The lower sensor cover 3310 is shaped to couple with the head 3202a in a position such that a sensor opening 3312 included in the lower sensor cover 3310 corresponds with a sensor, such as the lower camera(s) 3304 of the electronics assembly 3222. The transparent material used for the lower sensor cover 3310 is crucial for maintaining sensor accuracy and functionality. By allowing undistorted transmission of signals or images, it preserves the fidelity of data collected by the lower cameras 3304 or other sensors that may be housed in this area. This could include not just visual data, but potentially other sensor types like infrared for heat detection or ultrasonic for precise distance measurements to the ground. For example, the lower sensor cover 3310 may be a planar cover made of a transparent material that allows the lower cameras 3304 to receive images, preferably undistorted images. Additionally or alternatively, the lower sensor cover 3310 may be enclosed by the frontal shell 3228 and/or the neck shell 3230.


vi. Display


As best shown in FIGS. 28 and 29, the display 3300 of the electronics assembly 3222 may be mounted to the internal mounting frame 3806 and positioned between the upper and lower cameras 3302, 3304. The display 3300 is operatively connected to at least one processor to generate and display status messages and other information on the display 3300. As described above in connection with the first embodiment, the display 3300 may display information: (i) related to the robot's state (e.g., working, error, moving, etc.), (ii) obtained from sensors contained within the head assembly, or (iii) received from other processors in communication with the display 3300 (e.g., other internal processors housed within the robot or external information transmitted and received by the robot). Said information may be displayed in the format of blocks, well-known shapes, logos, or other moving items (e.g., thought bubbles). Additionally, it should be understood that most, if not all, of the disclosure in connection with display 300 applies in equal force to display 3300 and as such will not be repeated herein.
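By way of illustration only, the following sketch shows one way the display content could be selected from the robot's state as described above; the state names, icons, and message strings are hypothetical and not part of the disclosure.

```python
from typing import Optional

# Hypothetical mapping from robot state to what the head display renders.
DISPLAY_CONTENT = {
    "working": {"icon": "gear", "text": "Task in progress"},
    "moving": {"icon": "arrows", "text": "In transit"},
    "error": {"icon": "warning", "text": "Attention required"},
    "idle": {"icon": "pause", "text": "Standing by"},
}

def render_payload(state: str, detail: Optional[str] = None) -> dict:
    """Build the payload handed to the display driver for a given robot state.

    Unknown states fall back to the idle screen; an optional detail string
    (for example, a sensor reading or error description) is appended.
    """
    content = dict(DISPLAY_CONTENT.get(state, DISPLAY_CONTENT["idle"]))
    if detail:
        content["text"] = f"{content['text']}: {detail}"
    return content

if __name__ == "__main__":
    print(render_payload("error", "left wrist actuator over temperature"))
```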


vii. Illumination Assemblies


As best shown in FIGS. 44-49, each of the light emitter housings 3262 of the illumination assembly 3223 has a generally trapezoidal shape when viewed from the side. Each of the diffusers 3386 has a corresponding shape and is configured to reside within a slot 3388 formed in each respective light emitter housing 3262 between the top wall 3380 and the bottom wall 3382. Each of the light emitters 3264 includes a cover 3900 coupled to the diffuser 3386, a light source 3902 oriented to face toward the cover 3900 and the diffuser 3386, and a back wall 3904 coupled to the light source 3902 and mounted with the light emitter housing 3262. The cover 3900 and the back wall 3904 form a protective covering that prevents debris and other foreign objects from reaching the light source 3902. The light source 3902 is illustratively a light emitting diode, but can include any suitable light source such as an incandescent source, a fluorescent source, a halogen source, etc. The back wall 3904 has a greater width than the cover 3900 to project outwardly beyond the cover and the light source 3902 for attachment to the light emitter housing 3262. Additionally, it should be understood that most, if not all, of the disclosure in connection with the illumination assembly 223 from the first embodiment applies in equal force to the illumination assembly 3223 and as such will not be repeated herein.


viii. Other Electronic Components


As described above, the electronic components of the head may also include a directional microphone 3814, speaker 3816, antennas 3818, indicator lights 3264, as well as a data storage device 3819 and/or computing device 3821 comprising a processor and memory, as shown in FIGS. 28-30. Specifically, the directional microphone 3814 is designed to detect sounds and determine the position of the sound source, which enables the robot to move its head toward the sound. In particular, one or more speakers 3816 may be configured to allow the robot to communicate with nearby humans with audible messages or responses. One or more antennas 3818 may be configured to transmit and receive data wirelessly for data transfer into and out of the robot. Specifically, said robot may include wireless communication modules (e.g., cellular, Wi-Fi, Bluetooth, WiMAX, HomeRF, Z-Wave, Zigbee, THREAD, RFID, NFC, and/or the like) that are connected to said antennas 3818. For example, said robot head 3202 may include a 5G cellular radio coupled to one of the antennas 3818 and a Wi-Fi radio (e.g., 5 GHz or 2.4 GHz) coupled to the other antenna 3818.


The data storage device 3819 may include a solid-state hard drive designed to capture all of the data generated by the sensors or a subset of the data generated by the sensors. Said subset of the data may be time-based (e.g., a pre-defined time window surrounding the start-up or shut-down of the robot), sensor-based (e.g., only encoder data), movement/configuration-based (e.g., when performing a specific task that requires the robot to put its body in a particular position or configuration), environment-based (e.g., when the robot recognizes a specific item or issue in its environment), error-based, or a combination thereof. In addition, the data storage device may be used to store data to train other robots or store data for diagnostic purposes or any other purpose. Finally, the indicator lights 3264 may be designed to work with the screen 3300 to indicate a state of the robot 100 (e.g., working, error, moving, etc.) to a nearby human or may illuminate for other reasons.
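To illustrate the subset-based capture policies described above, the following sketch shows one possible filter that decides whether a sensor sample should be written to the data storage device 3819; the start-up window length, sensor whitelist, and class name are assumptions for this sketch only.

```python
import time
from typing import Any

class CaptureFilter:
    """Decide whether a sensor sample should be written to the on-board data
    storage device, based on simple illustrative policies: a time window
    around start-up, a whitelist of sensor types, and an active error flag.
    """

    def __init__(self, startup_window_s: float = 60.0,
                 sensor_whitelist: tuple[str, ...] = ("encoder",)):
        self.start_time = time.monotonic()
        self.startup_window_s = startup_window_s
        self.sensor_whitelist = sensor_whitelist

    def should_store(self, sensor_type: str, sample: Any,
                     error_active: bool = False) -> bool:
        in_startup_window = (time.monotonic() - self.start_time) < self.startup_window_s
        return (in_startup_window                        # time-based capture
                or sensor_type in self.sensor_whitelist  # sensor-based capture
                or error_active)                         # error-based capture

if __name__ == "__main__":
    f = CaptureFilter()
    print(f.should_store("encoder", 1.234))                     # True (whitelisted)
    print(f.should_store("camera", b"...", error_active=True))  # True (error-based)
```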


3. Fourth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202 described above in connection with FIGS. 1-49, FIG. 50 illustrates a fourth embodiment of a head and neck assembly 4202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 4300. Further, it should be understood that the functionality and operation of the head and neck assembly 4202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202 may be used in combination with those disclosed with respect to the head and neck assembly 4202, and that any one or more features of the head and neck assembly 4202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202.


The primary difference between head and neck assembly 3202 and head and neck assembly 4202 is the fact that the shielded portion 4288 is ribbed and the head 4202a lacks a lower recessed sensor region in the chin area. As shown in FIGS. 58 and 59, the shielded portion 4288 can include a plurality of ribs 4372, 4374 that define one or more recessed areas 4370. The ribs 4372, 4374 can provide additional structural reinforcement to areas adjacent the electronics assembly 4222. The recessed areas 4370 can be a part of the shielded portion 4288 or can be additional displays and/or lights that are used to convey information and indications to users. Although not shown in FIGS. 58 and 59, the head 4202a can include or omit a frontal shell.


4. Fifth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202 described above in connection with FIGS. 1-50, FIG. 51 illustrates a fifth embodiment of a head and neck assembly 5202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 5300. Further, it should be understood that the functionality and operation of the head and neck assembly 5202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202 may be used in combination with those disclosed with respect to the head and neck assembly 5202, and that any one or more features of the head and neck assembly 5202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202.


The primary difference between head and neck assembly 3202 and head and neck assembly 5202 is the fact that the head 5202a includes only one light emitting assembly 5264 and the head 5202a lacks a lower recessed sensor region in the chin area. As shown in FIG. 60, the light emitting assembly 5264 is a band and extends along the interface 5320 between the frontal shell 5228 and the rear shell 5234. The light emitting assembly 5264 extends along the entire interface 5320 in the illustrative embodiment, but may extend only partway along the interface 5320 in some embodiments. In other words, the light emitting assembly 5264 may extend completely around the frontal shell 5228. The outer surface of the light emitting assembly 5264 may be flush with the outer surfaces of the front and rear shells or may be recessed relative to one or both of the front and/or rear shells.


5. Sixth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202 described above in connection with FIGS. 1-51, FIG. 52 illustrates a sixth embodiment of a head and neck assembly 6202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 6300. Further, it should be understood that the functionality and operation of the head and neck assembly 6202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202 may be used in combination with those disclosed with respect to the head and neck assembly 6202, and that any one or more features of the head and neck assembly 6202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202. The primary difference between the previously described head and neck assemblies and head and neck assembly 6202 is the fact that the head 6202a lacks a front recessed sensor region, a rear recessed sensor region, and a lower recessed sensor region as in embodiments 2-5. Like the fifth embodiment, the head 6202a includes only one light emitting assembly 6264 that extends along the interface 6320 between the frontal shell 6228 and the rear shell 6234.


E. Seventh Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202 described above in connection with FIGS. 1-52, FIGS. 53 and 54 illustrate a seventh embodiment of a head and neck assembly 7202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 7300. Further, it should be understood that the functionality and operation of the head and neck assembly 7202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202 may be used in combination with those disclosed with respect to the head and neck assembly 7202, and that any one or more features of the head and neck assembly 7202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202. The primary difference between head and neck assembly 6202 and head and neck assembly 7202 is the fact that the frontal shell 7228 provides substantially all of an exterior surface of the head 7202a. As shown in FIG. 53, the head 7202a further includes a light emitting assembly 7264a coupled to the frontal shell 7228. The neck shell 7230 can also include a second light emitting assembly 7264b located adjacent to an edge of the frontal shell 7228 at a junction between the frontal shell 7228 and the deformable neck cover 7230.


As shown in FIG. 54, the head and neck assembly 7202 further includes an electronics assembly 7222 including a plurality of cameras 7302 configured to provide a field of view around all or a majority of the robot (i.e. 360 degrees). For example, the plurality of cameras 7302 includes a pair of frontal cameras 7302a, 7302b facing in a forward direction of the robot, a pair of side cameras 7302c, 7302d facing in opposed lateral directions of the robot, and a rear camera 7302e facing in a rearward direction of the robot.
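As a rough, non-limiting check of the 360-degree coverage described above, the following sketch sweeps the horizon and reports any headings not covered by at least one camera's horizontal field of view; the mounting headings and field-of-view values in the example layout are assumptions rather than the actual camera parameters.

```python
def coverage_gaps(cameras: dict[str, tuple[float, float]],
                  step_deg: float = 1.0) -> list[float]:
    """Return the horizontal headings (in degrees) not covered by any camera.

    `cameras` maps a camera name to (center_heading_deg, horizontal_fov_deg);
    a heading is covered if it lies within half the field of view of a
    camera's center, measured as the shortest angular distance.
    """
    def covered(heading: float) -> bool:
        for center, fov in cameras.values():
            diff = abs((heading - center + 180.0) % 360.0 - 180.0)
            if diff <= fov / 2.0:
                return True
        return False

    return [h for h in [i * step_deg for i in range(int(360 / step_deg))]
            if not covered(h)]

if __name__ == "__main__":
    # Hypothetical layout: two frontal, two side, and one rear camera.
    layout = {
        "front_left": (-30.0, 90.0), "front_right": (30.0, 90.0),
        "side_left": (-90.0, 100.0), "side_right": (90.0, 100.0),
        "rear": (180.0, 120.0),
    }
    print(coverage_gaps(layout))  # an empty list means full 360-degree coverage
```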


F. Eighth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202 described above in connection with FIGS. 1-54, FIGS. 55 and 56 illustrate an eighth embodiment of a head and neck assembly 8202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 8300. Further, it should be understood that the functionality and operation of the head and neck assembly 8202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202 may be used in combination with those disclosed with respect to the head and neck assembly 8202, and that any one or more features of the head and neck assembly 8202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202.


The primary difference between previously described head and neck assemblies and head and neck assembly 8202 is the fact that the intermediate cover 8252 is thicker and provides a channel 8253 extending around or substantially around a perimeter of the frontal shell 8228. The head 8202a further includes light emitting assemblies 8264a, 8264b, 8264c, 8264d coupled to the intermediate cover 8252 and spaced apart from one another along the channel 8253. Each of the light emitting assemblies 8264a, 8264b, 8264c, 8264d is substantially rectangular in shape and elongated in a direction that is perpendicular to the rear facing edge of the frontal shell 8228. The head 8202a further includes one or more light emitting assemblies 8264e located in the channel 8253 of the intermediate cover 8252.


G. Ninth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202 described above in connection with FIGS. 1-56, FIG. 57 illustrates a ninth embodiment of a head and neck assembly 9202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 9300. Further, it should be understood that the functionality and operation of the head and neck assembly 9202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202 may be used in combination with those disclosed with respect to the head and neck assembly 9202, and that any one or more features of the head and neck assembly 9202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202.


The primary difference between previously described head and neck assemblies and head and neck assembly 9202 is the fact that the frontal shell 9228 has a substantially constant width and extends from a chin region of the head 9202a all the way up to provide a crown region of the head 9202a. The head 9202a further includes only a first light emitter assembly 9264a on a first lateral side of the frontal shell 9228 and a second light emitter assembly 9264b on an opposed second lateral side of the frontal shell 9228. The light emitter assemblies 9264a, 9264b have a greater length than the other light emitter assemblies described herein.


H. Tenth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202 described above in connection with FIGS. 1-57, FIG. 58 illustrates a tenth embodiment of a head and neck assembly 10202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 10300. Further, it should be understood that the functionality and operation of the head and neck assembly 10202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202 may be used in combination with those disclosed with respect to the head and neck assembly 10202, and that any one or more features of the head and neck assembly 10202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202.


The primary difference between previously described head and neck assemblies and head and neck assembly 10202 is the fact that the head 10202a includes a secondary display 10301 located on each lateral side of the head 10202a. The secondary display 10301 is located specifically in an occipital region of the head 10202a. The head 10202a further includes light emitting assemblies 10264a, 10264b coupled to a lateral panel of the head 10202a as opposed to the intermediate cover in previous embodiments.


I. Eleventh Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202 described above in connection with FIGS. 1-58, FIGS. 59-60 illustrate an eleventh embodiment of a head and neck assembly 11202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 11300. Further, it should be understood that the functionality and operation of the head and neck assembly 11202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202 may be used in combination with those disclosed with respect to the head and neck assembly 11202, and that any one or more features of the head and neck assembly 11202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202.


The primary difference between previously described head and neck assemblies and head and neck assembly 11202 is the fact that the head 11202a includes a secondary display 11301 located on each lateral side of the head 11202a. The secondary display 11301 is located specifically in a parotid region and/or auricular region of the head 11202a. The head 11202a further includes light emitting assemblies 11264a, 11264b coupled to opposed sides of the frontal shell 11228.


J. Twelfth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202 described above in connection with FIGS. 1-60, FIGS. 61-62 illustrate a twelfth embodiment of a head and neck assembly 12202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 12300. Further, it should be understood that the functionality and operation of the head and neck assembly 12202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202 may be used in combination with those disclosed with respect to the head and neck assembly 12202, and that any one or more features of the head and neck assembly 12202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202.


The primary difference between previously described head and neck assemblies and head and neck assembly 12202 is the fact that the neck 12202b includes a secondary display 12301 coupled to a rear side of the neck 12202b so as to be visible from a rear side of the robot. The head 12202a further includes light emitting assemblies 12264a, 12264b coupled to opposed sides of the head 12202a generally in an auricular region of the head 12202a.


K. Thirteenth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202 described above in connection with FIGS. 1-62, FIGS. 63-64 illustrate a thirteenth embodiment of a head and neck assembly 13202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 13300. Further, it should be understood that the functionality and operation of the head and neck assembly 13202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202 may be used in combination with those disclosed with respect to the head and neck assembly 13202, and that any one or more features of the head and neck assembly 13202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202.


The primary difference between the previously described head and neck assemblies and the head and neck assembly 13202 is that the head 13202a includes a second display 13301 coupled to a lateral side of the head 13202a in an auricular region of the head 13202a. The neck 13202b further includes a third display 13302 coupled to a front side of the neck 13202b so as to be visible from a front side of the robot. The third display 13302 may also be coupled to a chest region of the robot (i.e., the torso 204).


L. Fourteenth Embodiment

Similar to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202, 13202 described above in connection with FIGS. 1-64, FIGS. 65A-65C illustrate a fourteenth embodiment of a head and neck assembly 14202. For the sake of brevity, the above disclosure in connection with the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202, 13202 will not be repeated below, but it should be understood that like numerals generally represent generally similar or like structures in the various embodiments. For example, the disclosure regarding display 300 applies equally to display 14300. Further, it should be understood that the functionality and operation of the head and neck assembly 14202 is similar or identical to the features and functionality disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202, 13202. It should be understood that any one or more features of the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202, 13202 may be used in combination with those disclosed with respect to the head and neck assembly 14202, and that any one or more features of the head and neck assembly 14202 may be used in combination with those disclosed with respect to the head and neck assemblies 202, 2202, 3202, 4202, 5202, 6202, 7202, 8202, 9202, 10202, 11202, 12202, 13202.


The primary difference between the previously described head and neck assemblies and the head and neck assembly 14202 is that the head 14202a does not substantially mimic the shape of a human head and instead includes an upper electronics assembly 14222, a lower electronics assembly 14223, and a neck 14202b supporting the upper and lower electronics assemblies 14222, 14223. Each of the upper and lower electronics assemblies includes a housing 14900, 14902 and a plurality of cameras 14302, 14303 located in each respective housing 14900, 14902. The upper cameras 14302 are oriented substantially horizontally and perpendicular to a vertical plane 14904 (i.e., a frontal plane of the humanoid robot). The lower cameras 14303, and thus the lines of sight LoS of those cameras, are angled downwardly at an angle 14906 to the frontal or vertical plane 14904, wherein said frontal or vertical plane 14904 is parallel with the coronal plane of the robot 100. The angle 14906 is, for example, within a range of about 120 degrees to about 140 degrees, and preferably 130 degrees. In the illustrative embodiment, each plurality of cameras 14302, 14303 includes three cameras having overlapping fields of view FoVT, FoVB, although in other embodiments any number of cameras can be used. Furthermore, other sensors or devices can be included in the housings 14900, 14902.
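By way of illustration only, the downward orientation of the lower cameras 14303 can be expressed as a unit line-of-sight vector derived from the angle 14906. The short Python sketch below is not part of the disclosed design; it assumes that the angle 14906 is measured in a side (sagittal) view from the upward direction of the frontal or vertical plane 14904, so that 90 degrees corresponds to a horizontal line of sight and 130 degrees corresponds to a line of sight tilted below horizontal. The function name and axis convention are hypothetical.

```python
import math

# Hypothetical sketch (not the disclosed implementation): convert an angle
# such as angle 14906 into a unit line-of-sight vector, assuming the angle
# is measured in a sagittal view from the upward direction of the frontal
# plane 14904, so that 90 degrees is a horizontal, forward-facing line of
# sight.  Axes assumed: x = forward (out of the frontal plane), z = up.

def line_of_sight(angle_to_frontal_plane_deg: float) -> tuple[float, float]:
    """Return the (forward, up) components of a unit line-of-sight vector."""
    tilt_below_horizontal = math.radians(angle_to_frontal_plane_deg - 90.0)
    forward = math.cos(tilt_below_horizontal)   # component out of the frontal plane
    up = -math.sin(tilt_below_horizontal)       # negative when tilted downward
    return forward, up

if __name__ == "__main__":
    # Exemplary values from the text: about 120 to about 140 degrees, preferably 130.
    for angle in (120.0, 130.0, 140.0):
        fwd, up = line_of_sight(angle)
        print(f"angle 14906 = {angle:5.1f} deg -> forward = {fwd:+.3f}, up = {up:+.3f}")
```

The printed components are meaningful only under the stated assumption about how the angle 14906 is measured; the controlling convention is that shown in the figures.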


M. Body Illumination Assembly

As previously described, the head portion 202a includes the light emitting assembly 264 located at the shield interface 320. The robot 100 can further include additional illumination assemblies in other areas of the robot 100, such as a torso illumination assembly 330, a thigh illumination assembly 332, a neck illumination assembly 334, a shoulder illumination assembly 336, a hand/wrist illumination assembly 338, a knee illumination assembly 340, and a hip illumination assembly 342. The illumination assemblies disclosed herein meet the standards described in FDA CFR Title 21 part 1040.10, titled Performance Standards for Light-Emitting Products, and ANSI LIA Z136.1, titled Safe Use of Lasers, as in effect at the time of filing this application, each of which standards is fully incorporated herein by reference.
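Purely as an organizational aid, the following Python sketch groups the body illumination assemblies introduced above into a single registry in which every zone is driven in the same manner, consistent with the statement below that each assembly can be operated in the same way as the light emitting assembly 264. The class and method names are hypothetical and are not drawn from the disclosure; only the zones and reference numerals reflect this section.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical sketch (names and API are assumptions, not the disclosed
# implementation): a registry of body illumination zones, each driven the
# same way as the head light emitting assembly 264 described above.

class Zone(Enum):
    TORSO = auto()       # light emitters 330, chest region
    THIGH = auto()       # light emitters 332, quadriceps region
    NECK = auto()        # light emitters 334, rear of the neck portion
    SHOULDER = auto()    # light emitters 336
    HAND = auto()        # light emitters 338, hand/wrist region
    KNEE = auto()        # light emitters 340
    HIP = auto()         # light emitters 342

@dataclass
class LightStrip:
    zone: Zone
    reference_numeral: int
    color: tuple[int, int, int] = (0, 0, 0)   # RGB, off by default

class IlluminationController:
    """Drive every body illumination zone with the same command."""

    def __init__(self) -> None:
        self._strips = [
            LightStrip(Zone.TORSO, 330), LightStrip(Zone.THIGH, 332),
            LightStrip(Zone.NECK, 334), LightStrip(Zone.SHOULDER, 336),
            LightStrip(Zone.HAND, 338), LightStrip(Zone.KNEE, 340),
            LightStrip(Zone.HIP, 342),
        ]

    def set_status_color(self, color: tuple[int, int, int]) -> None:
        # In this sketch, "operated in the same way" is modeled as every
        # strip receiving the same color command.
        for strip in self._strips:
            strip.color = color

if __name__ == "__main__":
    controller = IlluminationController()
    controller.set_status_color((0, 255, 0))   # e.g., one color for one operating status
```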


1. Torso Illumination Assembly

The torso illumination assembly includes at least one, and preferably a plurality of, light emitters 330 located along a front surface of the torso 204 generally corresponding to a chest region of the robot 100 as shown in FIG. 66A. The torso light emitters 330 are illustratively embodied as elongated light strips; however, the light emitters 330 can have any suitable shape or structure. The torso light emitters 330 can be operated in the same way as the light emitting assembly 264 described above. The torso light emitters 330 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


2. Thigh Illumination Assembly

The thigh illumination assembly includes at least one, and preferably a plurality of, light emitters 332 located along a front surface of each thigh 404a, 404b generally corresponding to a quadriceps region of the robot 100 as shown in FIGS. 66B and 66C. The thigh light emitters 332 are illustratively embodied as elongated light strips; however, the light emitters 332 can have any suitable shape or structure. The thigh light emitters 332 can be operated in the same way as the light emitting assembly 264 described above. The thigh light emitters 332 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


3. Neck Illumination Assembly

The neck illumination assembly includes at least one, and preferably a plurality of, light emitters 334 located along a rear surface of the neck portion 202b at the base of the head portion 202a as shown in FIG. 67. The neck light emitters 334 are illustratively embodied as an elongated light strip; however, the light emitters 334 can have any suitable shape or structure. The neck light emitters 334 can be operated in the same way as the light emitting assembly 264 described above. The neck light emitters 334 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


4. Shoulder Illumination Assembly

The shoulder illumination assembly includes at least one, and preferably a plurality of, light emitters 336 located along a front surface of each shoulder 206a, 206b as shown in FIG. 68A. The shoulder light emitters 336 are illustratively embodied as elongated light strips; however, the light emitters 336 can have any suitable shape or structure. The shoulder light emitters 336 can be operated in the same way as the light emitting assembly 264 described above. The shoulder light emitters 336 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


5. Hip Illumination Assembly

The hip illumination assembly includes at least one, and preferably a plurality of, light emitters 342 located along a surface of each hip 602a, 602b as shown in FIG. 68B. The hip light emitters 342 are illustratively embodied as elongated light strips; however, the light emitters 342 can have any suitable shape or structure. The hip light emitters 342 can be operated in the same way as the light emitting assembly 264 described above. The hip light emitters 342 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


6. Hand Illumination Assembly

The hand illumination assembly includes at least one, and preferably a plurality of, light emitters 338 located along a surface of each hand 216a, 216b as shown in FIG. 68B. The hand light emitters 338 are illustratively embodied as elongated light strips; however, the light emitters 338 can have any suitable shape or structure. The hand light emitters 338 can be operated in the same way as the light emitting assembly 264 described above. The hand light emitters 338 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


7. Knee Illumination Assembly

The knee illumination assembly includes at least one, and preferably a plurality of, light emitters 340 located along a surface of each knee 406a, 406b as shown in FIG. 68C. The knee light emitters 340 are illustratively embodied as elongated light strips; however, the light emitters 340 can have any suitable shape or structure. The knee light emitters 340 can be operated in the same way as the light emitting assembly 264 described above. The knee light emitters 340 can include one or more of the following: LCD, LED, OLED, LPD, IMOD, QDLED, mLED, AMOLED, SED, FED, plasma, electronic paper or EPD, MicroLED, quantum dot display, LED backlit LEC, WLCD, OLCD, transparent OLED, PMOLED, capacitive touch display, resistive touch display, monochrome, color, or any combination of the above, or any other known technology or light feature.


N. Industrial Application

While the disclosure shows illustrative embodiments of a robot (in particular, a humanoid robot), it should be understood that the embodiments are designed to be examples of the principles of the disclosed assemblies, methods and systems, and are not intended to limit the broad aspects of the disclosed concepts to the embodiments illustrated. As will be realized, the disclosed robot, and its functionality and methods of operation, are capable of other and different configurations, and several details are capable of being modified, all without departing from the scope of the disclosed methods and systems. For example, one or more of the disclosed embodiments, in part or whole, may be combined with a disclosed assembly, method and system. As such, one or more steps from the diagrams or components in the Figures may be selectively omitted and/or combined consistent with the disclosed assemblies, methods and systems. Additionally, those skilled in the art would recognize that many features of the implementation can be grouped together, split apart, reorganized, removed, or duplicated. Further, one or more steps from the arrangement of components may be omitted or performed in a different order. Accordingly, the drawings, diagrams, and detailed description are to be regarded as illustrative in nature, not restrictive or limiting, of the described humanoid robot.


While the above-described head and neck assembly is designed for use with a general-purpose humanoid robot, it should be understood that the assemblies, components, learning capabilities, and/or kinematic capabilities disclosed herein may be used with other robots. Examples of other robots include: an articulated robot (e.g., an arm having two, six, or ten degrees of freedom, etc.), a Cartesian robot (e.g., rectilinear or gantry robots, robots having three prismatic joints, etc.), Selective Compliance Assembly Robot Arm (SCARA) robots (e.g., with a donut shaped work envelope, with two parallel joints that provide compliance in one selected plane, with rotary shafts positioned vertically, with an end effector attached to an arm, etc.), delta robots (e.g., parallel link robots with parallel joint linkages connected with a common base, having direct control of each joint over the end effector, which may be used for pick-and-place or product transfer applications, etc.), polar robots (e.g., with a twisting joint connecting the arm with the base and a combination of two rotary joints and one linear joint connecting the links, having a centrally pivoting shaft and an extendable rotating arm, spherical robots, etc.), cylindrical robots (e.g., with at least one rotary joint at the base and at least one prismatic joint connecting the links, with a pivoting shaft and extendable arm that moves vertically and by sliding, with a cylindrical configuration that offers vertical and horizontal linear movement along with rotary movement about the vertical axis, etc.), a self-driving car, a kitchen appliance, construction equipment, or a variety of other types of robot systems. The robot system may include one or more sensors (e.g., cameras, temperature, pressure, force, inductive or capacitive touch), motors (e.g., servo motors and stepper motors), actuators, biasing members, encoders, housing, or any other component known in the art that is used in connection with robot systems. Likewise, the robot 100 may omit one or more sensors (e.g., cameras, temperature, pressure, force, inductive or capacitive touch), motors (e.g., servo motors and stepper motors), actuators, biasing members, encoders, housing, or any other component known in the art that is used in connection with robot systems.


In other embodiments, other configurations and/or components may be utilized. As is known in the data processing and communications arts, a general-purpose computer typically comprises a central processor or other processing device, an internal communication bus, various types of memory or storage media (RAM, ROM, EEPROM, cache memory, disk drives etc.) for code and data storage, and one or more network interface cards or ports for communication purposes. The software functionalities involve programming, including executable code as well as associated stored data. The software code is executable by the general-purpose computer. In operation, the code is stored within the general-purpose computer platform. At other times, however, the software may be stored at other locations and/or transported for loading into the appropriate general-purpose computer system.


A server, for example, includes a data communication interface for packet data communication. The server also includes a central processing unit (CPU), in the form of one or more processors, for executing program instructions. The server platform typically includes an internal communication bus, program storage and data storage for various data files to be processed and/or communicated by the server, although the server often receives programming and data via network communications. The hardware elements, operating systems and programming languages of such servers are conventional in nature, and it is presumed that those skilled in the art are adequately familiar therewith. The server functions may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.


Hence, aspects of the disclosed methods and systems outlined above may be embodied in programming. Program aspects of the technology may be thought of as “products” or “articles of manufacture” typically in the form of executable code and/or associated data that is carried on or embodied in a type of machine-readable medium. “Storage” type media includes any or all of the tangible memory of the computers, processors or the like, or associated modules thereof, such as various semiconductor memories, tape drives, disk drives and the like, which may provide non-transitory storage at any time for the software programming. All or portions of the software may at times be communicated through the Internet or various other telecommunication networks. Thus, another type of media that may bear the software elements includes optical, electrical and electromagnetic waves, such as used across physical interfaces between local devices, through wired and optical landline networks and over various air-links. The physical elements that carry such waves, such as wired or wireless links, optical links or the like, also may be considered as media bearing the software. As used herein, unless restricted to non-transitory, tangible “storage” media, terms such as computer or machine “readable medium” refer to any medium that participates in providing instructions to a processor for execution.


A machine-readable medium may take many forms, including but not limited to, a tangible storage medium, a carrier wave medium or physical transmission medium. Non-volatile storage media include, for example, optical or magnetic disks, such as any of the storage devices in any computer(s) or the like, such as may be used to implement the disclosed methods and systems. Volatile storage media include dynamic memory, such as the main memory of such a computer platform. Tangible transmission media include coaxial cables, copper wire and fiber optics, including the wires that comprise a bus within a computer system. Carrier-wave transmission media can take the form of electric or electromagnetic signals, or acoustic or light waves such as those generated during radio frequency (RF) and infrared (IR) data communications. Common forms of computer-readable media therefore include for example: a floppy disk, a flexible disk, hard disk, magnetic tape, any other magnetic medium, a CD-ROM, DVD or DVD-ROM, any other optical medium, punch cards, paper tape, any other physical storage medium with patterns of holes, a RAM, a PROM and EPROM, a FLASH-EPROM, any other memory chip or cartridge, a carrier wave transporting data or instructions, cables or links transporting such a carrier wave, or any other medium from which a computer can read programming code and/or data. Many of these forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to a processor for execution.


It is to be understood that the invention is not limited to the exact details of construction, operation, exact materials, or embodiments shown and described, as obvious modifications and equivalents will be apparent to one skilled in the art. While the specific embodiments have been illustrated and described, numerous modifications come to mind without significantly departing from the spirit of the invention, and the scope of protection is only limited by the scope of the accompanying Claims. It should also be understood that "substantially" as utilized herein means a deviation of less than 15% and preferably less than 5%. It should also be understood that other configurations or arrangements of the above-described components are contemplated by this Application. Moreover, the description provided in the background section should not be assumed to be prior art merely because it is mentioned in or associated with the background section. The background section may include information that describes one or more aspects of the subject of the technology. Finally, the mere fact that something is described as conventional does not mean that the Applicant admits it is prior art.


In this Application, to the extent any U.S. patents, U.S. patent applications, or other materials (e.g., articles) have been incorporated by reference, the text of such materials is only incorporated by reference to the extent that they do not conflict with materials, statements and drawings set forth herein. In the event of such conflict, the text of the present document controls, and terms in this document should not be given a narrower reading in virtue of the way in which those terms are used in other materials incorporated by reference. It should also be understood that structures and/or features not directly associated with a robot cannot be adopted or implemented into the disclosed humanoid robot without careful analysis and verification of the complex realities of designing, testing, manufacturing, and certifying a robot for completion of usable work nearby and/or around humans. Theoretical designs that attempt to implement such modifications from non-robotic structures and/or features are insufficient (and in some instances, woefully insufficient) because they amount to mere design exercises that are not tethered to the complex realities of successfully designing, manufacturing and testing a robot.

Claims
  • 1. A humanoid robot comprising: an upper region including: (i) a torso, (ii) a pair of arm assemblies coupled to the torso, and (iii) a head and neck assembly coupled to the torso and having a neck portion and a head portion coupled to the neck portion, and wherein the head portion includes: a head housing assembly including a rear shell and a frontal shell coupled to the rear shell to define a head volume between the frontal shell and the rear shell; an electronics assembly including a display located in the head volume between the frontal shell and the rear shell, and a sensor recess located above the display and configured to provide an inset region of the head portion relative to the frontal shell, wherein the electronics assembly further comprises a camera aligned with an extent of the sensor recess; and a lower region coupled to the upper region and spaced apart from the upper region, the lower region including a pair of legs.
  • 2. The humanoid robot of claim 1, wherein the display has a curvilinear screen configured to display icons that indicate an operating status of the humanoid robot, and wherein the curvilinear screen is not configured to display human-like facial features.
  • 3. The humanoid robot of claim 1, wherein the frontal shell includes: a first arc length at a first location below the display, a second arc length at a second location aligned with a portion of the display, wherein said second arc length is greater than the first arc length, and a third arc length at a third location above the display, wherein said third arc length is greater than both the first arc length and the second arc length.
  • 4. The humanoid robot of claim 3, wherein the frontal shell further includes a fourth arc length at a fourth location above the third location, and wherein the fourth arc length is less than both the second arc length and the third arc length.
  • 5. The humanoid robot of claim 1, wherein a rear edge of the frontal shell is spaced a distance from and does not abut a frontal edge of the rear shell at a location, and wherein an extent of the head portion is positioned behind both the frontal shell and the rear shell at said location to connect the frontal shell to the rear shell.
  • 6. The humanoid robot of claim 1, wherein the head portion further includes a centroid and a first substantially horizontal plane that extends through said centroid, and wherein the frontal shell includes: a first width defined between opposed edges of the frontal shell at a first location below said first plane, a second width defined between opposed edges of the frontal shell at a second location, wherein said second width is greater than the first width, and a third width defined between opposed edges of the frontal shell at a third location above said first plane, wherein said third width is greater than the first width and less than the second width.
  • 7. The humanoid robot of claim 1, wherein the head housing assembly further comprises a lower sensor cover located below the display, and wherein the electronics assembly further comprises a lower camera facing towards the lower sensor cover.
  • 8. A humanoid robot comprising: an upper region including: (i) a torso, (ii) a pair of arm assemblies coupled to the torso, and (iii) a head and neck assembly coupled to the torso and having a neck portion and a head portion coupled to the neck portion, and wherein the head portion includes: a first electronics assembly including a first camera having a first line of sight, and a second electronics assembly including a second camera having a second line of sight that is angled relative to the first line of sight, and a lower region coupled to the upper region and spaced apart from the upper region, the lower region including a pair of legs.
  • 9. The humanoid robot of claim 8, wherein the head portion further includes a display located vertically between the first electronics assembly and the second electronics assembly.
  • 10. The humanoid robot of claim 9, wherein the head portion further includes a head housing defining a head volume that receives the first and second electronics assemblies and the display.
  • 11. The humanoid robot of claim 8, wherein the head portion further includes: (i) an internal mounting frame having an actuator mount configured to mount the head portion to an actuator configured to control movement of the head portion relative to the torso about an axis, (ii) an upper sensor mount configured to mount the first electronics assembly to the head portion, and (iii) a lower sensor mount configured to mount the second electronics assembly to the head portion.
  • 12. The humanoid robot of claim 8, wherein the electronics assembly further includes an illumination assembly configured to illuminate a region that: (i) extends between a rear edge of the frontal shell and an extent of a frontal edge of the rear shell, (ii) is positioned adjacent to the extent of the rear edge of the frontal shell, and (iii) is positioned adjacent to the extent of the frontal edge of the rear shell, and wherein the light emitted from the illumination assembly obscures said extents of the head portion.
  • 13. The humanoid robot of claim 12, wherein a gap is formed in the region defined between the frontal shell and the rear shell, and wherein the light emitted from the illumination assembly is visible in said gap, and wherein an extent of the head portion is recessed relative to both: (i) a first location on the outer surface of the frontal shell that is positioned adjacent to the gap, and (ii) a second location on the outer surface of the rear shell that is positioned adjacent to said gap.
  • 14. A bipedal humanoid robot comprising: an upper region including: (i) a torso, (ii) a pair of arm assemblies coupled to the torso, and (iii) a head and neck assembly coupled to the torso and having a neck portion and a head portion coupled to the neck portion, and wherein the head portion includes: a head housing assembly including a rear shell and a frontal shell coupled to the rear shell to define a head volume between the frontal shell and the rear shell; an electronics assembly including a display located in the head volume between the frontal shell and the rear shell, an upper camera positioned in the head volume and located above the display, and a lower region coupled to the upper region and spaced apart from the upper region, the lower region including a pair of legs.
  • 15. The bipedal humanoid robot of claim 14, wherein the electronics assembly further includes an illumination assembly configured to illuminate a region that: (i) extends between a rear edge of the frontal shell and an extent of a frontal edge of the rear shell, (ii) is positioned adjacent to the extent of the rear edge of the frontal shell, and (iii) is positioned adjacent to the extent of the frontal edge of the rear shell, and wherein the light emitted from the illumination assembly obscures said extents of the head portion.
  • 16. The bipedal humanoid robot of claim 15, wherein the electronics assembly further includes a lower camera, and the lower camera is oriented towards a chin region of the head portion.
  • 17. The bipedal humanoid robot of claim 14, wherein the display has an upper end and an opposed lower end, and wherein the display has a constant height between the upper end and the lower end.
  • 18. The bipedal humanoid robot of claim 17, wherein the display has a constant arc length between the upper end and the lower end.
  • 19. The bipedal humanoid robot of claim 17, wherein the display occupies an ocular region, a nasal region, and an infraorbital region of the head portion.
  • 20. The bipedal humanoid robot of claim 19, wherein the display further occupies at least a portion of: a frontal region, a temporal region, and a zygomatic region of the head portion.
  • 21. The bipedal humanoid robot of claim 14, further comprising a lower camera oriented downwardly relative to the upper camera at an angle within a range of approximately 50 degrees to approximately 80 degrees.
  • 22. The bipedal humanoid robot of claim 14, wherein the upper camera is arranged to face in a forward direction of the head portion, and wherein the electronics assembly further includes a second upper camera and a third upper camera, the second upper camera being arranged to face in a first lateral direction oriented about 90 degrees to the forward direction, and the third upper camera being arranged to face in a second lateral direction about 180 degrees to the first lateral direction.
PRIORITY CLAIM AND CROSS-REFERENCE TO RELATED APPLICATIONS

This application: (i) is a continuation-in-part of U.S. Design patent application Ser. No. 29/935,680, which is a continuation-in-part of U.S. Design patent application Ser. No. 29/928,748, which is a continuation-in-part of U.S. Design patent application Ser. No. 29/889,764, and (ii) claims priority under 35 U.S.C. § 119(e) to U.S. Provisional Patent Application Nos. 63/626,035, 63/564,741, 63/626,034, 63/626,037, 63/626,030, 63/566,595, 63/626,028, 63/573,528, 63/561,316, 63/634,697, 63/573,226, 63/707,949, 63/707,897, 63/707,547, 63/708,003, 63/626,105, each of which is expressly incorporated by reference herein in its entirety. Reference is hereby made to: (i) U.S. patent application Ser. Nos. 19/000,626, 19/006,191, 18/919,274, 18/919,263, (ii) PCT Application Nos. US/2025/10425, US/2025/11450, and (iii) U.S. Provisional Patent Application Nos. 63/557,874, 63/626,040, 63/625,362, 63/625,370, 63/625,381, 63/625,384, 63/625,389, 63/625,405, 63/625,423, 63/625,431, 63/685,856, 63/696,507, 63/696,533, 63/700,749, 63/614,499, 63/617,762, 63/561,315, 63/573,226, 63/615,766, 63/620,633, 63/706,768, each of which is expressly incorporated by reference herein in its entirety.

Provisional Applications (16)
Number Date Country
63626035 Feb 2024 US
63564741 Mar 2024 US
63626034 Mar 2024 US
63626037 May 2024 US
63626030 Feb 2024 US
63566595 Mar 2024 US
63626028 Feb 2024 US
63573528 Apr 2024 US
63561316 Mar 2024 US
63634697 Apr 2024 US
63573226 Apr 2024 US
63707949 Oct 2024 US
63707897 Oct 2024 US
63707547 Oct 2024 US
63708003 Oct 2024 US
63626105 Jan 2024 US
Continuation in Parts (2)
Number Date Country
Parent 29928748 Feb 2024 US
Child 19033973 US
Parent 29889764 Apr 2023 US
Child 19033973 US