The exemplary and non-limiting embodiments relate generally to position sensing and, more particularly, to a robot drive position sensor having an optical encoder.
U.S. patent publication Nos. 2009/0243413 A1 and 2015/0303764 A1, which are hereby incorporated by reference in their entireties, disclose a barrier between a non-optical encoder read-head and an encoder disk.
The following summary is merely intended to be exemplary. The summary is not intended to limit the scope of the claims.
In accordance with one aspect, an example embodiment is provided in an apparatus comprising a frame, where the frame is configured to be attached to a housing of a motor assembly proximate an aperture which extends through the housing; an optical sensor connected to the frame, where the optical sensor comprises a camera; and an environment separation barrier configured to be connected to the housing at the aperture, where the environment separation barrier is at least partially transparent and located relative to the camera to allow the camera to view an image inside the housing through the environment separation barrier and the aperture.
In accordance with another aspect, an example method comprises providing a read-head comprising a frame and a camera connected to the frame; connecting the read-head to a housing of a motor assembly, where the frame of the read-head is connected to the housing proximate an aperture which extends through the housing; and locating an environment separation barrier at the aperture to separate a first environmental area inside the housing from a second environmental area in which the camera is located, where the environment separation barrier is at least partially transparent and located relative to the camera to allow the camera to view an image inside the housing through the environment separation barrier and the aperture.
In accordance with another aspect, an example method comprises illuminating a reference member located inside a housing of a motor assembly by a light emitter of a read-head, where the read-head is located at least partially outside of the housing; viewing an image of the reference member by a camera of the read-head, where the camera is located at least partially outside of the housing, where the image is viewed by the camera through an aperture in the housing and through a transparent environment separation barrier located at the aperture, where the transparent environment separation barrier seals a first environment inside the housing from a second environment in which the camera is located, and where the camera is located outside of the first environment and the transparent environment separation barrier allows the camera to view the image coming from inside the housing while the camera is outside of the first environment.
The foregoing aspects and other features are explained in the following description, taken in connection with the accompanying drawings, wherein:
Referring to
In addition to the substrate transport apparatus 12, the substrate processing apparatus 10 includes multiple substrate processing chambers 14 and substrate cassette elevators 16 connected to a vacuum chamber 15. The transport apparatus 12 is located, at least partially, in the chamber 15 and is adapted to transport planar substrates, such as semiconductor wafers or flat panel displays, between and/or among the chambers 14 and elevators 16. In alternate embodiments, the transport apparatus 12 could be used in any suitable type of substrate processing apparatus.
A conventional vacuum environment robotic manipulator typically includes a drive unit which houses all active components of the robotic manipulator, e.g., actuators and sensors, and one or more arms, as discussed above, driven by the drive unit. The arm(s) are typically passive mechanisms, i.e., they do not include any active components, such as actuators and sensors. This is primarily due to difficulties with out-gassing, power distribution and heat removal in vacuum environments.
Referring also to
Although the substrate transport apparatus 12 is described with respect to a vacuum robot, any suitable substrate transport apparatus, atmospheric or otherwise, may be provided having features as disclosed. Substrate transport apparatus 12 has a controller 54, the drive unit 18 and the arm 20, and is configured to transport substrate S. Controller 54 may have at least one processor 32, at least one memory 34 and software or computer code 36 configured to control the drive 18 and process input from the sensors. Arm 20 is shown as a SCARA type arm driven by drive unit 18, but in alternate embodiments any suitable arm could be provided. Although substrate transport apparatus 12 is described with respect to a two-link arm, any suitable number of links may be provided. Further, any suitable number of arms may be provided. Further, any combination of rotary and/or linear axes may be provided on any suitable arm.
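As a rough illustration only (not the controller implementation of this disclosure), the sketch below shows, in Python, one way a controller such as controller 54 could be organized: stored code executed by a processor that takes position-sensor samples as input and computes a drive command. The class names, the proportional control law and the numeric gain are all hypothetical.

```python
from dataclasses import dataclass

@dataclass
class EncoderSample:
    angle_rad: float     # rotor angle reported by the position sensor
    timestamp_s: float   # time at which the sample was taken

class DriveController:
    """Toy stand-in for a controller: code in memory, executed by a processor."""
    def __init__(self, kp: float = 10.0):
        self.kp = kp                   # assumed proportional gain
        self.target_angle_rad = 0.0    # commanded rotor position

    def process_sample(self, sample: EncoderSample) -> float:
        """Return a drive command computed from the latest encoder sample."""
        error = self.target_angle_rad - sample.angle_rad
        return self.kp * error         # placeholder proportional control law

controller = DriveController()
controller.target_angle_rad = 1.57
command = controller.process_sample(EncoderSample(angle_rad=1.50, timestamp_s=0.001))
print(command)   # approximately 0.7 with the assumed gain
```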
The drive 18 forms a robot motor assembly. In this example the robot motor assembly comprises stators and rotors configured to drive shafts connected to the first link 22 and to one of the pulleys in the first link. Referring also to
Referring also to
Referring also to
In this example embodiment the environment separation barrier 62 is a transparent window and the optical sensor 60 is a camera. A holder 70 is provided to hold the transparent window 62. The holder 70 is connected by the connection 66 to the frame 56. The holder 70 is biased by the connection 66 in the direction of the aperture 48 to press the transparent window 62 against the seal 72. Thus, the seal 72 and the window 62 seal off the aperture 48, forming an optically transparent barrier between the two environmental areas 40, 42 at the aperture 48. Because the barrier 62 is optically transparent, the camera 60 is still able to view an image from the reference member 44. Because the components of the read-head 50, including the camera 60, light emitter 58 and electronic circuitry 59, are all outside the environmental area 40, there is no risk of outgassing from these components inside the area 40, and no special design or encasement of the read-head 50 or its components is necessary.
With features as described herein, a position encoder may be incorporated into a robot direct-drive module, such as drive 18 or one of a plurality of drive modules which are assembled to form the drive 18. A position encoder track, such as on the disk 44, may be coupled to the driven part 46 of the direct-drive module, and a position read-head may be provided on the outside of the housing 38 of the direct-drive module, for example, as shown in the example embodiments shown in the drawings.
In one exemplary embodiment, as depicted diagrammatically in
The encoder track on the reference member 44 may include features that may be utilized to sense location of the encoder track. As an example, the features may form an incremental track and, in some cases, the incremental track may be complemented by an absolute track. As another example, the features may form a pattern that may be decoded using image processing techniques.
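As a hedged illustration of how an incremental track could be complemented by an absolute track, the Python sketch below looks up a coarse absolute position from a decoded code word and refines it with an incremental count. The code table, the code-word length and the counts-per-segment value are made up for illustration and are not taken from this disclosure.

```python
# Hypothetical mapping from a decoded absolute-track code word to a coarse
# position index; a real track would use a much longer, unique code.
ABSOLUTE_CODE_TABLE = {
    (0, 0, 1, 1): 0,
    (0, 1, 1, 0): 1,
    (1, 1, 0, 0): 2,
    (1, 0, 0, 1): 3,
}
COUNTS_PER_CODE = 1024   # assumed incremental counts per absolute segment

def absolute_position(code_word, incremental_count):
    """Combine a decoded absolute code word with a fine incremental count."""
    coarse = ABSOLUTE_CODE_TABLE[tuple(code_word)]
    return coarse * COUNTS_PER_CODE + incremental_count

print(absolute_position([1, 1, 0, 0], 37))   # 2*1024 + 37 = 2085
```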
As shown in
Still referring to
The optical system may be configured to detect the position of the encoder track with respect to the read-head. As an example, the optical system may include one or more light emitters, one or more light receivers and other optical components, such as lenses, mirrors and masks. The light emitter(s) and receiver(s) may be arranged to detect features on the encoder track. For instance, the receiver(s) may detect the features based on reflection of the light produced by the emitter(s).
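One common way such emitter/receiver pairs are read out, shown here only as a hedged sketch, is to sample two receiver channels in quadrature and count signed transitions. The A/B channel naming and the 4x decoding table below are assumptions for illustration, not details from this disclosure.

```python
def quadrature_count(samples_a, samples_b):
    """Count signed increments from boolean A/B receiver samples (4x decoding)."""
    # Every edge on either channel is one increment; its sign depends on the
    # state of the other channel at the time of the edge.
    transitions = {
        (0, 0): {(0, 1): +1, (1, 0): -1},
        (0, 1): {(1, 1): +1, (0, 0): -1},
        (1, 1): {(1, 0): +1, (0, 1): -1},
        (1, 0): {(0, 0): +1, (1, 1): -1},
    }
    count = 0
    prev = (samples_a[0], samples_b[0])
    for cur in zip(samples_a[1:], samples_b[1:]):
        count += transitions.get(prev, {}).get(cur, 0)
        prev = cur
    return count

# Example: one full forward cycle of the A/B pattern yields +4 counts.
a = [0, 0, 1, 1, 0]
b = [0, 1, 1, 0, 0]
print(quadrature_count(a, b))   # +4
```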
As another example, the optical system may include one or more light source(s), one or more digital camera(s) and other optical components, such as lenses, mirrors and masks. The light source(s) may be arranged to provide illumination of the encoder track in the field of view of the digital camera(s). The digital camera(s) may be arranged to periodically take pictures (images) of the encoder track. The pictures may be processed by the encoder read-head and/or externally to the encoder read-head, such as at the controller 54 for example, to determine the location of the encoder track with respect to the encoder read-head.
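As one hedged example of the kind of image processing mentioned above, the Python sketch below estimates the shift of the encoder track between a reference image strip and a newly captured strip by cross-correlation; the peak of the correlation gives the displacement in pixels. The use of numpy, the 1-D intensity profile and the micron-per-pixel calibration value are assumptions for illustration, not details from this disclosure.

```python
import numpy as np

def estimate_shift_pixels(reference: np.ndarray, current: np.ndarray) -> int:
    """Return the integer pixel shift of `current` relative to `reference`."""
    ref = reference - reference.mean()
    cur = current - current.mean()
    # Full cross-correlation; the peak location gives the displacement.
    corr = np.correlate(cur, ref, mode="full")
    return int(np.argmax(corr)) - (len(ref) - 1)

# Illustrative use with a synthetic 1-D intensity profile of the track.
rng = np.random.default_rng(0)
track = rng.random(256)
shifted = np.roll(track, 3)                # simulate 3 pixels of track motion
pixels = estimate_shift_pixels(track, shifted)
microns_per_pixel = 5.0                    # hypothetical camera calibration
print(pixels, pixels * microns_per_pixel)  # expected: 3 pixels -> 15.0 um
```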
In another example embodiment, as depicted diagrammatically in
The encoder track 44 may include features that may be utilized to sense location of the encoder track. As an example, the features may form an incremental track and, in some cases, the incremental track may be complemented by an absolute track. As another example, the features may form a pattern that may be decoded using image processing techniques.
The read-head 50a may include an enclosure 56a, a window 62 and an optical system including the light emitter 58 and the camera 60. The read-head 50a may further include other components, such as electronics, to control the read-head, process the data and facilitate communication.
As depicted in
The optical system may be configured to detect the position of the encoder track 44 with respect to the read-head. As an example, the optical system may include one or more light emitters, one or more light receivers and other optical components, such as lenses, mirrors and masks. The light emitter(s) and receiver(s) may be arranged to detect features on the encoder track. For instance, the receiver(s) may detect the features based on reflection of the light produced by the emitter(s). The light source(s) may be arranged to provide illumination of the encoder track in the field of view of the digital camera(s). The digital camera(s) 60 may be arranged to periodically take pictures (images) of the encoder track. The pictures may be processed by the encoder read-head to determine the location of the encoder track with respect to the encoder read-head.
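For a rotary encoder disk such as reference member 44, a measured tangential displacement of the track can be converted to a rotor angle by dividing the arc length by the track radius. The short sketch below illustrates this conversion; the radius and pixel scale are hypothetical calibration values, not values from this disclosure.

```python
import math

TRACK_RADIUS_MM = 40.0   # assumed radius of the encoder track on the disk
MM_PER_PIXEL = 0.005     # assumed camera scale at the track surface

def displacement_to_angle_rad(shift_pixels: float) -> float:
    """Convert a tangential track displacement (in pixels) to rotor rotation."""
    arc_length_mm = shift_pixels * MM_PER_PIXEL
    return arc_length_mm / TRACK_RADIUS_MM   # angle in radians

print(math.degrees(displacement_to_angle_rad(200.0)))   # ~1.43 degrees
```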
The read-head 50a may be attached to the housing 38 of the direct-drive module so that the window 62 of the read-head is sealed with respect to the housing 38 of the direct-drive module around the aperture 48 of the direct drive module, thus separating the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module from the environment outside of the direct-drive module. As a result, the components inside of the enclosure of the read-head 50a are not exposed to the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module.
As illustrated in
It should be noted that in all of the example configurations of
Alternatively, the read-head may be coupled to the housing of the direct-drive module in a movable manner to allow for adjustment of the sensor with respect to the encoder track. The adjustment may include the distance between the read-head and the encoder track as well as orientation (for instance, pitch, roll and yaw) of the read-head with respect to the encoder track. The window of the sensor may be sealed to the housing of the direct-drive module by an O-ring, a bellows, a flexure or any other suitable seal providing the sensor with sufficient movement while maintaining a seal.
In yet another example embodiment, as depicted diagrammatically in
The encoder track may include features that may be utilized to sense location of the encoder track. As an example, the features may form an incremental track and, in some cases, the incremental track may be complemented by an absolute track. As another example, the features may form a pattern that may be decoded using image processing techniques.
The read-head may include an enclosure, a window and an optical system. The read-head may further include other components, such as electronics, to control the read-head, process the data and facilitate communication.
As depicted in
The optical system may be configured to detect the position of the encoder track with respect to the read-head. As an example, the optical system may include one or more light emitters, one or more light receivers and other optical components, such as lenses, mirrors and masks. The light emitter(s) and receiver(s) may be arranged to detect features on the encoder track. For instance, the receiver(s) may detect the features based on reflection of the light produced by the emitter(s).
As another example, the optical system may include one or more light source(s), one or more digital camera(s) and other optical components, such as lenses, mirrors and masks. The light source(s) may be arranged to provide illumination of the encoder track in the field of view of the digital camera(s). The digital camera(s) may be arranged to periodically take pictures (images) of the encoder track. The pictures may be processed by the encoder read-head to determine the location of the encoder track with respect to the encoder read-head.
The read-head may be attached to the housing of the direct-drive module so that the enclosure of the read-head is sealed with respect to the housing of the direct-drive module around the aperture of the direct-drive module and around the window of the read-head, thus separating the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module from the environment outside of the direct-drive module and, furthermore, separating the components inside of the enclosure of the read-head from the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module. This prevents the components inside of the enclosure of the read-head from being exposed to the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module.
As illustrated in
It should be noted that the window in the example of
Alternatively, the read-head may be coupled to the housing of the direct-drive module in a movable manner to allow for adjustment of the read-head with respect to the encoder track. The adjustment may include the distance between the read-head and the encoder track as well as orientation (for instance, pitch, roll and yaw) of the read-head with respect to the encoder track. The enclosure of the read-head may be sealed to the housing of the direct-drive module by an O-ring, a bellows, a flexure or any other suitable seal providing the read-head with sufficient movement while maintaining a seal.
In yet another example embodiment, as depicted diagrammatically in
The encoder track may include features that may be utilized to sense location of the encoder track. As an example, the features may form an incremental track and, in some cases, the incremental track may be complemented by an absolute track. As another example, the features may form a pattern that may be decoded using image processing techniques.
The read-head may include a first enclosure, a window and an optical system. The read-head may further include other components, such as electronics, to control the read-head, process the data and facilitate communication.
As depicted in
The window may be made of a substantially transparent material, such as glass or acrylic. Alternatively, as explained with respect to the examples for
The window may be sealed to the first enclosure of the read-head, for instance, using an O-ring or a bonded joint. If desired, the seal may be accomplished by the addition of a feature to the window, such as a flange. Alternatively, the seal may be accomplished by using features available on commercially available read-heads.
The optical system may be configured to detect the position of the encoder track with respect to the read-head. As an example, the optical system may include one or more light emitters, one or more light receivers and other optical components, such as lenses, mirrors and masks. The light emitter(s) and receiver(s) may be arranged to detect features on the encoder track. For instance, the receiver(s) may detect the features based on reflection of the light produced by the emitter(s).
As another example, the optical system may include one or more light source(s), one or more digital camera(s) and other optical components, such as lenses, mirrors and masks. The light source(s) may be arranged to provide illumination of the encoder track in the field of view of the digital camera(s). The digital camera(s) may be arranged to periodically take pictures (images) of the encoder track. The pictures may be processed by the encoder read-head to determine the location of the encoder track with respect to the encoder read-head.
The read-head may be attached to the housing of the direct-drive module so that the second enclosure is sealed with respect to the housing of the direct-drive module. The second enclosure is then sealed to the window of the read-head, thus separating the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module from the environment outside of the direct-drive module and, furthermore, separating the components inside of the second enclosure from the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module. This prevents the components inside of the second enclosure from being exposed to the vacuum or other non-atmospheric environment inside of the housing of the direct-drive module.
Alternatively, the second enclosure may be coupled to the housing of the direct-drive module in a movable manner to allow for adjustment of the read-head with respect to the encoder track. The adjustment may include the distance between the read-head and the encoder track as well as orientation (for instance, pitch, roll and yaw) of the read-head with respect to the encoder track. The second enclosure may be sealed to the housing of the direct-drive module by an O-ring, a bellows, a flexure or any other suitable seal providing the read-head with sufficient movement while maintaining a seal.
Although the above example embodiments show the read-head looking radially inward, it may be arranged to look radially out, for example pointing at an internal cylindrical surface of the disk, or to look axially up or down, for example, pointing at one of the flat faces of the disk.
Features as described herein may be used to provide a drive arrangement (such as a robot drive arrangement) with an optical encoder where the disk of the optical encoder is in one environment (such as vacuum environment) and the components of the read-head of the optical encoder, including its optical system and control electronics, are in another environment (such as an atmospheric environment) and there is a substantially transparent barrier between the disk and the components of the read-head.
Although the example embodiments show the read-head looking radially inward, the sensor may be arranged to look radially out at an internal cylindrical surface of the disk, or to look axially (up or down) at one of the flat faces of the disk 44. This arrangement may be extended to linear applications including, for example, the example linear robots described in U.S. patent publication Nos. 2015/0214086 A1, 2016/0229296 A1 and 2017/0036358 A1 which are hereby incorporated by reference in their entireties.
An example apparatus comprises a frame, where the frame is configured to be attached to a housing of a motor assembly proximate an aperture which extends through the housing; a position sensor connected to the frame, where the position sensor comprises a camera; and an environment separation barrier configured to be connected to the housing at the aperture, where the environment separation barrier is at least partially transparent and located relative to the camera to allow the camera to view an image inside the housing through the environment separation barrier and the aperture. The environment separation barrier may be connected directly to the housing at the aperture or may be indirectly connected to the housing, such as via the frame of the apparatus, but the environment separation barrier forms at least part of the environment closure of the aperture through the housing while also providing an optical path.
The apparatus may comprise a seal configured to be located directly between the environment separation barrier and the housing of the motor assembly. The apparatus may comprise a first seal connected directly between the environment separation barrier and the frame. The apparatus may comprise a second seal configured to be located directly between the frame and the housing of the motor assembly. The apparatus may comprise a seal and a barrier holder configured to press the environment separation barrier against the seal, where the barrier holder is configured to be pressed by the frame towards the aperture. The apparatus may comprise a connection between the barrier holder and the frame which is resilient to allow the barrier holder to move relative to the frame, and where the connection biases the barrier holder towards the aperture when the frame is connected to the housing. The environment separation barrier may comprise a transparent window which directly contacts the frame and is configured to be pressed by the frame towards the aperture when the frame is connected to the housing. The apparatus may comprise the housing, a rotor inside the housing having a position reference member configured to be imaged by the camera and at least one seal, where the frame is connected to the housing with the at least one seal and the environment separation barrier sealing the aperture to separate a first environmental area inside the housing from a second environmental area outside the housing. The frame may not be exposed to the first environmental area inside the housing.
An example method may comprise providing a read-head comprising a frame and a camera connected to the frame; connecting the read-head to a housing of a motor assembly, where the frame of the read-head is connected to the housing proximate an aperture which extends through the housing; and locating an environment separation barrier at the aperture to separate a first environmental area inside the housing from a second environmental area in which the camera is located, where the environment separation barrier is at least partially transparent and located relative to the camera to allow the camera to view an image inside the housing through the environment separation barrier and the aperture.
The method may comprise locating a seal directly between the environment separation barrier and the housing of the motor assembly. The method may comprise connecting a first seal directly between the environment separation barrier and the frame. The method may comprise locating a second seal directly between the frame and the housing of the motor assembly. The method may comprise a barrier holder biasing the environment separation barrier against a seal, where the barrier holder is pressed by the frame towards the aperture. The method may comprise providing a connection between the barrier holder and the frame which is resilient to allow the barrier holder to move relative to the frame, and where the connection biases the barrier holder towards the aperture when the frame is connected to the housing. The environment separation barrier may comprise a transparent window which directly contacts the frame and is pressed by the frame towards the aperture when the frame is connected to the housing. The method may comprise providing a rotor inside the housing and a position reference member configured to be imaged by the camera, where the frame is connected to the housing with at least one seal and the environment separation barrier to seal the aperture to separate a first environmental area inside the housing from a second environmental area outside the housing. The frame may not be exposed to the first environmental area inside the housing.
An example method may comprise illuminating a reference member located inside a housing of a motor assembly by a light emitter of a sensor, where the sensor is located outside of the housing; viewing an image of the reference member by a camera of the sensor, where the camera is located outside of the housing, where the image is viewed by the camera through an aperture in the housing and through a transparent environment separation barrier located at the aperture, where the transparent environment separation barrier seals a first environment inside the housing from a second environment in which the sensor is located, and where the camera is located outside of the first environment and the transparent environment separation barrier allows the camera to view the image coming from inside the housing while the camera is outside of the first environment.
The example of
An example embodiment may be provided in an apparatus comprising a frame, where the frame is configured to be attached to a housing of a motor assembly proximate an aperture which extends through the housing; at least one light emitter connected to the frame; an array of optical sensors connected to the frame; and an environment separation barrier configured to be connected to the housing at the aperture, where the environment separation barrier is at least partially transparent and located relative to the array of optical sensors to allow the array of optical sensors to view an image inside the housing through the environment separation barrier and the aperture.
An example method may be provided comprising providing a read-head comprising a frame, at least one light emitter connected to the frame, and an array of optical sensors connected to the frame; connecting the read-head to a housing of a motor assembly, where the frame of the read-head is connected to the housing proximate an aperture which extends through the housing; and locating an environment separation barrier at the aperture to separate a first environmental area inside the housing from a second environmental area in which the array of optical sensors is located, where the environment separation barrier is at least partially transparent and located relative to the array of optical sensors to allow the array of optical sensors to view an image inside the housing through the environment separation barrier and the aperture.
An example method may be provided comprising illuminating a reference member located inside a housing of a motor assembly by at least one light emitter of a read-head, where the read-head is located at least partially outside of the housing; viewing at least one image of the reference member by an array of optical sensors of the read-head, where the array of optical sensors is located at least partially outside of the housing, where the at least one image is viewed by the array of optical sensors through an aperture in the housing and through a transparent environment separation barrier located at the aperture, where the transparent environment separation barrier seals a first environment inside the housing from a second environment in which the array of optical sensors is located, and where the array of optical sensors is located outside of the first environment and the transparent environment separation barrier allows the array of optical sensors to view the at least one image coming from inside the housing while the array of optical sensors is outside of the first environment.
It should be understood that the foregoing description is only illustrative. Various alternatives and modifications can be devised by those skilled in the art. For example, features recited in the various dependent claims could be combined with each other in any suitable combination(s). In addition, features from different embodiments described above could be selectively combined into a new embodiment. Accordingly, the description is intended to embrace all such alternatives, modifications and variances which fall within the scope of the appended claims.
This application is a continuation of U.S. patent application Ser. No. 17/232,245 filed Apr. 16, 2021 which is a divisional application of U.S. application Ser. No. 16/593,050 filed Oct. 4, 2019, which is a divisional application of U.S. application Ser. No. 15/465,101 filed Mar. 21, 2017, now U.S. Pat. No. 10,476,354, which claims priority under 35 USC 119(e) to U.S. provisional patent application No. 62/310,989 filed Mar. 21, 2016 which is hereby incorporated by reference in its entirety, and is a continuation-in-part of U.S. patent application Ser. No. 13/744,900 filed Jan. 18, 2013 which is a divisional patent application of application Ser. No. 13/618,315 filed Sep. 14, 2012, which claims priority under 35 USC 119(e) to Provisional Patent Application No. 61/627,030 filed Sep. 16, 2011 and Provisional Patent Application No. 61/683,297 filed Aug. 15, 2012, which are hereby incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
3602748 | Locke | Aug 1971 | A |
4414523 | Pieters | Nov 1983 | A |
4486677 | Yamamoto et al. | Dec 1984 | A |
4748355 | Anderson | May 1988 | A |
4749898 | Suzuki et al. | Jun 1988 | A |
4952830 | Shirakawa | Aug 1990 | A |
5113102 | Gilmore | May 1992 | A |
5291087 | Pollick et al. | Mar 1994 | A |
5394043 | Hsia | Feb 1995 | A |
5397212 | Watanabe et al. | Mar 1995 | A |
5608277 | Emery et al. | Mar 1997 | A |
5720590 | Hofmeister | Feb 1998 | A |
5813823 | Hofmeister | Sep 1998 | A |
5899658 | Hofmeister | May 1999 | A |
5914548 | Watanabe et al. | Jun 1999 | A |
6150747 | Smith et al. | Nov 2000 | A |
6274962 | Kliman | Aug 2001 | B1 |
6355999 | Kichiji et al. | Mar 2002 | B1 |
6653758 | Tsuneyoshi et al. | Nov 2003 | B2 |
6664535 | Nahum | Dec 2003 | B1 |
6700249 | Botos | Mar 2004 | B1 |
6709521 | Hiroki | Mar 2004 | B1 |
6710562 | Kalb et al. | Mar 2004 | B1 |
6888284 | Eggers et al. | May 2005 | B2 |
6960758 | Tenca et al. | Nov 2005 | B2 |
7011554 | Taniguchi et al. | Mar 2006 | B2 |
7336012 | Tanaka | Feb 2008 | B2 |
7847442 | Rohner et al. | Dec 2010 | B2 |
7898135 | Flynn | Mar 2011 | B2 |
8063517 | Bott et al. | Nov 2011 | B2 |
8102090 | Takeuchi | Jan 2012 | B2 |
8237391 | Krupyshev et al. | Aug 2012 | B2 |
8283813 | Gilchrist et al. | Oct 2012 | B2 |
8716909 | Hosek et al. | May 2014 | B2 |
9202733 | Hosek | Dec 2015 | B2 |
9230840 | Hiroki | Jan 2016 | B2 |
20030164552 | Tamura et al. | Sep 2003 | A1 |
20040001750 | Kremerman | Jan 2004 | A1 |
20050286993 | Minami et al. | Dec 2005 | A1 |
20060290226 | Ohkawa | Dec 2006 | A1 |
20070280813 | Nakamura et al. | Dec 2007 | A1 |
20080156973 | Wong et al. | Jul 2008 | A1 |
20080315692 | Beetz | Dec 2008 | A1 |
20090243413 | Gilchrist | Oct 2009 | A1 |
20110156514 | Watanabe et al. | Jun 2011 | A1 |
20110266900 | Gaumer | Nov 2011 | A1 |
20130028700 | Gilchrist et al. | Jan 2013 | A1 |
20130071218 | Hosek | Mar 2013 | A1 |
20130121798 | Hosek | May 2013 | A1 |
20140077637 | Hosek et al. | Mar 2014 | A1 |
20150214086 | Hofmeister et al. | Jul 2015 | A1 |
20150303764 | Hosek et al. | Oct 2015 | A1 |
20150311639 | Neureuter | Oct 2015 | A1 |
20160190728 | VanZuilen | Jun 2016 | A1 |
20160229296 | Hosek et al. | Aug 2016 | A1 |
20170036358 | Hosek et al. | Feb 2017 | A1 |
Number | Date | Country |
---|---|---|
1479077 | Mar 2004 | CN |
1531170 | Sep 2004 | CN |
2669461 | Jan 2005 | CN |
102106062 | Jun 2011 | CN |
103930363 | Jul 2014 | CN |
0111764 | Jun 1984 | EP |
0385203 | Sep 1990 | EP |
59-096843 | Jun 1984 | JP |
S-6132663 | Sep 1986 | JP |
06-042602 | Feb 1994 | JP |
8003191 | Jan 1996 | JP |
H-08-19985 | Jan 1996 | JP |
08-066880 | Mar 1996 | JP |
10-128692 | May 1998 | JP |
2002534282 | Oct 2002 | JP |
2003-220586 | Aug 2003 | JP |
2004-146714 | May 2004 | JP |
2008167589 | Jul 2008 | JP |
2009038908 | Feb 2009 | JP |
2009521196 | May 2009 | JP |
2009271076 | Nov 2009 | JP |
2009-303331 | Dec 2009 | JP |
2010040947 | Feb 2010 | JP |
2010050114 | Mar 2010 | JP |
2010207938 | Sep 2010 | JP |
2011139086 | Jul 2011 | JP |
2013513929 | Apr 2013 | JP |
2014123673 | Jul 2014 | JP |
2014527314 | Oct 2014 | JP |
2006013371 | Jan 2016 | JP |
20010092771 | Oct 2001 | KR |
WO-9103095 | Mar 1991 | WO |
WO-2006114390 | Nov 2006 | WO |
WO-2006124934 | Nov 2006 | WO |
WO-2006130954 | Dec 2006 | WO |
WO-2009003196 | Dec 2008 | WO |
WO-2011075345 | Jun 2011 | WO |
WO-2013040401 | Mar 2013 | WO |
Entry |
---|
“A Passive Rotor Transverse Flux Motor”, Popan et al., Workshop on Variable Reluctance Electrical Machines, Technical University of Cluj-Napoca, Sep. 17, 2002, 4 pages. |
Number | Date | Country | |
---|---|---|---|
20230198346 A1 | Jun 2023 | US |
Number | Date | Country | |
---|---|---|---|
62310989 | Mar 2016 | US | |
61683297 | Aug 2012 | US | |
61627030 | Sep 2011 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 16593050 | Oct 2019 | US |
Child | 17232245 | US | |
Parent | 15465101 | Mar 2017 | US |
Child | 16593050 | US | |
Parent | 13618315 | Sep 2012 | US |
Child | 13744900 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 17232245 | Apr 2021 | US |
Child | 17963616 | US |
Number | Date | Country | |
---|---|---|---|
Parent | 13744900 | Jan 2013 | US |
Child | 15465101 | US |