The present invention relates to the display of images captured by a camera via a video display.
Various circumstances exist where it is desirable to capture images, such as still or moving images, via an image capture device and then display those images via a separate display. One problem with the display of images via video displays is that the orientation of an image may not match the orientation of the display which is used to display it.
For example, a display which is used to display images in a store for advertising purposes may be mounted to a wall in landscape orientation. Various images might then be shown on the display. If the images are also in landscape orientation, they will be displayed properly via the display. However, if some of the images are arranged in portrait orientation, problems arise.
First, the image could be rotated so that its orientation matches that of the display. However, this would result in viewers seeing the image sideways, i.e. lying on its side.
Second, if the image is displayed in its proper orientation, the image must be scaled to fit the display, such as by reducing the dimensions of the image so that the full height of the image can be displayed. The image then has a width dimension which is less than that of the display, so that bars or other elements must be displayed on either side of the displayed image to fill the space between the sides of the image and the sides of the display. In addition, the size of the displayed image is reduced, making it less visible.
The present invention comprises systems and methods for addressing these and other problems.
Aspects of the invention comprise robotic cameras, robotic displays, systems including one or more robotic cameras and one or more robotic displays, and methods of capturing and displaying images.
In one embodiment of the invention, a method of displaying images captured by an image capture device via at least one image display device comprises the steps of: (a) supporting the image capture device via a robotic camera mount which is configured to rotate the image capture device about at least one axis; (b) supporting the image display device via a robotic display mount which is configured to rotate the image display device about at least one axis; (c) moving the robotic camera mount to move the image capture device to a first image capture device orientation; (d) capturing one or more first images in a first image orientation with the image capture device; (e) moving the robotic display mount to move the image display device to a first image display device orientation; (f) displaying the one or more first images in a first display orientation, which first display orientation is the same as the first image orientation; (g) moving the robotic camera mount to move the image capture device to a second image capture device orientation; (h) capturing one or more second images in a second image orientation with the image capture device; (i) moving the robotic display mount to move the image display device to a second image display device orientation; and (j) displaying the one or more second images in a second display orientation, which second display orientation is the same as the second image orientation.
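The sequence of claimed steps (a) through (j) can be summarized in a brief Python sketch. Every class and method name below (`Mount`, `Camera`, `move_to`, `capture`) is an illustrative assumption for explanation only, not part of the disclosure.

```python
# Hedged sketch of method steps (a)-(j): the names and interfaces here are
# assumptions, not part of the patent disclosure.

class Mount:
    """Stand-in for a robotic camera mount or robotic display mount:
    it simply records the orientation it was last moved to."""
    def __init__(self):
        self.orientation = None

    def move_to(self, orientation):
        self.orientation = orientation


class Camera:
    """Stand-in image capture device: frames inherit the mount's orientation."""
    def __init__(self, mount):
        self.mount = mount

    def capture(self, count=1):
        return [("frame", self.mount.orientation) for _ in range(count)]


def capture_and_display(camera_mount, display_mount, camera, orientations):
    """Run steps (c)-(j) for each requested orientation and return
    (image orientation, display orientation) pairs, which the method
    requires to be equal."""
    log = []
    for orientation in orientations:
        camera_mount.move_to(orientation)           # steps (c)/(g)
        images = camera.capture()                   # steps (d)/(h)
        display_mount.move_to(orientation)          # steps (e)/(i)
        for _frame, image_orientation in images:    # steps (f)/(j)
            log.append((image_orientation, display_mount.orientation))
    return log
```

The essential property of the method is visible in the returned pairs: the display orientation always equals the orientation in which the corresponding images were captured.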
In one embodiment of the invention, a system for displaying images captured by an image capture device comprises: a controller; a robotic camera comprising a robotic camera mount and an image capture device, the robotic camera mount comprising a head which is rotatable about at least one axis via actuation of at least one motor, and the image capture device mounted to the head for movement by the robotic camera mount; a robotic display comprising a robotic display mount and an image display device, the robotic display mount comprising a head which is rotatable about at least one axis via actuation of at least one motor, and the image display device mounted to the head for movement by the robotic display mount; the controller configured to generate control instructions for causing the robotic camera mount to move the image capture device to a first image capture device orientation in which the image capture device captures images in a first image orientation and for causing the robotic display mount to move the image display device to a first image display device orientation for displaying images in a first display orientation, which first display orientation is the same as the first image orientation when the images in the first image orientation are displayed by the image display device; and for causing the robotic camera mount to move the image capture device to a second image capture device orientation in which the image capture device captures images in a second image orientation and for causing the robotic display mount to move the image display device to a second image display device orientation for displaying images in a second display orientation, which second display orientation is the same as the second image orientation when the images in the second image orientation are displayed by the image display device.
In one embodiment, the robotic camera mount is configured to move the camera, and the robotic display mount is configured to move the display, linearly along each of three orthogonal axes, or in combinations thereof. In another embodiment, the mounts are configured to move the camera and display in six degrees of freedom.
In one embodiment, the first image orientation is a portrait orientation and the first display orientation is a portrait orientation and the second image orientation is a landscape orientation and the second display orientation is a landscape orientation.
In one embodiment, the first image capture device orientation is a portrait orientation, the first image display device orientation is a portrait orientation, the second image capture device orientation is a landscape orientation and the second image display device orientation is a landscape orientation.
Further objects, features, and advantages of the present invention over the prior art will become apparent from the detailed description of the drawings which follows, when considered with the attached figures.
In the following description, numerous specific details are set forth in order to provide a more thorough description of the present invention. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without these specific details. In other instances, well-known features have not been described in detail so as not to obscure the invention.
In general, the invention comprises one or more robotically-controlled cameras and one or more robotically-controlled displays, such as video displays. In a preferred embodiment of the invention, the robotically-controlled displays are configured to be oriented in a manner which corresponds to the images captured by the one or more robotically-controlled cameras. As one example, when a robotically-controlled camera is positioned to capture one or more images in landscape orientation, the one or more displays are similarly (and preferably, synchronously) moved to a landscape orientation to display the “landscape” captured images, and when the robotically-controlled camera is moved to a position to capture one or more images in portrait orientation, the one or more displays are similarly (and preferably, synchronously) moved to a portrait orientation to display the “portrait” captured images.
The robotic camera 22 is moveable, thus permitting the position of an associated camera C to be changed. As detailed below, in a preferred embodiment, the robotic camera 22 can be used to change the position of the camera C about at least one axis so that it can capture images in “portrait” and “landscape” orientations, and at other desired orientations therebetween. In a most preferred embodiment, the robotic camera 22 can be used to change the position of the camera C freely in three-dimensional space.
In a preferred embodiment, the robotic camera 22 is referred to as “robotic” because it is a device which can change positions in an automated fashion. In particular, the robotic camera 22 is preferably capable of multiple movements without manual intervention (i.e. can move between various positions based upon a sequence of instructions without each movement being prompted by individual user input).
Preferably, the robotic camera 22 comprises a robotic mount 30 which is movable so that an associated camera C is linearly moveable in three (3) directions or along three (3) axes which are orthogonal to one another, and/or in combinations of those directions. For example, as illustrated in
In one embodiment, the robotic mount 30 comprises a base 32 and a movable support or mount 34. The base 32 is configured to support the movable mount 34, and the movable mount 34 is preferably moveable relative to the base 32, thus permitting an associated camera C (which is connected to or supported by the mount 34) to be moveable.
Referring to
In a preferred embodiment, the moveable support 34 is positioned between the base 32 and the camera C. This support is preferably moveable in at least three (3), and preferably six (6), degrees of freedom, and is thus moveable in at least two (2), and more preferably three (3), dimensions. As indicated above, the robotic mount 30 preferably at least allows the camera C to be rotated. In a preferred embodiment, however, movement is permitted linearly relative to each of three generally orthogonal axes (as well as combinations thereof), as well as rotationally around each axis. As disclosed below, the movable support 34 may permit redundant movement in one or more directions. For example, the movable support 34 may include two or more elements which permit it (and thus an object connected thereto, such as a camera C) to be moved in the x, y and/or z direction (three degrees of freedom), and to rotate about the x, y and/or z axis (three additional degrees of freedom), or various combinations thereof.
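A six-degree-of-freedom position of the kind described above can be represented as three linear coordinates plus a rotation about each orthogonal axis. The following sketch is a hypothetical representation for illustration; the field names and the use of degrees are assumptions, not taken from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Hypothetical 6-DOF pose for a movable support such as mount 34:
    linear coordinates (x, y, z) plus rotations in degrees (rx, ry, rz)
    about the corresponding orthogonal axes."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    rx: float = 0.0
    ry: float = 0.0
    rz: float = 0.0

    def translate(self, dx=0.0, dy=0.0, dz=0.0):
        """Linear movement along one or more axes (or combinations)."""
        return Pose(self.x + dx, self.y + dy, self.z + dz,
                    self.rx, self.ry, self.rz)

    def rotate(self, drx=0.0, dry=0.0, drz=0.0):
        """Rotational movement about one or more axes; angles are kept
        in [0, 360) for readability."""
        return Pose(self.x, self.y, self.z,
                    (self.rx + drx) % 360,
                    (self.ry + dry) % 360,
                    (self.rz + drz) % 360)
```

Combined movements, such as the redundant motions described above, can then be expressed by chaining translations and rotations.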
As illustrated, in one embodiment, the mount includes a main support 42. In one embodiment, the main support 42 is mounted for rotation relative to the base 32, i.e. about the y-axis as illustrated in
In one embodiment, a lower arm 44 is rotatably mounted to the main support 42. As illustrated, the main support 42 has a first portion mounted to the base 32 and a second portion to which the lower arm 44 is mounted. In a preferred embodiment, the lower arm 44 is rotatably mounted to the main support 42 about a shaft or other mount. In the configuration illustrated, the lower arm 44 is mounted for rotation about a z-axis (i.e. an axis which is generally perpendicular to the axis about which the main support 42 rotates relative to the base 32).
As further illustrated, an upper arm 46 is rotatably mounted to the lower arm 44. In one embodiment, a first or distal portion of the lower arm 44 is mounted to the main support 42, and the upper arm 46 is mounted to a top or proximal portion of the lower arm 44. In one embodiment, the upper arm 46 is also mounted for rotation about the z-axis.
In one embodiment, a head 48 is located at a distal portion of the upper arm 46. Preferably, the camera C is mounted to the movable mount 34 via the head 48. In one embodiment, the head 48 is mounted for rotation relative to the upper arm 46 (and thus the remainder of the movable mount 34). In one configuration, the head 48 may be configured to both swivel relative to the upper arm 46 (e.g. rotate about the z-axis as illustrated in
The various portions of the movable support 34 may be connected to one another (and to the base 32) in a variety of fashions. For example, the various portions may be connected to one another via a shaft and bearing mount, where the shaft is connected to one component and engages one or more bearings supported by the other component, such that the shaft may move relative to the bearing(s), thus permitting the components to move relative to one another. The portions of the movable support 34 might be mounted to one another in other fashions, however, such as by hinged mounting or the like.
Preferably, the movable support 34 includes means for moving the one or more portions thereof, and thus the camera C connected thereto. As illustrated, the movable support 34 may include one or more motors for moving the components thereof. The motors may be electrical motors. In other embodiments, hydraulics or other means may be utilized to move one or more of the components of the movable support 34. For example, a hydraulic arm might be utilized to move the upper arm 46 relative to the lower arm 44 in an up and down direction.
Of course, the robotic camera mount 30 might have various other configurations. For example, while the robotic camera mount 30 described above is redundant in its capacity to move in certain directions (i.e. the upper and lower arms 46, 44 are both configured to move about the z axis and thus redundantly in the x and y directions), the robotic camera mount 30 could be configured in other fashions (such as by having only a single portion configured to move in each direction). It will also be appreciated that the number of members or elements which the movable mount 34 comprises may vary. For example, the robotic camera mount 30 might comprise a base and a head which is mounted to the base, such as via a swivel, permitting the head to be moved in at least two dimensions. Various configurations of members may also be utilized to allow movement in various directions. For example, aside from swivels or the rotating connections of the movable mount illustrated in
As another example, the entire robotic camera mount 30 may be movable. For example, the robotic camera mount 30 might be movable in one or more directions via wheels riding on a track (not shown) or otherwise, including where the wheels may rotate, thus allowing the robotic camera mount 30 to rotate or spin, or might be configured to move in one or more directions by walking (such as by including one or more legs).
As indicated, in a preferred embodiment, the robotic camera mount 30 is configured to move at least one camera C. In one embodiment, the camera C is directly attached to the movable support 34, such as to the head 48. In general, the robotic camera mount 30 has a portion (such as the head 48 or an element connected thereto) which is moveable in the manner described above (as indicated above, in one embodiment, movement of the various portions of the movable support 34 allow the head 48 to be moved in three (3) generally orthogonal directions and combinations thereof, as well as rotationally about those directions) and is thus configured to move an associated camera C.
The camera C might be of various types. In a preferred embodiment, the camera C is configured to capture electronic images, such as via a CCD, MOS, CMOS or other types of image capture devices. The images may be single frame images or multi-frame images or “videos.” In the embodiment illustrated, a single camera C is connected to a single robotic camera mount 30. In other embodiments, multiple cameras might be mounted to a single mount 30. While the term “camera” is used herein, the images might be captured by any number of image capture devices (which may or may not be called cameras) and may or may not include features such as fixed or movable lenses, flashes and other features.
In one embodiment, the camera C is configured to capture images which have an aspect ratio wherein the height and width of the image are not equal, whereby, depending on the orientation of the camera C, the captured images have one orientation which may be referred to as a “portrait” orientation (where the largest dimension of the image, when the image is viewed in its normal position, is generally oriented vertically) and a “landscape” orientation (where the largest dimension of the image, when the image is viewed in its normal position, is generally oriented horizontally).
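The portrait/landscape distinction described above follows directly from the image's pixel dimensions. The following short sketch classifies an image accordingly; the function name and the "square" case for equal dimensions are illustrative assumptions.

```python
def image_orientation(width_px, height_px):
    """Classify an image per the terms above: 'portrait' when the largest
    dimension is vertical, 'landscape' when it is horizontal, and
    'square' (an assumed extra case) when the dimensions are equal."""
    if height_px > width_px:
        return "portrait"
    if width_px > height_px:
        return "landscape"
    return "square"
```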
As indicated, the system 20 also comprises one or more robotic displays 24. Each robotic display 24 preferably comprises one or more displays D connected or mounted to a robotic display mount 50. In one embodiment, the robotic display mount 50 is similar to the robotic camera mount 30 described above, such as comprising a base 32 and a movable mount 34. As described above, the movable mount 34 may have various configurations, including one where the movable mount 34 has a main support 42, lower arm 44, and an upper arm and head (not shown). The one or more displays D may be connected or mounted to the head for movement by the movable support 34.
The one or more displays D (or “image display devices”) may be of various types. In one embodiment, the display D may comprise a video display such as a plasma, LED, OLED, LCD, DLP or other type. The display might also comprise a projector for causing one or more images to be displayed on a remote surface. In one embodiment, the display D is a generally flat-panel display. The display D may have various sizes. In one embodiment, the display D has a 4:3, 16:9 or other aspect ratio where a width and height of the display screen are not equal, whereby the display D has one orientation which may be referred to as a “portrait” orientation (where the largest dimension of the screen is generally oriented vertically) and a “landscape” orientation (where the largest dimension of the screen is generally oriented horizontally).
In one embodiment, the system 20 includes means for controlling the one or more robotic cameras 22 and robotic displays 24. In one embodiment, the system 20 includes a controller 60. In a preferred embodiment, the controller 60 may comprise or include a computing device. Various instructions may be provided from the controller 60 to the robotic camera 22 and robotic display 24, causing them to move. For example, a user might provide an input to the controller 60, which input is a request to move the camera C or display D from a first to a second position. The controller 60 may generate one or more signals or instructions which are transmitted to the robotic camera 22 and/or robotic display 24 for causing them to so move. The signal might comprise closing of a switch which allows electricity to flow to one or more motors of the robotic camera 22 and/or robotic display 24 for a predetermined period of time which is necessary for the motor to effect the desired movement. In another embodiment, the signal might comprise an instruction which is received by a sub-controller of the mount, which sub-controller then causes the mount to move as desired.
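The timed-switch approach described above can be sketched as powering a motor for a predetermined duration. The callback-based interface below is a hypothetical stand-in for whatever switching hardware is actually used.

```python
import time

def pulse_motor(switch_on, switch_off, seconds):
    """Sketch of the switch-based control described above: close the
    switch (power the motor), wait the predetermined period needed to
    effect the desired movement, then open the switch again. The
    switch_on/switch_off callables are illustrative assumptions."""
    switch_on()
    time.sleep(seconds)
    switch_off()
```

In practice the duration would be calibrated to the motor's speed and the angular distance to be traveled.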
In one embodiment, the main controller 60 includes one or more user input devices 64, such as a mouse, keyboard, touch-screen or the like, via which the user may provide input. The main controller 60 might generate one or more graphical user interfaces for display on a control display 62 and the user may interact with the interface to provide input (such as by inputting text, clicking boxes, etc.).
In one embodiment, control signals or instructions that are generated or otherwise output by the main controller 60 may be transmitted to a robotic mount sub-controller 68. Such a sub-controller 68 might, for example, be a controller which is located adjacent to the robotic camera 22 or robotic display 24 or within a housing or portion thereof. The sub-controller 68 may process the control instructions and use them to operate the various portions of the robotic camera 22 or robotic display 24, such as one or more motors M. For example, the sub-controller 68 may parse instructions from the main controller 60 so as to individually control each motor M in a manner which effectuates the main control instructions.
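The parsing step performed by the sub-controller 68 might look like the sketch below. The instruction wire format used here ("axis=angle" pairs separated by semicolons) is purely an assumption for illustration; the disclosure does not specify one.

```python
def parse_instruction(instruction):
    """Hypothetical sub-controller logic: split a high-level command from
    the main controller into per-motor commands. The "axis=angle;..."
    format is an assumed example, not from the disclosure."""
    motor_commands = {}
    for part in instruction.split(";"):
        axis, _, angle = part.partition("=")
        motor_commands[axis.strip()] = float(angle)
    return motor_commands
```

Each entry of the resulting dictionary would then drive one motor M.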
The main controller 60 might communicate with each robotic camera 22 and robotic display 24 via wired or wireless communication links. For example, the main controller 60 might transmit signals via an RS-232 communication link including a wired pathway to the sub-controller 68 of the robotic camera 22 or robotic display 24. Alternatively, the main controller 60 and the sub-controllers 68 might both include wireless transceivers. In this manner, the main controller 60 may transmit instructions to the robotic camera 22 and robotic display 24 wirelessly.
Of course, other control configurations are possible. For example, the main controller 60 may comprise a server. One or more users may communicate with the server, such as from user stations (like desktop or laptop computers) or via other devices such as mobile devices including phones or tablets. In one embodiment, the server may be configured as a webserver where users may interface with the server via a web-page. In other configurations, the controller might be a mobile communication device such as an Apple iPhone® which is executing a control application.
Aspects of methods of moving the one or more cameras and one or more displays, including via a system 20 such as described above, will now be described.
In one embodiment of a method, the position of a camera C, including its orientation, is controlled, such as via a robotic camera mount 30. The position of a display D, including its orientation, is also controlled, such as via a robotic display mount 50. In one embodiment, the orientation of the video display D is controlled so that it is the same as (e.g. matches) that of the camera C. In particular, in one embodiment, the camera C may be oriented to capture images in a landscape orientation, a portrait orientation, or orientations therebetween. The orientation of the display D is controlled so that it is the same as the orientation of the camera C (or at least the orientation of the images captured thereby).
One example of use of the invention will be described with reference to
As illustrated in
At the same time, the robotic display 24 is controlled to cause the display D to display the images in portrait orientation (relative to the aspect ratio of the display), e.g. by moving the robotic display mount 50 to the corresponding display position.
As illustrated in
As indicated above, synchronous control of the camera C and display D may be accomplished by a control system 20. The control system 20 may send commands to both the robotic camera 22 and robotic display 24 to cause them to move synchronously.
In one embodiment, not only might the orientation of the camera C and display D be changed, but their positions may also be changed. In one embodiment, their positions may also be synchronized. For example, as illustrated in
As indicated above, the system 20 might include more than one robotic display 24 (such as for displaying the same images from one camera C), or might include more than one robotic camera 22 and more than one robotic display 24.
In another embodiment, the robotic display 24 may be controlled based on data obtained from the camera C that corresponds to the images taken by the camera C. For example, the camera C may comprise one or more sensors such as gyroscopes, accelerometers, and the like to detect a position and orientation of the camera. The position and orientation data of the camera may be associated with each image captured by the camera C. Note that the position and orientation data may be obtained in other manners, such as through image recognition software, from the control instructions sent to the robotic camera 22, or the like.
Accordingly, as the robotic camera 22 receives control instructions and the camera C obtains images, the position data and the images may be correlated and stored and/or transmitted to the robotic display 24. The robotic display 24 may then be configured to display the images and rotate the display based on the position and orientation data that corresponds with the displayed images. In this manner, the robotic display 24 may not only display the images synchronously with the movement of the robotic camera 22, but may also display the images in the corresponding orientations at a later time.
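The correlate-store-replay scheme just described can be sketched as tagging each frame with orientation metadata and later reorienting the display before each frame is shown. All data shapes and names below are illustrative assumptions.

```python
class DisplayMount:
    """Stand-in for a robotic display mount 50: records the orientation
    it was last moved to."""
    def __init__(self):
        self.orientation = None

    def move_to(self, orientation):
        self.orientation = orientation


def record_frames(frames_with_orientation):
    """Correlate each captured frame with the camera's orientation data
    so the pair can be stored or transmitted for later playback."""
    return [{"frame": f, "orientation": o} for f, o in frames_with_orientation]


def replay(records, display_mount):
    """Deferred playback: reorient the display from the stored metadata
    before each frame is shown, so orientation always matches capture."""
    shown = []
    for rec in records:
        display_mount.move_to(rec["orientation"])
        shown.append((rec["frame"], display_mount.orientation))
    return shown
```

Because the metadata travels with the frames, playback need not be synchronous with the camera's movement.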
It should be noted that while the orientations of portrait and landscape have been used for ease of explanation, the robotic camera 22 and the robotic display 24 may be configured to move synchronously (or at least move synchronously to orientation and position data corresponding to displayed images) about multiple axes simultaneously. For example, the robotic camera 22 may move the camera C above the object O requiring rotation about the z axis shown in
Further, the robotic display 24 may be configured to rotate synchronously and inversely to the robotic camera 22 (or to position data corresponding to images obtained from the camera C). For instance, in the example just described, the robotic camera 22 may move and rotate so the camera C obtains a view from above the object O. The robotic display 24, however, may move in an inverse manner (e.g. moving downward and rotating in an opposite direction as compared to the robotic camera) so that the view of the object O that is displayed on the display D is presented from the same position and orientation as the camera.
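The inverse motion described above amounts to negating each component of the camera's pose change. The (dx, dy, dz, drx, dry, drz) tuple layout below is an assumption used only for this sketch.

```python
def inverse_move(delta):
    """Sketch of the inverse display motion: negate every component of the
    camera's pose change (assumed tuple layout: dx, dy, dz, drx, dry, drz)
    so the display moves oppositely, e.g. camera up and tilting down
    becomes display down and tilting up."""
    return tuple(-component for component in delta)
```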
The invention has various uses and benefits. The present invention solves the problems noted in the Background above by moving the display into a position in which it is oriented to match the orientation of the image which is to be displayed. In a preferred embodiment, the display displays images which are captured in real time via a camera.
As one implementation of use, the system 20 might be used to capture images of an object which is being worked on (car, airplane, etc.), where technicians need to view the images remotely. The technicians may need to change the orientation of the camera to gain access to the area they wish to view or to get a better viewing angle. In accordance with the invention, the associated display moves synchronously with the camera so that the captured images are always displayed in the same orientation as they are captured, making it much easier for the technicians to view the images.
As another example, a store might wish to display a good, such as a pair of shoes. In order to allow many customers to see the shoes, the store might take pictures of the shoes and display the picture of the shoes. However, the store might wish to show images of the shoes taken from various perspectives, angles and distances, thereby giving the customer more information regarding how the shoe looks from different perspectives and providing detail of various features of the shoe. In accordance with the invention, the system might be used to capture images of the shoe (including in a streaming video format) and have those images be displayed on one or more displays. As illustrated in
It will be understood that the above-described arrangements of apparatus and the methods derived therefrom are merely illustrative of applications of the principles of this invention, and many other embodiments and modifications may be made without departing from the spirit and scope of the invention as defined in the claims.
Number | Date | Country | |
---|---|---|---|
20220174220 A1 | Jun 2022 | US |