Videoconferencing terminal and method of operating the same

Information

  • Patent Grant
  • 11943563
  • Patent Number
    11,943,563
  • Date Filed
    Friday, January 17, 2020
  • Date Issued
    Tuesday, March 26, 2024
Abstract
A method of videoconferencing comprises displaying an image of a remote user on a display and capturing, with at least one camera, an image of a local user at a user position in front of the display. The at least one camera is located at a camera position behind the display. The method comprises modifying an image to be displayed based on the camera position of the at least one camera with respect to the display and based on the user position of the local user with respect to the display.
Description

The present invention relates to a videoconferencing terminal and method of operating the same.


Today in the business environment there is an increasing demand to avoid travel, and face-to-face meetings are being replaced with alternatives such as videoconferencing. However, one problem with videoconferencing is that making eye contact with the remote user may not be possible if the camera for the videoconferencing terminal is located adjacent to the display screen. In this arrangement, the local user looks at the remote user on the display screen, but the local user will not be looking directly at the camera. This can mean that eye contact is not maintained, which can be distracting to the users and reduce the efficacy of videoconferencing as a viable alternative to face-to-face meetings.


One known arrangement is discussed in US2012/0257004 which discloses mounting the camera behind a transparent display screen on a mechanism for moving the position of the camera. The camera is then moved with respect to the position of the local user to maintain eye contact with the camera. A problem with this arrangement is that additional mechanical components are required to enable moving the position of the camera. This means the videoconferencing terminal is usually dedicated to a specific room because setup is time consuming and complex. Furthermore, movement of the camera during a videoconference call may distract the local user if the mechanism is actuated and makes a sound.


Another known solution is discussed in US2009/0278913 which discloses moving the displayed image of the remote user's face until it is aligned with the axis of the camera behind the screen. A problem with this arrangement is that the local user may be looking at the displayed image but still not make direct eye contact with the camera and the remote user due to parallax error.


Embodiments of the present invention aim to address the aforementioned problems.


According to a first aspect of the present invention there is a method of videoconferencing comprising: displaying an image of a remote user on a display; capturing an image of a local user at a user position in front of the display, with at least one camera being located at a camera position behind the display; and modifying an image to be displayed based on the camera position of the at least one camera with respect to the display and based on the user position of the local user with respect to the display.


Optionally the method comprises determining a position of the eyes of the local user with respect to the display.


Optionally the method comprises determining an axis of the at least one camera based on the position of the eyes of the local user.


Optionally the method comprises determining a position of the eyes of the remote user with respect to the display.


Optionally the method comprises determining an offset between the axis of the camera and the eyes of the remote user in a displayed image.


Optionally the modifying comprises translating the image to be displayed such that the displayed eyes of the remote user intersect with the axis.


Optionally the method comprises determining one or more pixel artifacts captured by the at least one camera from the display.


Optionally the method comprises compensating the captured camera image to remove the determined one or more pixel artifacts.


Optionally the method comprises determining one or more occlusion artifacts from one or more display elements.


Optionally the method comprises compensating the captured camera image to remove the one or more occlusion artifacts.


Optionally the occluding display elements are out of focus.


Optionally the user position of the user and/or the camera position of the at least one camera is moveable with respect to the display.


Optionally the at least one camera is one or more of the following: an RGB camera or an infrared camera.


Optionally the display is transmissive to electromagnetic radiation.


Optionally one or more of the steps is carried out during calibration and/or one or more of the steps is carried out during operation.


According to a second aspect of the present invention there is a videoconferencing terminal comprising: a display for displaying an image of a remote user; at least one camera for capturing an image of a local user at a user position in front of the display, the camera being located at a camera position behind the display; and a controller configured to modify an image to be displayed based on the camera position of the at least one camera with respect to the display and based on the user position of the local user with respect to the display.


According to a third aspect of the present invention there is a method of videoconferencing comprising: displaying an image of a remote user on a display; capturing an image of a local user at a user position in front of the display, with at least one camera being located at a camera position behind the display; and modifying an image to be displayed based on the camera position of the at least one camera with respect to the display.


According to a fourth aspect of the present invention there is a videoconferencing terminal comprising: a display for displaying an image of a remote user; at least one camera for capturing an image of a local user at a user position in front of the display, the camera being located at a camera position behind the display; and a controller configured to modify an image to be displayed based on the camera position of the at least one camera with respect to the display.





Various other aspects and further embodiments are also described in the following detailed description and in the attached claims with reference to the accompanying drawings, in which:



FIG. 1a shows a schematic perspective view of a videoconferencing terminal;



FIG. 1b shows a schematic side view of a videoconferencing terminal;



FIG. 2 shows a schematic cross-sectional side view of a videoconferencing terminal;



FIG. 3 shows a schematic perspective view of a videoconferencing terminal;



FIGS. 4a, 4b, and 4c show a schematic view of a captured image by a videoconferencing terminal;



FIG. 5 shows a schematic perspective view of a videoconferencing terminal;



FIG. 6 shows a schematic perspective view of a videoconferencing terminal;



FIG. 7 shows a schematic view of a videoconferencing terminal;



FIG. 8 shows a flow diagram of the operation of a videoconferencing terminal; and



FIG. 9 shows a flow diagram of the operation of a videoconferencing terminal.






FIG. 1a shows a schematic perspective view of a videoconferencing terminal 100. The videoconferencing terminal 100 comprises at least one camera 102 positioned behind a display 104. The display 104 is configured to display an image 500 of a remote user to a local user 106 who is positioned in front of the display 104.


The local user 106 is positioned in close proximity to the videoconferencing terminal 100 and the camera 102 is configured to capture one or more images and/or videos of the local user 106. For example, the local user 106 is in the same room as the videoconferencing terminal 100. In contrast, the remote user is not in close proximity to the videoconferencing terminal 100 or the local user 106, and the video stream and/or images of the local user 106 are transmitted to a videoconferencing terminal (not shown) associated with the remote user.


In the embodiments described with reference to the Figures there are two users: a local user 106 and a remote user. In other embodiments (not shown), there may be any number of local users 106 and remote users on the videoconference call.


The process of receiving and transmitting video and image data between videoconferencing terminals 100 is carried out using known techniques and will not be discussed in any further detail.


In some embodiments, the remote user has an identical videoconferencing terminal 100 to the videoconferencing terminal 100 of the local user 106. However, this is not necessary, and it is sufficient for only one of the users participating in the videoconference to have a videoconferencing terminal 100 according to the embodiments described with reference to the Figures. In a preferred embodiment, all users participating in the videoconference have a videoconferencing terminal 100 according to the embodiments.



FIG. 1b shows a schematic side view of a videoconferencing terminal 100. The camera 102 comprises an axis A-A which, in some embodiments, is arranged substantially perpendicular to the plane of the surface of the display 104. FIG. 1b shows that the axis A-A is in alignment with the eyes 108 of the local user 106. In this way, axis A-A is an “eye-contact” axis. In this arrangement, the local user 106 is looking directly along the axis of the camera 102. This means that the camera 102 will capture an image or a video of the local user 106 looking directly at the camera 102. This means the remote user will receive an image of the local user 106 with the eyes 108 of the local user looking in the correct direction to simulate a face-to-face meeting. In some alternative embodiments, the camera 102 is moveable with respect to the display 104 and the axis of the camera 102 can be positioned at an angle with respect to the plane of the display 104.


Whilst FIGS. 1a and 1b show one camera 102, in some embodiments there can be a plurality of cameras 102 for capturing an image or a video of a plurality of local users 106 or of a large room. The embodiments described hereinafter are described with reference to a single camera, but in some embodiments a plurality of cameras 102 are used instead. The camera 102 as shown in FIG. 1 is static and positioned in the centre of the display 104. However, in some embodiments, the camera 102 is moveable with respect to the display 104.


The display 104 in some embodiments is a transparent OLED display 104. The display 104 is substantially planar and can be any suitable size for the videoconferencing call. In other embodiments any other suitable transparent display can be used. For example, infrared cameras (not shown) can be used and the infrared cameras can see the local user 106 through the display 104. In this way, the display 104 is transmissive to electromagnetic radiation which can be in the visible spectrum, near visible, infrared or ultraviolet or any other suitable frequency of electromagnetic radiation.


Turning to FIG. 7, the videoconferencing terminal 100 will be described in further detail. FIG. 7 shows a schematic view of a videoconferencing terminal 100 according to some embodiments.


As previously mentioned, the videoconferencing terminal 100 comprises a camera 102 and a display 104. The videoconferencing terminal 100 selectively controls the activation of the camera 102 and the display 104. As shown in FIG. 7, the camera 102 and the display 104 are controlled by a camera controller 702 and a display controller 704 respectively.


The videoconferencing terminal 100 comprises a videoconferencing controller 700. The videoconferencing controller 700, the camera controller 702 and the display controller 704 may be configured as separate units, or they may be incorporated in a single unit.


The videoconferencing controller 700 comprises a plurality of modules for processing the videos and images received remotely via an interface 706 and the videos and images captured locally. The interface 706 and the method of transmitting and receiving videoconferencing data are known and will not be discussed any further. In some embodiments, the videoconferencing controller 700 comprises a face detection module 710 for detecting facial features and an image processing module 712 for modifying an image to be displayed on the display 104. The face detection module 710 and the image processing module 712 will be discussed in further detail below.


One or all of the videoconferencing controller 700, the camera controller 702 and the display controller 704 may be at least partially implemented by software executed by a processing unit 714. The face detection modules 710 and the image processing modules 712 may be configured as separate units, or they may be incorporated in a single unit. One or both of the modules 710, 712 may be at least partially implemented by software executed by the processing unit 714.


The processing unit 714 may be implemented by special-purpose software (or firmware) run on one or more general-purpose or special-purpose computing devices. In this context, it is to be understood that each “element” or “means” of such a computing device refers to a conceptual equivalent of a method step; there is not always a one-to-one correspondence between elements/means and particular pieces of hardware or software routines. One piece of hardware sometimes comprises different means/elements. For example, a processing unit 714 may serve as one element/means when executing one instruction but serve as another element/means when executing another instruction. In addition, one element/means may be implemented by one instruction in some cases, but by a plurality of instructions in some other cases. Naturally, it is conceivable that one or more elements (means) are implemented entirely by analogue hardware components.


The processing unit 714 may include one or more processing units, e.g. a CPU (“Central Processing Unit”), a DSP (“Digital Signal Processor”), an ASIC (“Application-Specific Integrated Circuit”), discrete analogue and/or digital components, or some other programmable logical device, such as an FPGA (“Field Programmable Gate Array”). The processing unit 714 may further include a system memory and a system bus that couples various system components including the system memory to the processing unit. The system bus may be any of several types of bus structures including a memory bus or memory controller, a peripheral bus, and a local bus using any of a variety of bus architectures. The system memory may include computer storage media in the form of volatile and/or non-volatile memory such as read only memory (ROM), random access memory (RAM) and flash memory. The special-purpose software and associated control parameter values may be stored in the system memory, or on other removable/non-removable volatile/non-volatile computer storage media which is included in or accessible to the computing device, such as magnetic media, optical media, flash memory cards, digital tape, solid state RAM, solid state ROM, etc. The processing unit 714 may include one or more communication interfaces, such as a serial interface, a parallel interface, a USB interface, a wireless interface, a network adapter, etc, as well as one or more data acquisition devices, such as an A/D converter. The special-purpose software may be provided to the processing unit 714 on any suitable computer-readable medium, including a record medium, and a read-only memory.



FIGS. 1a and 1b show the videoconferencing terminal 100 operating optimally, where the remote user and the local user 106 can make eye contact. However, calibration of the videoconferencing terminal 100 and dynamic modification of the displayed image 500 may be required in order for the local user 106 to experience a good connected feel during a videoconference call.


Calibration of the videoconferencing terminal 100 will now be discussed with reference to FIGS. 2, 3, 4a, 4b, 4c and 9. FIG. 2 shows a schematic cross-sectional side view of a videoconferencing terminal. FIG. 3 shows a schematic perspective view of a videoconferencing terminal. FIGS. 4a, 4b, and 4c show a schematic view of a processing sequence for a captured camera image 400 on the videoconferencing terminal 100. FIG. 9 shows a flow diagram of the operation of a videoconferencing terminal.


During operation of the camera 102 and the display 104, the videoconferencing controller 700 can optionally interleave operation of the camera 102 and the display 104. In this way, the camera 102 and the display 104 operate sequentially so that the camera 102 captures an image of the local user 106 when the display 104 is off. Likewise, the camera 102 is not capturing an image when the display 104 is displaying an image. For example, the camera 102 can be turned off or the shutter is closed when not capturing an image of the local user 106. This means that the camera 102 takes an image when the display 104 is dark. As mentioned previously, in some embodiments the display is an OLED display. The OLED display has a low persistence, and this reduces pixel artifacts 300 which are received and captured by the camera 102 originating from the display 104.
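

A minimal sketch of such interleaving is given below, assuming hypothetical display and camera objects with show_frame, blank and capture methods; the description does not specify any particular API, frame rate or shutter control.

    import time

    def interleaved_capture(display, camera, period_s=1.0 / 120):
        # Alternate display refresh and camera exposure so that the camera only
        # sees the local user while the low-persistence display is dark.
        while True:
            display.show_frame()       # display emits light, camera idle
            time.sleep(period_s)
            display.blank()            # display dark
            frame = camera.capture()   # capture while the display emits no light
            time.sleep(period_s)
            yield frame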


However, the camera 102 may still receive light from pixel artifacts 300 from the display 104. This can be a function of the display image 500 being displayed on the display 104 as well as the properties of the display 104 itself. Turning to FIG. 2, the display 104 will be described in further detail. The display 104 comprises an LED matrix 200 of selectively operable pixels 202. For the purposes of clarity, only one pixel 202 has been labelled in FIG. 2. The LED matrix 200 can comprise any number of pixels 202 to achieve the required resolution for the videoconferencing call. An optically transmissive cover 204 such as a glass sheet, a transparent film or another clear medium is placed over the LED matrix 200. In some circumstances, one or more light rays B can be reflected back from the optically transmissive cover 204 towards the camera 102.


In some embodiments, the videoconferencing controller 700 is configured to determine one or more pixel artifacts 300 captured by the at least one camera 102 from the display 104 as shown in 900 of FIG. 9. Once the pixel artifacts 300 have been determined, the videoconferencing controller 700 is configured to compensate the captured camera image 400 to remove the determined one or more pixel artifacts 300. FIG. 3 shows a perspective schematic representation of the videoconferencing terminal 100. The display 104 is shown with exemplary pixel artifacts 300 and occlusion artifacts 302 on the display 104. FIG. 4a shows the captured camera image 400 including a local user captured image 406 of the local user 106 together with the pixel artifacts 300 and/or occlusion artifacts 302. Whilst the pixel artifacts 300 and occlusion artifacts 302 are represented by a series of vertical lines, the pixel artifacts 300 and occlusion artifacts 302 can have any distribution across the display 104.


In some embodiments, in order to compensate for the pixel artifacts 300 from the display 104 in the captured camera image 400, the contribution from each pixel 202 of the display 104 in the captured camera image 400 is determined as shown in step 900. Optionally, this is achieved with per-pixel information of the LED matrix 200 which maps the pixel output to the contribution as a pixel artifact map 402 in the captured camera image 400.


The pixel output is a function of the digital RGB (red green blue) colour output of the display image 500 and properties of the display 104. The videoconferencing controller 700 uses information relating to the displayed image 500 and the display 104 properties and determines each display pixel's contribution in the captured camera image 400. In this way, the videoconferencing controller 700 determines a pixel artifact map 402 as shown in FIG. 4b.


The videoconferencing controller 700 then subtracts the contribution of all display pixels 202 in the pixel artifact map 402 to obtain a compensated camera image 404 as shown in FIG. 4c and step 902 of FIG. 9. In other words, the videoconferencing controller 700 determines the compensated camera image 404 as it would have looked without any light contribution of pixel artifacts 300 from the pixels 202. The compensated camera image 404 comprises the local user captured image 406 as well.
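

As an illustration only, the subtraction of step 902 can be expressed as follows, assuming the per-pixel contribution mapping learned during calibration is available as a callable; the array layout and data types are assumptions and are not taken from the description.

    import numpy as np

    def compensate_pixel_artifacts(captured_image, displayed_rgb, contribution):
        # captured_image: camera image 400 containing the local user plus artifacts
        # displayed_rgb:  digital RGB image 500 currently sent to the display
        # contribution:   callable returning the pixel artifact map 402, i.e. the
        #                 predicted contribution of every display pixel in camera
        #                 image coordinates
        artifact_map = contribution(displayed_rgb)
        compensated = captured_image.astype(np.float32) - artifact_map
        # Clip to the valid range to obtain the compensated camera image 404
        return np.clip(compensated, 0, 255).astype(np.uint8)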


The videoconferencing controller 700 receives information relating to the digital RGB colours of the display image 500 sent to the display 104. This means that the information relating to the digital RGB colours is directly available to the videoconferencing controller 700 for carrying out the compensation algorithm as shown in FIG. 9.


In some embodiments, the videoconferencing controller 700 optionally determines the display 104 properties in a calibration step. In the calibration step the videoconferencing controller 700 selectively controls the LED matrix 200 to light up each pixel 202 individually, at different illumination levels, to learn the mapping from digital RGB colour output to contribution in the captured camera image 400.
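

A sketch of this calibration loop is shown below, assuming hypothetical display and camera control methods (resolution, light_single_pixel, blank, capture); in practice the sweep may be batched or sub-sampled, which the description does not address.

    import numpy as np

    def calibrate_pixel_contributions(display, camera, levels=(64, 128, 255)):
        # Light each display pixel individually at several illumination levels and
        # record its contribution in the captured camera image, building the
        # mapping from digital RGB output to pixel artifact contribution.
        width, height = display.resolution()
        contributions = {}
        for y in range(height):
            for x in range(width):
                samples = []
                for level in levels:
                    display.light_single_pixel(x, y, level)  # all other pixels dark
                    samples.append(camera.capture().astype(np.float32))
                contributions[(x, y)] = np.stack(samples)
        display.blank()
        return contributions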


After the display pixel artifacts 300 have been removed, in some circumstances the captured camera image 400 may still have occlusion artifacts 302 from elements of the display 104. The occlusion artifacts 302 arise from one or more elements of the display 104 in front of the camera 102 which block light from the local user 106. The occlusion artifacts 302 can be described as having an occlusion factor between 0.0 and 1.0, wherein 0.0 indicates total occlusion and 1.0 indicates no occlusion.


In some embodiments, the videoconferencing controller 700 determines the occlusion factors of the occlusion artifacts 302 in a calibration step, when the camera 102 is directed at a uniform (e.g., all white) and evenly illuminated target. This means that the camera image pixel levels are uniform if no occlusion artifacts 302 are present.



FIG. 4b also represents the occlusion artifact map 408 of occlusion artifacts 302 determined from the occluded image after the calibration step. As mentioned above, in the calibration step the camera 102 is looking at a smooth white surface. The videoconferencing controller 700 determines the maximum pixel level of a particular pixel 202 in the LED matrix 200. For each other pixel in the LED matrix 200, the videoconferencing controller 700 divides its pixel value by the maximum pixel value to get the occlusion factor for each particular pixel 202.
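

A minimal sketch of this step is given below, assuming the calibration image is available as a numpy array; the brightest pixel is taken as the unoccluded reference, as described above.

    import numpy as np

    def occlusion_factor_map(calibration_image):
        # calibration_image: camera image of the uniform, evenly illuminated
        # white target captured during calibration.
        img = calibration_image.astype(np.float32)
        max_value = img.max()               # assumed to be an unoccluded pixel
        # Each pixel value divided by the maximum gives an occlusion factor
        # between 0.0 (total occlusion) and 1.0 (no occlusion).
        return img / max_value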


In this way, the videoconferencing controller 700 sets a notional “correct” level to be that of the maximum pixel. The videoconferencing controller 700 implicitly assumes that the maximum pixel is unoccluded. If this is not the case, the effect is a uniformly darker image, but this is not an effect that is apparent to the local user 106, and it is not experienced as a significant artifact. Accordingly, the videoconferencing controller 700 determines one or more occlusion artifacts 302 as shown in step 904 of FIG. 9.


In a similar way, it may be the case that the target and illumination properties during calibration are such that the ideal, unoccluded, image is not uniform, but has slight variations. Typically, such variations are of low spatial frequency, and will cause low frequency artifacts in the compensated results that are either not noticeable at all to the user or not experienced as significant artifacts to the local user 106.


The videoconferencing controller 700 assumes that occlusions are not severe enough to completely block a camera pixel (not shown) (i.e. occlusion factor 0.0), but only occlude part of the incoming light for each camera pixel. In some embodiments, at least some of the occluding display elements are out of focus. In some embodiments, the optics of the camera 102 are designed to keep occluding display elements out of focus.


The occlusion factor is the value by which the “correct”, unoccluded pixel value is multiplied: 0.0 gives total occlusion and 1.0 gives no occlusion. In this way, by having information relating to the occlusion factor for each pixel 202, the videoconferencing controller 700 can determine the compensated camera image 404 according to step 906 in FIG. 9 by dividing each pixel value by its occlusion factor, obtaining an unoccluded and compensated camera image 404 as shown in FIG. 4c.
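

Step 906 can be sketched as a per-pixel division, for example as below; the small epsilon guarding against division by near-zero factors is an illustrative choice and is not specified in the description.

    import numpy as np

    def compensate_occlusion(captured_image, occlusion_factors, eps=1e-3):
        # Divide each camera pixel by its occlusion factor (1.0 = unoccluded)
        # to recover the unoccluded and compensated camera image 404.
        factors = np.maximum(occlusion_factors, eps)
        compensated = captured_image.astype(np.float32) / factors
        return np.clip(compensated, 0, 255).astype(np.uint8)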


Optionally the steps 900, 902 relating to the compensation of the pixel artifacts 300 and steps 904, 906 relating to the compensation of the occlusion artifacts 302 can be carried out in a different order than shown in FIG. 9. Furthermore, optionally one, some or all of the steps 900, 902 relating to the compensation of the pixel artifacts 300 and steps 904, 906 relating to the compensation of the occlusion artifacts 302 can be omitted. For example, compensation for pixel artifacts 300 can be omitted. Likewise, additionally or alternatively, compensation for occlusion artifacts 302 can be omitted.


Steps 900, 902, 904, 906 are dependent on the position of the camera 102 with respect to the display 104. Accordingly, the compensation of the pixel artifacts 300 and the compensation for the occlusion artifacts 302 are based on the relative position of the camera 102 with respect to the display 104. This means that if the camera 102 moves with respect to the display 104, one or more of the steps as shown in FIG. 9 are repeated to recalibrate the videoconferencing terminal 100. In this way, the videoconferencing controller 700 modifies an image based on the camera position of the at least one camera 102 with respect to the display 104.


Another embodiment will now be described in reference to FIGS. 5, 6 and 8. FIGS. 5 and 6 show a schematic perspective view of a videoconferencing terminal 100 and FIG. 8 shows a flow diagram of the operation of a videoconferencing terminal. Optionally, the method steps discussed with respect to FIG. 9 can be used together with the method steps in FIG. 8, but this is not necessary.


Turning to FIG. 5, again the axis A-A of the camera 102 is in alignment with the eyes 108 of the local user 106. In FIG. 5 the eyes 108 of the local user 106 are aligned with eyes 502 of the displayed image 500 of the remote user. Accordingly, the local user 106 and the remote user are able to make direct eye contact.


As can be seen from FIG. 5, if the local user 106 moves with respect to the display 104, the local user 106 is no longer aligned with the axis A-A of the camera 102. FIG. 5 shows one possible new position of the local user 106 represented by a dotted outline. In the new position, the local user's 106 line of sight B-B is still focused on the eyes 502 of the displayed image 500 of the remote user. However, the local user 106 is no longer looking directly at the camera 102 due to the parallax error introduced by the local user 106 moving with respect to the camera 102. This means that in the captured camera image 400 the local user 106 will not be looking directly at the camera 102.


However, FIG. 6 shows the local user 106 in the new position shown in FIG. 5. Here the position of the local user 106 is offset by a distance D1 from the axis A-A of the camera 102. This means that the eyes 108 of the local user 106 have moved from the axis A-A by a distance D1. Specifically, as shown in FIG. 6, the local user 106 is lower than the axis A-A. However, in other embodiments the local user 106 can be offset from the axis A-A of the camera 102 in any direction. For example, the local user 106 may have moved sideways with respect to the axis A-A or may be standing and the eyes 108 of the local user are above the axis A-A.


The videoconferencing controller 700 sends the image 500 of the remote user to be displayed to the face detection module 710. The face detection module 710 determines the position of the eyes 502 of the displayed image 500 of the remote user as shown in step 800 in FIG. 8. The face detection module 710 uses feature detection on the image 500 of the remote user to detect where the eyes 502 of the displayed image 500 of the remote user are located. The face detection module 710 then sends position information of the eyes 502 of the displayed image 500 of the remote user to the videoconferencing controller 700.
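

The description does not name a particular feature detector; as one hedged example, the eye positions could be located with OpenCV's bundled Haar cascades along the following lines.

    import cv2

    def detect_eye_centres(image_bgr):
        # Detect eye regions and return their centre points in image coordinates.
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        eye_cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_eye.xml")
        eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return [(x + w // 2, y + h // 2) for (x, y, w, h) in eyes]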


Then the videoconferencing controller 700 determines the position of the camera 102 with respect to the display 104. If the camera 102 is fixed with respect to the display 104, the videoconferencing controller 700 can store the position of the camera 102 and the axis of the camera 102 in memory.


Alternatively, the videoconferencing controller 700 can determine the relative position of the camera 102 with respect to the display 104 based on movement information of the camera 102. For example, the videoconferencing controller 700 determines the position of the camera 102 from servo information on a mechanism for moving the camera 102. Alternatively, the videoconferencing controller 700 determines the position of the camera 102 based on reference points in the captured camera image 400. For example, a reference point could be a QR code fixed to a wall behind the local user 106. In this way, the videoconferencing controller 700 determines the position and orientation of the camera 102 and the axis A-A of the camera 102 as shown in step 802 of FIG. 8.


Then the videoconferencing controller 700 sends a captured camera image 400 of the local user 106 to the face detection module 710. The face detection module 710 determines the position of the eyes 108 of the local user in the image 400 as shown in step 804 in FIG. 8. The face detection module 710 uses feature detection on the image 400 of the local user 106 to detect where the eyes 108 are in the image 400. This is similar to the step 800 in FIG. 8 for determining the position of the eyes 502 of the displayed image 500 of the remote user.


The videoconferencing controller 700 then determines a position of the eyes 108 of the local user 106 with respect to the display 104. Based on the determined position of the camera 102, the videoconferencing controller 700 determines an offset D1 between the position of the eyes 108 of the local user 106 and the axis A-A of the at least one camera 102. In this way, the videoconferencing controller 700 determines how much the local user 106 has moved from the axis A-A of the camera 102. This means that the videoconferencing controller 700 determines a new axis A′-A′ of the camera 102 based on a light ray from the new position of the local user 106 to the position of the camera 102. Accordingly, A′-A′ is the new eye-contact axis.
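

Purely for illustration, the offset D1 and the new eye-contact axis A′-A′ could be computed as follows, assuming 3D positions of the eyes and the camera expressed relative to the display plane; the coordinate conventions are assumptions and are not taken from the description.

    import numpy as np

    def eye_offset_and_new_axis(eye_position, camera_position):
        # eye_position:    (x, y, z) position of the local user's eyes 108
        # camera_position: (x, y, z) position of the camera 102 behind the display
        # The original axis A-A is assumed perpendicular to the display plane,
        # so the offset D1 is the in-plane (x, y) displacement of the eyes.
        eye = np.asarray(eye_position, dtype=float)
        cam = np.asarray(camera_position, dtype=float)
        d1 = eye[:2] - cam[:2]
        # The new eye-contact axis A'-A' is the ray from the camera to the eyes.
        direction = (eye - cam) / np.linalg.norm(eye - cam)
        return d1, direction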


The videoconferencing controller 700 determines a position of the eyes 502 of the displayed image 500 of the remote user with respect to the display 104. That is, the videoconferencing controller 700 determines where the image 500 would be positioned on the display 104 with no modification to the image 500.


The videoconferencing controller 700 then determines whether the position of the eyes 502 of the displayed image 500 of the remote user is offset by a distance D2 from the new axis A′-A′ based on the new position of the local user 106. If the videoconferencing controller 700 determines that the displayed image 500 is offset by more than a predetermined threshold, the videoconferencing controller 700 sends an instruction to the image processing module 712 to modify the image 500 as shown in step 806 in FIG. 8. In FIG. 6, the eyes 502 of the displayed image 500 of the remote user are translated downwards by a distance of D2 to intersect the new axis A′-A′.


In some embodiments, the videoconferencing controller 700 instructs the image processing module 712 to modify the image 500 when the new position of the local user 106 requires the local user 106 to adjust their line of sight through an arc having an angle greater than 10 degrees. In some embodiments, the image processing module 712 modifies the image 500 when the local user 106 would have to adjust their line of sight through an arc having an angle greater than 10 degrees in a horizontal and/or vertical direction from the axis A-A. In this way, if the local user 106 would be required to move their head or their eyes 108 to maintain eye contact with the eyes 502 of the displayed image 500 of the remote user, the videoconferencing controller 700 modifies the image 500 and returns a modified image 600. This means that there is no parallax error that prevents direct eye contact between the local user 106 and the remote user, because the videoconferencing controller 700 modifies the image based on the position of the camera 102 and the local user 106 with respect to the displayed image 500.
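

One possible sketch of the threshold test and translation is given below; the 10 degree threshold follows the description, while the viewing distance, pixel pitch and the simple integer-pixel shift are assumptions made only for illustration.

    import numpy as np

    def maybe_translate_remote_image(image, eye_xy, target_xy,
                                     viewing_distance_mm, mm_per_pixel,
                                     threshold_deg=10.0):
        # eye_xy:    where the remote user's eyes 502 would appear unmodified
        # target_xy: where the new axis A'-A' intersects the display (pixels)
        d2_px = np.asarray(target_xy, float) - np.asarray(eye_xy, float)
        offset_mm = np.linalg.norm(d2_px) * mm_per_pixel
        angle_deg = np.degrees(np.arctan2(offset_mm, viewing_distance_mm))
        if angle_deg <= threshold_deg:
            return image                      # offset below threshold: no change
        # Translate the whole image by D2 so the eyes intersect the new axis,
        # filling uncovered areas with black.
        dx, dy = int(round(d2_px[0])), int(round(d2_px[1]))
        h, w = image.shape[:2]
        shifted = np.zeros_like(image)
        src_y = slice(max(0, -dy), min(h, h - dy))
        src_x = slice(max(0, -dx), min(w, w - dx))
        dst_y = slice(max(0, dy), min(h, h + dy))
        dst_x = slice(max(0, dx), min(w, w + dx))
        shifted[dst_y, dst_x] = image[src_y, src_x]
        return shifted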


In some embodiments, the videoconferencing controller 700 sends an instruction that a co-ordinate corresponding to the centre of the eyes 502 of the displayed image 500 of the remote user is translated to a new position. The image processing module 712 returns a modified image 600 to the videoconferencing controller 700. The modified image 600 of the remote user is shown in FIG. 6.


In this way, the image processing module 712 modifies the image 500 such that the eyes 502 of the displayed image 500 of the remote user are moved to intersect with the new axis A′-A′. In the new position, the local user's 106 line of sight B-B is focused on the eyes 502 of the displayed image 500 of the remote user and is aligned with the new axis A′-A′. In some embodiments, the image processing module 712 modifies the image 500 by translating, scaling, transforming, or applying any other suitable image modification to move the position of the eyes 502 of the displayed image 500 of the remote user.


In this way, the videoconferencing controller 700 modifies an image based on the camera position of the at least one camera 102 with respect to the display 104 and on the user position of the local user 106 with respect to the display 104.


As mentioned above, in some embodiments, there is only one videoconferencing terminal 100 with a videoconferencing controller 700 and the image processing module 712 as discussed with reference to the previous embodiments. In these embodiments, the videoconferencing controller 700 performs the image processing as discussed with reference to the embodiments shown in the Figures, e.g. FIGS. 8 and 9, for both the local videoconferencing terminal 100 and the remote videoconferencing terminal. This means that the advantages of the invention can be achieved for both sides of the videoconference with only one videoconferencing terminal 100, e.g. the local videoconferencing terminal 100, according to the present invention.


When the local videoconferencing terminal 100 is modifying the image for both the local and the remote videoconferencing terminals 100, the videoconferencing controller 700 performs the methods described with reference to the Figures for both the local and the remote videoconferencing terminals. The local videoconferencing controller 700 then sends instructions for modifying the displayed image to the remote videoconferencing terminal. For example, translation coordinates for modifying the displayed image on the remote videoconferencing terminal are sent by the local videoconferencing controller 700 to the remote videoconferencing terminal 100.


In another embodiment two or more embodiments are combined. Features of one embodiment can be combined with features of other embodiments.


Embodiments of the present invention have been discussed with particular reference to the examples illustrated. However, it will be appreciated that variations and modifications may be made to the examples described within the scope of the invention.

Claims
  • 1. A method of videoconferencing comprising: displaying an image of a remote user on a display, wherein the display comprises an LED matrix; capturing an image of a local user at a user position in front of the display, with at least one camera being located at a camera position behind the display; calibrating the image of the local user by capturing a camera image including a local user captured image of the local user together with pixel artifacts and occlusion artifacts by: determining one or more pixel artifacts captured by the at least one camera by receiving light from pixels of the display, compensating the captured camera image to remove the determined one or more pixel artifacts, determining one or more occlusion artifacts from one or more display elements, and compensating the captured camera image to remove the one or more occlusion artifacts, wherein the compensated captured camera image comprises the local user captured image; and modifying the image of the remote user to be displayed based on the camera position of the at least one camera with respect to the display and based on the user position of the local user with respect to the display.
  • 2. A method according to claim 1 wherein the method comprises determining a position of an eye of the local user with respect to the display.
  • 3. A method according to claim 2 wherein the method comprises determining an axis of the at least one camera based on the position of the eyes of the local user.
  • 4. A method according to claim 1 wherein the method comprises determining a position of an eye of the remote user with respect to the display.
  • 5. A method according to claim 4 wherein the method comprises determining an offset between the axis of the camera and an eye of the remote user in a displayed image.
  • 6. A method according to claim 5 wherein the modifying comprises translating the image of the remote user to be displayed such that the displayed eyes of the remote user intersect with the axis.
  • 7. A method according to claim 1 wherein occluding display elements are out of focus.
  • 8. A method according to claim 1 wherein the user position of the user and/or the camera position of the at least one camera is moveable with respect to the display.
  • 9. A method according to claim 1 wherein the at least one camera is one or more of the following: an RGB camera or an infrared camera.
  • 10. A method according to claim 1 wherein the display is transmissive to electromagnetic radiation.
  • 11. A method according to claim 1 wherein one or more of the steps is carried out during calibration and/or one or more of the steps is carried out during operation.
  • 12. The method according to claim 1, wherein the display is an OLED display.
  • 13. The method according to claim 1, further comprising determining a maximum pixel level of a particular pixel in the LED matrix, and, for each other pixel in the LED matrix, dividing its pixel value by the maximum pixel value to get an occlusion factor for each particular pixel.
  • 14. A videoconferencing terminal comprising: a display for displaying an image of a remote user, wherein the display comprises an LED matrix; at least one camera for capturing an image of a local user at a user position in front of the display, the camera being located at a camera position behind the display; and a controller configured to modify the image of the remote user to be displayed based on the camera position of the at least one camera with respect to the display and based on the user position of the local user with respect to the display, wherein the controller is further configured to calibrate the image of the local user by capturing a camera image including a local user captured image of the local user together with pixel artifacts and/or occlusion artifacts, wherein the calibration comprises: determining one or more pixel artifacts captured by the at least one camera by receiving light from pixels of the display, compensating the captured camera image to remove the determined one or more pixel artifacts, determining one or more occlusion artifacts from one or more display elements, and compensating the captured camera image to remove the one or more occlusion artifacts, wherein the compensated captured camera image comprises the local user captured image.
  • 15. A method of videoconferencing comprising: displaying an image of a remote user on a display, wherein the display comprises an LED matrix; capturing an image of a local user at a user position in front of the display, with at least one camera being located at a camera position behind the display; calibrating the image of the local user by capturing a camera image including a local user captured image of the local user together with pixel artifacts and/or occlusion artifacts by: determining one or more pixel artifacts captured by the at least one camera by receiving light from pixels of the display, and compensating the captured camera image to remove the determined one or more pixel artifacts; and modifying the image of the remote user to be displayed based on the camera position of the at least one camera with respect to the display.
Priority Claims (1)
Number Date Country Kind
1930022-7 Jan 2019 SE national
PCT Information
Filing Document Filing Date Country Kind
PCT/SE2020/050043 1/17/2020 WO
Publishing Document Publishing Date Country Kind
WO2020/153890 7/30/2020 WO A
US Referenced Citations (760)
Number Name Date Kind
3375053 Ward Mar 1968 A
3440426 Bush Apr 1969 A
3478220 Milroy Nov 1969 A
3553680 Cooreman Jan 1971 A
3673327 Johnson et al. Jun 1972 A
4129384 Walker et al. Dec 1978 A
4180702 Sick et al. Dec 1979 A
4209255 Heynau et al. Jun 1980 A
4213707 Evans, Jr. Jul 1980 A
4254333 Bergström Mar 1981 A
4254407 Tipon Mar 1981 A
4294543 Apple et al. Oct 1981 A
4346376 Mallos Aug 1982 A
4420261 Barlow et al. Dec 1983 A
4484179 Kasday Nov 1984 A
4507557 Tsikos Mar 1985 A
4521112 Kuwabara et al. Jun 1985 A
4542375 Alles et al. Sep 1985 A
4550250 Mueller et al. Oct 1985 A
4593191 Alles Jun 1986 A
4673918 Adler et al. Jun 1987 A
4688933 Lapeyre Aug 1987 A
4688993 Ferris et al. Aug 1987 A
4692809 Beining et al. Sep 1987 A
4710760 Kasday Dec 1987 A
4736191 Matzke et al. Apr 1988 A
4737626 Hasegawa Apr 1988 A
4746770 McAvinney May 1988 A
4751379 Sasaki et al. Jun 1988 A
4752655 Tajiri et al. Jun 1988 A
4766424 Adler et al. Aug 1988 A
4772763 Garwin et al. Sep 1988 A
4782328 Denlinger Nov 1988 A
4812833 Shimauchi Mar 1989 A
4837430 Hasegawa Jun 1989 A
4868550 Hiroaki et al. Sep 1989 A
4868912 Doering Sep 1989 A
4891829 Deckman et al. Jan 1990 A
4916308 Meadows Apr 1990 A
4916712 Bender Apr 1990 A
4933544 Tamaru Jun 1990 A
4949079 Loebner Aug 1990 A
4986662 Bures Jan 1991 A
4988983 Wehrer Jan 1991 A
5065185 Powers et al. Nov 1991 A
5073770 Lowbner Dec 1991 A
5105186 May Apr 1992 A
5159322 Loebner Oct 1992 A
5166668 Aoyagi Nov 1992 A
5227622 Suzuki Jul 1993 A
5248856 Mallicoat Sep 1993 A
5254407 Sergerie et al. Oct 1993 A
5345490 Finnigan et al. Sep 1994 A
5383022 Kaser Jan 1995 A
5414413 Tamaru et al. May 1995 A
5434373 Komaki Jul 1995 A
5483261 Yasutake Jan 1996 A
5484966 Segen Jan 1996 A
5499098 Ogawa Mar 1996 A
5502568 Ogawa et al. Mar 1996 A
5515083 Casebolt et al. May 1996 A
5525764 Junkins et al. Jun 1996 A
5526422 Keen Jun 1996 A
5539514 Shishido et al. Jul 1996 A
5570181 Yasuo et al. Oct 1996 A
5572251 Ogawa Nov 1996 A
5577501 Flohr et al. Nov 1996 A
5600105 Fukuzaki et al. Feb 1997 A
5608550 Epstein et al. Mar 1997 A
5672852 Fukuzaki et al. Sep 1997 A
5679930 Katsurahira Oct 1997 A
5686942 Ball Nov 1997 A
5688933 Evans et al. Nov 1997 A
5729249 Yasutake Mar 1998 A
5729250 Bishop et al. Mar 1998 A
5736686 Perret, Jr. et al. Apr 1998 A
5740224 Müller et al. Apr 1998 A
5764223 Chang et al. Jun 1998 A
5767517 Hawkins Jun 1998 A
5775792 Wiese Jul 1998 A
5945980 Moissev et al. Aug 1999 A
5945981 Paull et al. Aug 1999 A
5959617 Bird et al. Sep 1999 A
6031524 Kunert Feb 2000 A
6061177 Fujimoto May 2000 A
6067079 Shieh May 2000 A
6122394 Neukermans et al. Sep 2000 A
6141104 Schulz et al. Oct 2000 A
6172667 Sayag Jan 2001 B1
6175999 Sloan et al. Jan 2001 B1
6227667 Halldorsson et al. May 2001 B1
6229529 Yano et al. May 2001 B1
6333735 Anvekar Dec 2001 B1
6366276 Kunimatsu et al. Apr 2002 B1
6380732 Gilboa Apr 2002 B1
6380740 Laub Apr 2002 B1
6390370 Plesko May 2002 B1
6429857 Masters et al. Aug 2002 B1
6452996 Hsieh Sep 2002 B1
6476797 Kurihara et al. Nov 2002 B1
6492633 Nakazawa et al. Dec 2002 B2
6495832 Kirby Dec 2002 B1
6504143 Koops et al. Jan 2003 B2
6529327 Graindorge Mar 2003 B1
6538644 Muraoka Mar 2003 B1
6587099 Takekawa Jul 2003 B2
6648485 Colgan et al. Nov 2003 B1
6660964 Benderly Dec 2003 B1
6664498 Forsman et al. Dec 2003 B2
6664952 Iwamoto et al. Dec 2003 B2
6677934 Blanchard Jan 2004 B1
6690363 Newton Feb 2004 B2
6707027 Liess et al. Mar 2004 B2
6738051 Boyd et al. May 2004 B2
6748098 Rosenfeld Jun 2004 B1
6784948 Kawashima et al. Aug 2004 B2
6799141 Stoustrup et al. Sep 2004 B1
6806871 Yasue Oct 2004 B1
6927384 Reime et al. Aug 2005 B2
6940286 Wang et al. Sep 2005 B2
6965836 Richardson Nov 2005 B2
6972401 Akitt et al. Dec 2005 B2
6972753 Kimura et al. Dec 2005 B1
6985137 Kaikuranta Jan 2006 B2
7042444 Cok May 2006 B2
7084859 Pryor Aug 2006 B1
7087907 Lalovic et al. Aug 2006 B1
7117157 Taylor et al. Oct 2006 B1
7133031 Wang et al. Nov 2006 B2
7176904 Satoh Feb 2007 B2
7199932 Sugiura Apr 2007 B2
7359041 Xie et al. Apr 2008 B2
7397418 Doerry et al. Jul 2008 B1
7432893 Ma et al. Oct 2008 B2
7435940 Eliasson et al. Oct 2008 B2
7436443 Hirunuma et al. Oct 2008 B2
7442914 Eliasson et al. Oct 2008 B2
7465914 Eliasson et al. Dec 2008 B2
7528898 Hashimoto May 2009 B2
7613375 Shimizu Nov 2009 B2
7629968 Miller et al. Dec 2009 B2
7646833 He et al. Jan 2010 B1
7653883 Hotelling et al. Jan 2010 B2
7655901 Idzik et al. Feb 2010 B2
7705835 Eikman Apr 2010 B2
7729056 Hwang et al. Jun 2010 B2
7847789 Kolmykov-Zotov et al. Dec 2010 B2
7855716 McCreary et al. Dec 2010 B2
7859519 Tulbert Dec 2010 B2
7924272 Boer et al. Apr 2011 B2
7932899 Newton et al. Apr 2011 B2
7969410 Kakarala Jun 2011 B2
7995039 Eliasson et al. Aug 2011 B2
8013845 Ostergaard et al. Sep 2011 B2
8031186 Ostergaard Oct 2011 B2
8077147 Krah et al. Dec 2011 B2
8093545 Leong et al. Jan 2012 B2
8094136 Eliasson et al. Jan 2012 B2
8094910 Xu Jan 2012 B2
8149211 Hayakawa et al. Apr 2012 B2
8149221 Newton Apr 2012 B2
8184108 Smits May 2012 B2
8218154 Østergaard et al. Jul 2012 B2
8274495 Lee Sep 2012 B2
8314773 Low et al. Nov 2012 B2
8319729 Choi et al. Nov 2012 B2
8325158 Yatsuda et al. Dec 2012 B2
8339379 Goertz et al. Dec 2012 B2
8350827 Chung et al. Jan 2013 B2
8384010 Hong et al. Feb 2013 B2
8384693 Newton Feb 2013 B2
8407606 Davidson et al. Mar 2013 B1
8441467 Han May 2013 B2
8445834 Hong et al. May 2013 B2
8466901 Yen et al. Jun 2013 B2
8482547 Cobon et al. Jul 2013 B2
8542217 Wassvik et al. Sep 2013 B2
8567257 Van Steenberge et al. Oct 2013 B2
8581884 Fåhraeus et al. Nov 2013 B2
8624858 Fyke et al. Jan 2014 B2
8686974 Christiansson et al. Apr 2014 B2
8692807 Føhraeus et al. Apr 2014 B2
8716614 Wassvik May 2014 B2
8727581 Saccomanno May 2014 B2
8745514 Davidson Jun 2014 B1
8780066 Christiansson et al. Jul 2014 B2
8830181 Clark et al. Sep 2014 B1
8860696 Wassvik et al. Oct 2014 B2
8872098 Bergström et al. Oct 2014 B2
8872801 Bergström et al. Oct 2014 B2
8884900 Wassvik Nov 2014 B2
8890843 Wassvik et al. Nov 2014 B2
8890849 Christiansson et al. Nov 2014 B2
8928590 El Dokor Jan 2015 B1
8963886 Wassvik Feb 2015 B2
8982084 Christiansson et al. Mar 2015 B2
9001086 Saini Apr 2015 B1
9024896 Chen May 2015 B2
9024916 Christiansson May 2015 B2
9035909 Christiansson May 2015 B2
9063614 Petterson et al. Jun 2015 B2
9063617 Eliasson et al. Jun 2015 B2
9086763 Johansson et al. Jul 2015 B2
9134854 Wassvik et al. Sep 2015 B2
9158401 Christiansson Oct 2015 B2
9158415 Song et al. Oct 2015 B2
9201520 Benko et al. Dec 2015 B2
9207800 Eriksson et al. Dec 2015 B1
9213445 King et al. Dec 2015 B2
9274645 Christiansson et al. Mar 2016 B2
9280237 Kukulj Mar 2016 B2
9291845 Shin et al. Mar 2016 B2
9317146 Hufnagel Apr 2016 B1
9317168 Christiansson et al. Apr 2016 B2
9323396 Han et al. Apr 2016 B2
9366565 Uvnäs Jun 2016 B2
9366802 Lee et al. Jun 2016 B2
9377884 Christiansson et al. Jun 2016 B2
9389732 Craven-Bartle Jul 2016 B2
9411444 Christiansson et al. Aug 2016 B2
9411464 Wallander et al. Aug 2016 B2
9430079 Christiansson et al. Aug 2016 B2
9442574 Fåhraeus et al. Sep 2016 B2
9547393 Christiansson et al. Jan 2017 B2
9552103 Craven-Bartle et al. Jan 2017 B2
9557846 Baharav et al. Jan 2017 B2
9588619 Christiansson et al. Mar 2017 B2
9594467 Christiansson et al. Mar 2017 B2
9618682 Yoon et al. Apr 2017 B2
9626018 Christiansson et al. Apr 2017 B2
9626040 Wallander et al. Apr 2017 B2
9639210 Wallander et al. May 2017 B2
9645679 Eriksson et al. May 2017 B2
9678602 Wallander Jun 2017 B2
9684414 Christiansson et al. Jun 2017 B2
9710101 Christiansson et al. Jul 2017 B2
9874978 Wall Jan 2018 B2
9983626 Cao et al. May 2018 B2
10013107 Christiansson et al. Jul 2018 B2
10019113 Christiansson et al. Jul 2018 B2
10168835 Wallander et al. Jan 2019 B2
10282035 Kocovksi et al. May 2019 B2
10437389 Skagmo et al. Oct 2019 B2
10579227 Bura et al. Mar 2020 B1
10606416 Skagmo et al. Mar 2020 B2
10649585 van Beek et al. May 2020 B1
10775937 Christiansson et al. Sep 2020 B2
10884275 Yang et al. Jan 2021 B2
11256371 Bergstrom et al. Feb 2022 B2
11567610 Bergstrom et al. Jan 2023 B2
20010002694 Nakazawa et al. Jun 2001 A1
20010005004 Shiratsuki et al. Jun 2001 A1
20010005308 Oishi et al. Jun 2001 A1
20010030642 Sullivan et al. Oct 2001 A1
20010055411 Black Dec 2001 A1
20020067348 Masters et al. Jun 2002 A1
20020075243 Newton Jun 2002 A1
20020118177 Newton Aug 2002 A1
20020158823 Zavracky et al. Oct 2002 A1
20020158853 Sugawara et al. Oct 2002 A1
20020163505 Takekawa Nov 2002 A1
20030016450 Bluemel et al. Jan 2003 A1
20030034439 Reime et al. Feb 2003 A1
20030034935 Amanai et al. Feb 2003 A1
20030048257 Mattila Mar 2003 A1
20030052257 Sumriddetchkajorn Mar 2003 A1
20030095399 Grenda et al. May 2003 A1
20030107748 Lee Jun 2003 A1
20030137494 Tulbert Jul 2003 A1
20030156100 Gettemy Aug 2003 A1
20030160155 Liess Aug 2003 A1
20030210537 Engelmann Nov 2003 A1
20030214486 Roberts Nov 2003 A1
20040027339 Schulz Feb 2004 A1
20040032401 Nakazawa et al. Feb 2004 A1
20040090432 Takahashi et al. May 2004 A1
20040130338 Wang et al. Jul 2004 A1
20040174541 Freifeld Sep 2004 A1
20040201579 Graham Oct 2004 A1
20040212603 Cok Oct 2004 A1
20040238627 Silverbrook et al. Dec 2004 A1
20040239702 Kang et al. Dec 2004 A1
20040245438 Payne et al. Dec 2004 A1
20040252091 Ma et al. Dec 2004 A1
20040252867 Lan et al. Dec 2004 A1
20050012714 Russo et al. Jan 2005 A1
20050041013 Tanaka Feb 2005 A1
20050057903 Choi Mar 2005 A1
20050073508 Pittel et al. Apr 2005 A1
20050083293 Dixon Apr 2005 A1
20050128190 Ryynanen Jun 2005 A1
20050143923 Keers et al. Jun 2005 A1
20050156914 Lipman et al. Jul 2005 A1
20050162398 Eliasson et al. Jul 2005 A1
20050179977 Chui et al. Aug 2005 A1
20050200613 Kobayashi et al. Sep 2005 A1
20050212774 Ho et al. Sep 2005 A1
20050248540 Newton Nov 2005 A1
20050253834 Sakamaki et al. Nov 2005 A1
20050276053 Nortrup et al. Dec 2005 A1
20060001650 Robbins et al. Jan 2006 A1
20060001653 Smits Jan 2006 A1
20060007185 Kobayashi Jan 2006 A1
20060008164 Wu et al. Jan 2006 A1
20060017706 Cutherell et al. Jan 2006 A1
20060017709 Okano Jan 2006 A1
20060033725 Marggraff et al. Feb 2006 A1
20060038698 Chen Feb 2006 A1
20060061861 Munro et al. Mar 2006 A1
20060114237 Crockett et al. Jun 2006 A1
20060132454 Chen et al. Jun 2006 A1
20060139340 Geaghan Jun 2006 A1
20060158437 Blythe et al. Jul 2006 A1
20060161871 Hotelling et al. Jul 2006 A1
20060170658 Nakamura et al. Aug 2006 A1
20060202974 Thielman Sep 2006 A1
20060227120 Eikman Oct 2006 A1
20060255248 Eliasson Nov 2006 A1
20060256092 Lee Nov 2006 A1
20060279558 Van Delden et al. Dec 2006 A1
20060281543 Sutton et al. Dec 2006 A1
20060290684 Giraldo et al. Dec 2006 A1
20070014486 Schiwietz et al. Jan 2007 A1
20070024598 Miller et al. Feb 2007 A1
20070034783 Eliasson et al. Feb 2007 A1
20070038691 Candes et al. Feb 2007 A1
20070052684 Gruhlke et al. Mar 2007 A1
20070070056 Sato et al. Mar 2007 A1
20070075648 Blythe et al. Apr 2007 A1
20070120833 Yamaguchi et al. May 2007 A1
20070125937 Eliasson et al. Jun 2007 A1
20070152985 Ostergaard et al. Jul 2007 A1
20070201042 Eliasson et al. Aug 2007 A1
20070296688 Nakamura et al. Dec 2007 A1
20080006766 Oon et al. Jan 2008 A1
20080007540 Ostergaard Jan 2008 A1
20080007541 Eliasson et al. Jan 2008 A1
20080007542 Eliasson et al. Jan 2008 A1
20080011944 Chua et al. Jan 2008 A1
20080029691 Han Feb 2008 A1
20080036743 Westerman et al. Feb 2008 A1
20080062150 Lee Mar 2008 A1
20080068691 Miyatake Mar 2008 A1
20080074401 Chung et al. Mar 2008 A1
20080080811 Deane Apr 2008 A1
20080088603 Eliasson et al. Apr 2008 A1
20080121442 Boer et al. May 2008 A1
20080122792 Izadi et al. May 2008 A1
20080122803 Izadi et al. May 2008 A1
20080130979 Run et al. Jun 2008 A1
20080133265 Silkaitis et al. Jun 2008 A1
20080150846 Chung et al. Jun 2008 A1
20080150848 Chung et al. Jun 2008 A1
20080151126 Yu Jun 2008 A1
20080158176 Land et al. Jul 2008 A1
20080189046 Eliasson et al. Aug 2008 A1
20080192025 Jaeger et al. Aug 2008 A1
20080238433 Joutsenoja et al. Oct 2008 A1
20080246388 Cheon et al. Oct 2008 A1
20080252619 Crockett et al. Oct 2008 A1
20080266266 Kent et al. Oct 2008 A1
20080278460 Arnett et al. Nov 2008 A1
20080284925 Han Nov 2008 A1
20080291668 Aylward et al. Nov 2008 A1
20080297482 Weiss Dec 2008 A1
20090000831 Miller et al. Jan 2009 A1
20090002340 Van Genechten Jan 2009 A1
20090006292 Block Jan 2009 A1
20090040786 Mori Feb 2009 A1
20090058832 Newton Mar 2009 A1
20090066647 Kerr et al. Mar 2009 A1
20090067178 Huang et al. Mar 2009 A1
20090073142 Yamashita et al. Mar 2009 A1
20090077501 Partridge et al. Mar 2009 A1
20090085894 Gandhi et al. Apr 2009 A1
20090091554 Keam Apr 2009 A1
20090115919 Tanaka et al. May 2009 A1
20090122020 Eliasson et al. May 2009 A1
20090122027 Newton May 2009 A1
20090128508 Sohn et al. May 2009 A1
20090135162 Van De Wijdeven et al. May 2009 A1
20090143141 Wells et al. Jun 2009 A1
20090153519 Suarez Rovere Jun 2009 A1
20090161026 Wu et al. Jun 2009 A1
20090168459 Holman et al. Jul 2009 A1
20090187842 Collins et al. Jul 2009 A1
20090189857 Benko et al. Jul 2009 A1
20090189874 Chene et al. Jul 2009 A1
20090189878 Goertz et al. Jul 2009 A1
20090219256 Newton Sep 2009 A1
20090229892 Fisher et al. Sep 2009 A1
20090251439 Westerman et al. Oct 2009 A1
20090254869 Ludwig et al. Oct 2009 A1
20090256817 Perlin et al. Oct 2009 A1
20090259967 Davidson et al. Oct 2009 A1
20090267919 Chao et al. Oct 2009 A1
20090273794 Østergaard et al. Nov 2009 A1
20090278816 Colson Nov 2009 A1
20090278913 Rosenfeld et al. Nov 2009 A1
20090297009 Xu et al. Dec 2009 A1
20100033444 Kobayashi Feb 2010 A1
20100045629 Newton Feb 2010 A1
20100060896 Van De Wijdeven et al. Mar 2010 A1
20100066016 Van De Wijdeven et al. Mar 2010 A1
20100066704 Kasai Mar 2010 A1
20100073318 Hu et al. Mar 2010 A1
20100073327 Mau et al. Mar 2010 A1
20100078545 Leong et al. Apr 2010 A1
20100079407 Suggs et al. Apr 2010 A1
20100079408 Leong et al. Apr 2010 A1
20100097345 Jang et al. Apr 2010 A1
20100097348 Park et al. Apr 2010 A1
20100097353 Newton Apr 2010 A1
20100103133 Park et al. Apr 2010 A1
20100125438 Audet May 2010 A1
20100127975 Jensen May 2010 A1
20100134435 Kimura et al. Jun 2010 A1
20100142823 Wang et al. Jun 2010 A1
20100187422 Kothari et al. Jul 2010 A1
20100193259 Wassvik Aug 2010 A1
20100207874 Yuxin et al. Aug 2010 A1
20100229091 Homma et al. Sep 2010 A1
20100238139 Goertz et al. Sep 2010 A1
20100245292 Wu Sep 2010 A1
20100265170 Norieda Oct 2010 A1
20100277436 Feng et al. Nov 2010 A1
20100283785 Satulovsky Nov 2010 A1
20100284596 Miao et al. Nov 2010 A1
20100289754 Sleeman et al. Nov 2010 A1
20100295821 Chang et al. Nov 2010 A1
20100302196 Han et al. Dec 2010 A1
20100302209 Large Dec 2010 A1
20100302210 Han et al. Dec 2010 A1
20100302240 Lettvin Dec 2010 A1
20100315379 Allard et al. Dec 2010 A1
20100321328 Chang et al. Dec 2010 A1
20100322550 Trott Dec 2010 A1
20110043490 Powell et al. Feb 2011 A1
20110049388 Delaney et al. Mar 2011 A1
20110050649 Newton et al. Mar 2011 A1
20110051394 Bailey Mar 2011 A1
20110068256 Hong et al. Mar 2011 A1
20110069039 Lee et al. Mar 2011 A1
20110069807 Dennerlein et al. Mar 2011 A1
20110074725 Westerman et al. Mar 2011 A1
20110074734 Wassvik et al. Mar 2011 A1
20110074735 Wassvik et al. Mar 2011 A1
20110080361 Miller et al. Apr 2011 A1
20110084939 Gepner et al. Apr 2011 A1
20110090176 Christiansson et al. Apr 2011 A1
20110102374 Wassvik et al. May 2011 A1
20110102538 Tan May 2011 A1
20110115748 Xu May 2011 A1
20110121323 Wu et al. May 2011 A1
20110122075 Seo et al. May 2011 A1
20110122091 King et al. May 2011 A1
20110122094 Tsang et al. May 2011 A1
20110134079 Stark Jun 2011 A1
20110141062 Yu et al. Jun 2011 A1
20110147569 Drumm Jun 2011 A1
20110157095 Drumm Jun 2011 A1
20110157096 Drumm Jun 2011 A1
20110163996 Wassvik et al. Jul 2011 A1
20110163997 Kim Jul 2011 A1
20110163998 Goertz et al. Jul 2011 A1
20110169780 Goertz et al. Jul 2011 A1
20110175852 Goertz et al. Jul 2011 A1
20110181552 Magnus et al. Jul 2011 A1
20110205186 Newton et al. Aug 2011 A1
20110205189 Newton Aug 2011 A1
20110210946 Goertz et al. Sep 2011 A1
20110216042 Wassvik et al. Sep 2011 A1
20110221705 Yi et al. Sep 2011 A1
20110221997 Kim et al. Sep 2011 A1
20110227036 Vaufrey Sep 2011 A1
20110227874 Fåhraeus et al. Sep 2011 A1
20110234537 Kim et al. Sep 2011 A1
20110254864 Tsuchikawa et al. Oct 2011 A1
20110261020 Song et al. Oct 2011 A1
20110267296 Noguchi et al. Nov 2011 A1
20110291944 Simmons et al. Dec 2011 A1
20110291989 Lee Dec 2011 A1
20110298743 Machida et al. Dec 2011 A1
20110309325 Park et al. Dec 2011 A1
20110310045 Toda et al. Dec 2011 A1
20110316005 Murao et al. Dec 2011 A1
20120007835 Yu-Jen et al. Jan 2012 A1
20120019448 Pitkanen et al. Jan 2012 A1
20120026408 Lee et al. Feb 2012 A1
20120038593 Rönkä et al. Feb 2012 A1
20120050336 Nave et al. Mar 2012 A1
20120056807 Chapman et al. Mar 2012 A1
20120062474 Weishaupt et al. Mar 2012 A1
20120062492 Katoh Mar 2012 A1
20120068973 Christiansson et al. Mar 2012 A1
20120086673 Chien et al. Apr 2012 A1
20120089348 Perlin et al. Apr 2012 A1
20120110447 Chen May 2012 A1
20120131490 Lin et al. May 2012 A1
20120141001 Zhang et al. Jun 2012 A1
20120146930 Lee Jun 2012 A1
20120153134 Bergström et al. Jun 2012 A1
20120154338 Bergström et al. Jun 2012 A1
20120162142 Christiansson et al. Jun 2012 A1
20120162144 Fåhraeus et al. Jun 2012 A1
20120169672 Christiansson Jul 2012 A1
20120170056 Jakobsen et al. Jul 2012 A1
20120181419 Momtahan Jul 2012 A1
20120182266 Han Jul 2012 A1
20120188205 Jansson et al. Jul 2012 A1
20120188206 Sparf et al. Jul 2012 A1
20120191993 Drader et al. Jul 2012 A1
20120200532 Powell et al. Aug 2012 A1
20120200538 Christiansson et al. Aug 2012 A1
20120212441 Christiansson et al. Aug 2012 A1
20120212457 Drumm Aug 2012 A1
20120217882 Wong et al. Aug 2012 A1
20120218200 Glazer et al. Aug 2012 A1
20120218229 Drumm Aug 2012 A1
20120223916 Kukulj Sep 2012 A1
20120242622 Tseng et al. Sep 2012 A1
20120249478 Chang et al. Oct 2012 A1
20120256882 Christiansson et al. Oct 2012 A1
20120257004 Smith Oct 2012 A1
20120268403 Christiansson Oct 2012 A1
20120268427 Slobodin Oct 2012 A1
20120274559 Mathai et al. Nov 2012 A1
20120305755 Hong et al. Dec 2012 A1
20120313865 Pearce Dec 2012 A1
20130021300 Wassvik Jan 2013 A1
20130021302 Drumm Jan 2013 A1
20130027404 Sarnoff Jan 2013 A1
20130044073 Christiansson et al. Feb 2013 A1
20130055080 Komer et al. Feb 2013 A1
20130055143 Martin et al. Feb 2013 A1
20130076697 Goertz et al. Mar 2013 A1
20130082980 Gruhlke et al. Apr 2013 A1
20130093838 Tan Apr 2013 A1
20130106709 Simmons May 2013 A1
20130107569 Suganuma May 2013 A1
20130113715 Grant et al. May 2013 A1
20130120320 Liu et al. May 2013 A1
20130125016 Pallakoff et al. May 2013 A1
20130127790 Wassvik May 2013 A1
20130135258 King et al. May 2013 A1
20130135259 King et al. May 2013 A1
20130136304 Anabuki et al. May 2013 A1
20130141388 Ludwig et al. Jun 2013 A1
20130141395 Holmgren et al. Jun 2013 A1
20130154983 Christiansson et al. Jun 2013 A1
20130155027 Holmgren et al. Jun 2013 A1
20130155655 Lee et al. Jun 2013 A1
20130155723 Coleman Jun 2013 A1
20130158504 Ruchti et al. Jun 2013 A1
20130181896 Gruhlke et al. Jul 2013 A1
20130181953 Hinckley et al. Jul 2013 A1
20130187891 Eriksson et al. Jul 2013 A1
20130201142 Suarez Rovere Aug 2013 A1
20130222344 Lu et al. Aug 2013 A1
20130222346 Chen et al. Aug 2013 A1
20130234991 Sparf Sep 2013 A1
20130241886 Eriksson et al. Sep 2013 A1
20130241887 Sharma Sep 2013 A1
20130249833 Christiansson et al. Sep 2013 A1
20130257810 Niu et al. Oct 2013 A1
20130269867 Trott Oct 2013 A1
20130275082 Follmer et al. Oct 2013 A1
20130279190 Huang Oct 2013 A1
20130285920 Colley Oct 2013 A1
20130285968 Christiansson et al. Oct 2013 A1
20130300714 Goh et al. Nov 2013 A1
20130300716 Craven-Bartle et al. Nov 2013 A1
20130307795 Suarez Rovere Nov 2013 A1
20130307796 Liu et al. Nov 2013 A1
20130321740 An et al. Dec 2013 A1
20130342490 Wallander et al. Dec 2013 A1
20140002400 Christiansson et al. Jan 2014 A1
20140015803 Drumm Jan 2014 A1
20140028575 Parivar et al. Jan 2014 A1
20140028604 Morinaga et al. Jan 2014 A1
20140028629 Drumm et al. Jan 2014 A1
20140036203 Guillou et al. Feb 2014 A1
20140055421 Christiansson et al. Feb 2014 A1
20140063853 Nichol et al. Mar 2014 A1
20140071653 Thompson et al. Mar 2014 A1
20140085241 Christiansson et al. Mar 2014 A1
20140092052 Grunthaner et al. Apr 2014 A1
20140098032 Ng et al. Apr 2014 A1
20140098058 Baharav et al. Apr 2014 A1
20140109219 Rohrweck et al. Apr 2014 A1
20140111478 Lin et al. Apr 2014 A1
20140111480 Kim et al. Apr 2014 A1
20140125633 Fåhraeus et al. May 2014 A1
20140139467 Ghosh et al. May 2014 A1
20140152624 Piot et al. Jun 2014 A1
20140160762 Dudik et al. Jun 2014 A1
20140192023 Hoffman Jul 2014 A1
20140210793 Eriksson et al. Jul 2014 A1
20140218467 You Aug 2014 A1
20140226084 Utukuri et al. Aug 2014 A1
20140232669 Ohlsson et al. Aug 2014 A1
20140237401 Krus et al. Aug 2014 A1
20140237408 Ohlsson et al. Aug 2014 A1
20140237422 Ohlsson et al. Aug 2014 A1
20140253520 Cueto et al. Sep 2014 A1
20140253831 Craven-Bartle Sep 2014 A1
20140259029 Choi et al. Sep 2014 A1
20140267124 Christiansson et al. Sep 2014 A1
20140292701 Christiansson et al. Oct 2014 A1
20140300572 Ohlsson et al. Oct 2014 A1
20140320459 Pettersson et al. Oct 2014 A1
20140320460 Johansson et al. Oct 2014 A1
20140324953 Seo et al. Oct 2014 A1
20140347325 Wallander et al. Nov 2014 A1
20140362046 Yoshida Dec 2014 A1
20140368471 Christiansson et al. Dec 2014 A1
20140375607 Christiansson et al. Dec 2014 A1
20140380193 Coplen et al. Dec 2014 A1
20150002386 Mankowski et al. Jan 2015 A1
20150009687 Lin Jan 2015 A1
20150015497 Leigh Jan 2015 A1
20150026630 Bullock Jan 2015 A1
20150035774 Christiansson et al. Feb 2015 A1
20150035803 Wassvik et al. Feb 2015 A1
20150053850 Uvnäs Feb 2015 A1
20150054759 Christiansson et al. Feb 2015 A1
20150062085 Lu et al. Mar 2015 A1
20150070327 Hsieh et al. Mar 2015 A1
20150083891 Wallander Mar 2015 A1
20150103013 Huang Apr 2015 A9
20150121691 Wang May 2015 A1
20150130769 Björklund May 2015 A1
20150131010 Sugiyama May 2015 A1
20150138105 Christiansson et al. May 2015 A1
20150138158 Wallander et al. May 2015 A1
20150138161 Wassvik May 2015 A1
20150154291 Shepherd et al. Jun 2015 A1
20150199071 Hou Jul 2015 A1
20150205441 Bergström et al. Jul 2015 A1
20150215450 Seo et al. Jul 2015 A1
20150242055 Wallander Aug 2015 A1
20150256658 Shin et al. Sep 2015 A1
20150261323 Cui et al. Sep 2015 A1
20150271481 Guthrie et al. Sep 2015 A1
20150286698 Gagnier et al. Oct 2015 A1
20150317036 Johansson et al. Nov 2015 A1
20150324028 Wassvik et al. Nov 2015 A1
20150331544 Bergström et al. Nov 2015 A1
20150331545 Wassvik et al. Nov 2015 A1
20150331546 Craven-Bartle et al. Nov 2015 A1
20150331547 Wassvik et al. Nov 2015 A1
20150332655 Krus et al. Nov 2015 A1
20150334138 Conklin et al. Nov 2015 A1
20150339000 Lee et al. Nov 2015 A1
20150346856 Wassvik Dec 2015 A1
20150346911 Christiansson Dec 2015 A1
20150363042 Krus et al. Dec 2015 A1
20150373864 Jung Dec 2015 A1
20160004898 Holz Jan 2016 A1
20160026297 Shinkai et al. Jan 2016 A1
20160026337 Wassvik et al. Jan 2016 A1
20160034099 Christiansson et al. Feb 2016 A1
20160041629 Rao et al. Feb 2016 A1
20160050746 Wassvik et al. Feb 2016 A1
20160062549 Drumm et al. Mar 2016 A1
20160070415 Christiansson et al. Mar 2016 A1
20160070416 Wassvik Mar 2016 A1
20160092021 Tu et al. Mar 2016 A1
20160103026 Povazay et al. Apr 2016 A1
20160117019 Michiaki Apr 2016 A1
20160124546 Chen et al. May 2016 A1
20160124551 Christiansson et al. May 2016 A1
20160077616 Durojaiye et al. Jun 2016 A1
20160154532 Campbell Jun 2016 A1
20160154533 Eriksson et al. Jun 2016 A1
20160179261 Drumm Jun 2016 A1
20160202841 Christiansson et al. Jul 2016 A1
20160209886 Suh et al. Jul 2016 A1
20160216844 Bergström Jul 2016 A1
20160224144 Klinghult et al. Aug 2016 A1
20160255713 Kim et al. Sep 2016 A1
20160295711 Ryu et al. Oct 2016 A1
20160299583 Watanabe Oct 2016 A1
20160306501 Drumm et al. Oct 2016 A1
20160328090 Klinghult Nov 2016 A1
20160328091 Wassvik et al. Nov 2016 A1
20160334942 Wassvik Nov 2016 A1
20160342282 Wassvik Nov 2016 A1
20160357348 Wallander Dec 2016 A1
20170010688 Fåhraeus et al. Jan 2017 A1
20170031516 Sugiyama et al. Feb 2017 A1
20170075484 Kali et al. Mar 2017 A1
20170090090 Craven-Bartle et al. Mar 2017 A1
20170102827 Christiansson et al. Apr 2017 A1
20170115235 Ohlsson et al. Apr 2017 A1
20170115823 Huang et al. Apr 2017 A1
20170123257 Zhao May 2017 A1
20170131846 Edzer et al. May 2017 A1
20170139541 Christiansson et al. May 2017 A1
20170160871 Drumm Jun 2017 A1
20170177163 Wallander et al. Jun 2017 A1
20170185186 Liu Jun 2017 A1
20170185230 Wallander et al. Jun 2017 A1
20170185269 Antilla et al. Jun 2017 A1
20170192493 Ofek et al. Jul 2017 A1
20170220204 Huang et al. Aug 2017 A1
20170235537 Liu et al. Aug 2017 A1
20170249030 Park et al. Aug 2017 A1
20170264865 Huangfu Sep 2017 A1
20170285789 Barel Oct 2017 A1
20170344185 Ohlsson et al. Nov 2017 A1
20180031753 Craven-Bartle et al. Feb 2018 A1
20180107373 Cheng Apr 2018 A1
20180129354 Christiansson et al. May 2018 A1
20180136788 He et al. May 2018 A1
20180149792 Lee et al. May 2018 A1
20180205989 Srinivasan et al. Jul 2018 A1
20180225006 Wall Aug 2018 A1
20180253187 Christiansson et al. Sep 2018 A1
20180267672 Wassvik et al. Sep 2018 A1
20180275788 Christiansson et al. Sep 2018 A1
20180275830 Christiansson et al. Sep 2018 A1
20180275831 Christiansson et al. Sep 2018 A1
20180275836 Hermans et al. Sep 2018 A1
20180314206 Lee et al. Nov 2018 A1
20190004668 Jeong et al. Jan 2019 A1
20190025984 Weilbacher et al. Jan 2019 A1
20190050074 Kocovski Feb 2019 A1
20190065030 Kang et al. Feb 2019 A1
20190107923 Drumm Apr 2019 A1
20190146630 Chen et al. May 2019 A1
20190155495 Klein et al. May 2019 A1
20190196659 Skagmo et al. Jun 2019 A1
20190227670 O'Cleirigh et al. Jul 2019 A1
20190235701 Han et al. Aug 2019 A1
20190250755 Liu et al. Aug 2019 A1
20190258353 Drumm et al. Aug 2019 A1
20190317640 Christiansson et al. Oct 2019 A1
20190324570 Kolundzjia et al. Oct 2019 A1
20190377431 Drumm Dec 2019 A1
20190377435 Piot et al. Dec 2019 A1
20200012408 Drumm et al. Jan 2020 A1
20200073509 Shih et al. Mar 2020 A1
20200098147 Ha et al. Mar 2020 A1
20200125189 Kim et al. Apr 2020 A1
20200159382 Drumm May 2020 A1
20200167033 Kim et al. May 2020 A1
20200249777 Hou et al. Aug 2020 A1
20200310592 Bergstrom et al. Oct 2020 A1
20200310621 Piot et al. Oct 2020 A1
20200341587 Drumm Oct 2020 A1
20200348473 Drumm Nov 2020 A1
20200387237 Drumm Dec 2020 A1
20210255662 Svensson et al. Aug 2021 A1
20220221955 Bergstrom et al. Jul 2022 A1
20220413652 Andersson et al. Dec 2022 A1
20230057020 Wassvik Feb 2023 A1
20230068643 Bergstrom et al. Mar 2023 A1
20230080260 Bergstrom et al. Mar 2023 A1
20230082401 Andreasson et al. Mar 2023 A1
Foreign Referenced Citations (194)
Number Date Country
2008 280 952 Mar 2009 AU
2014201966 Apr 2014 AU
201233592 May 2009 CN
101174191 Jun 2009 CN
101644854 Feb 2010 CN
201437963 Apr 2010 CN
201465071 May 2010 CN
101882034 Nov 2010 CN
102117155 Jul 2011 CN
101019071 Jun 2012 CN
101206550 Jun 2012 CN
102929449 Feb 2013 CN
202887145 Apr 2013 CN
103123556 May 2013 CN
203189466 Sep 2013 CN
203224848 Oct 2013 CN
203453994 Feb 2014 CN
101075168 Apr 2014 CN
102414646 Apr 2014 CN
203720812 Jul 2014 CN
203786707 Aug 2014 CN
203786708 Aug 2014 CN
203825586 Sep 2014 CN
204288179 Apr 2015 CN
104808843 Jul 2015 CN
205015574 Feb 2016 CN
205384833 Jul 2016 CN
104391611 Sep 2017 CN
3511330 May 1988 DE
68902419 Mar 1993 DE
69000920 Jun 1993 DE
19809934 Sep 1999 DE
10026201 Dec 2000 DE
10025175 Dec 2001 DE
102009003990 Jul 2010 DE
102010000473 Aug 2010 DE
0845812 Jun 1998 EP
0600576 Oct 1998 EP
0931731 Jul 1999 EP
1798630 Jun 2007 EP
0897161 Oct 2007 EP
2088501 Aug 2009 EP
1512989 Sep 2009 EP
2077490 Jan 2010 EP
1126236 Dec 2010 EP
2314203 Apr 2011 EP
2325735 May 2011 EP
2339437 Oct 2011 EP
2442180 Apr 2012 EP
2466429 Jun 2012 EP
2479642 Jul 2012 EP
1457870 Aug 2012 EP
2565770 Mar 2013 EP
2765622 Aug 2014 EP
2778849 Sep 2014 EP
2840470 Feb 2015 EP
2515216 Mar 2016 EP
3002666 Apr 2016 EP
3535640 Sep 2019 EP
2172828 Oct 1973 FR
2617619 Jan 1990 FR
2614711 Mar 1992 FR
2617620 Sep 1992 FR
2676275 Nov 1992 FR
1380144 Jan 1975 GB
2131544 Mar 1986 GB
2204126 Nov 1988 GB
S62159213 Jul 1987 JP
H05190066 Jul 1993 JP
2000506655 May 2000 JP
2000172438 Jun 2000 JP
2000259334 Sep 2000 JP
2000293311 Oct 2000 JP
2003330603 Nov 2003 JP
2005004278 Jan 2005 JP
2008506173 Feb 2008 JP
2011530124 Dec 2011 JP
2016192688 Nov 2016 JP
2015158831 Feb 2018 JP
100359400 Jul 2001 KR
100940435 Feb 2010 KR
101081586 Nov 2011 KR
20150125374 Nov 2015 KR
10-2016-0075643 Jun 2016 KR
M517370 Feb 2016 TW
WO 1984003186 Aug 1984 WO
WO 9527919 Oct 1995 WO
WO 1999046602 Sep 1999 WO
WO 01127867 Apr 2001 WO
WO 0184251 Nov 2001 WO
WO 0235460 May 2002 WO
WO 02077915 Oct 2002 WO
WO 02095668 Nov 2002 WO
WO 03076870 Sep 2003 WO
WO 2004032210 Apr 2004 WO
WO 2004081502 Sep 2004 WO
WO 2004081956 Sep 2004 WO
WO 2005026938 Mar 2005 WO
WO 2005029172 Mar 2005 WO
WO 2005029395 Mar 2005 WO
WO 2005125011 Dec 2005 WO
WO 2006081633 Aug 2006 WO
WO 2006095320 Sep 2006 WO
WO 2006124551 Nov 2006 WO
WO 2007003196 Jan 2007 WO
WO 2007047685 Apr 2007 WO
WO 2007058924 May 2007 WO
WO 2007112742 Oct 2007 WO
WO 2008004103 Jan 2008 WO
WO 2008007276 Jan 2008 WO
WO 2008017077 Feb 2008 WO
WO 2008034184 Mar 2008 WO
WO 2008039006 Apr 2008 WO
WO 2008044024 Apr 2008 WO
WO 2008068607 Jun 2008 WO
WO 2006124551 Jul 2008 WO
WO 2008017077 Feb 2009 WO
WO 2009029764 Mar 2009 WO
WO 2009048365 Apr 2009 WO
WO 2009077962 Jun 2009 WO
WO 2009102681 Aug 2009 WO
WO 2009137355 Nov 2009 WO
WO 2010006882 Jan 2010 WO
WO 2010006883 Jan 2010 WO
WO 2010006884 Jan 2010 WO
WO 2010006885 Jan 2010 WO
WO 2010006886 Jan 2010 WO
WO 2010015408 Feb 2010 WO
WO 2010046539 Apr 2010 WO
WO 2010056177 May 2010 WO
WO 2010064983 Jun 2010 WO
WO 2010081702 Jul 2010 WO
WO 2010112404 Oct 2010 WO
WO 2010123809 Oct 2010 WO
WO 2010134865 Nov 2010 WO
WO 2011028169 Mar 2011 WO
WO 2011028170 Mar 2011 WO
WO 2011049511 Apr 2011 WO
WO 2011049512 Apr 2011 WO
WO 2011049513 Apr 2011 WO
WO 2011057572 May 2011 WO
WO 2011078769 Jun 2011 WO
WO 2011082477 Jul 2011 WO
WO 2011139213 Nov 2011 WO
WO 2012002894 Jan 2012 WO
WO 2012010078 Jan 2012 WO
WO 2012018176 Feb 2012 WO
WO 2012050510 Apr 2012 WO
WO 2012082055 Jun 2012 WO
WO 2012105893 Aug 2012 WO
WO 2012121652 Sep 2012 WO
WO 2012158105 Nov 2012 WO
WO 2012171181 Dec 2012 WO
WO 2012172302 Dec 2012 WO
WO 2012176801 Dec 2012 WO
WO 2013036192 Mar 2013 WO
WO 2013048312 Apr 2013 WO
WO 2013055282 Apr 2013 WO
WO 2013062471 May 2013 WO
WO 2013081818 Jun 2013 WO
WO 2013089622 Jun 2013 WO
WO 2013115710 Aug 2013 WO
WO 2013133756 Sep 2013 WO
WO 2013133757 Sep 2013 WO
WO 2013138003 Sep 2013 WO
WO 2013159472 Oct 2013 WO
WO 2013176613 Nov 2013 WO
WO 2013176614 Nov 2013 WO
WO 2013176615 Nov 2013 WO
WO 2014044181 Mar 2014 WO
WO 2014055809 Apr 2014 WO
WO 2014065601 May 2014 WO
WO 2014086084 Jun 2014 WO
WO 2014098742 Jun 2014 WO
WO 2014098744 Jun 2014 WO
WO 2014104967 Jul 2014 WO
WO 2014130515 Aug 2014 WO
WO 2014131221 Sep 2014 WO
WO 2015123322 Aug 2015 WO
WO 2015175586 Nov 2015 WO
WO 2016130074 Aug 2016 WO
WO 2017099657 Jun 2017 WO
WO 2017138863 Aug 2017 WO
WO 2018096430 May 2018 WO
WO 2018106172 Jun 2018 WO
WO 2018106176 Jun 2018 WO
WO 2018141948 Aug 2018 WO
WO 2018182476 Oct 2018 WO
WO 2019045629 Mar 2019 WO
WO 2019156609 Aug 2019 WO
WO 2019172826 Sep 2019 WO
WO 2019172827 Sep 2019 WO
WO 2020022096 Jan 2020 WO
Non-Patent Literature Citations (80)
Entry
International Search Report for International App. No. PCT/SE2020/050043, dated Feb. 24, 2020, in 3 pages.
Kar-Han Tan, Robinson I N, Culbertson B, Apostolopoulos J, ‘ConnectBoard: Enabling Genuine Eye Contact and Accurate Gaze in Remote Collaboration’, In: IEEE Transactions on Multimedia, Jun. 2011, vol. 13, No. 3, ISSN: 1520-9210.
Ahn, Y., et al., “A slim and wide multi-touch tabletop interface and its applications,” BigComp2014, IEEE, 2014, in 6 pages.
Chou, N., et al., “Generalized pseudo-polar Fourier grids and applications in registering optical coherence tomography images,” 43rd Asilomar Conference on Signals, Systems and Computers, Nov. 2009, in 5 pages.
Fihn, M., “Touch Panel—Special Edition,” Veritas et Visus, Nov. 2011, in 1 page.
Fourmont, K., “Non-Equispaced Fast Fourier Transforms with Applications to Tomography,” Journal of Fourier Analysis and Applications, vol. 9, Issue 5, 2003, in 20 pages.
Iizuka, K., “Boundaries, Near-Field Optics, and Near-Field Imaging,” Elements of Photonics, vol. 1: In Free Space and Special Media, Wiley & Sons, 2002, in 57 pages.
International Search Report for International App. No. PCT/SE2017/050102, dated Apr. 5, 2017, in 4 pages.
International Search Report in International Application No. PCT/SE2020/051172 dated Feb. 4, 2021 in 5 pages.
Johnson, M., “Enhanced Optical Touch Input Panel”, IBM Technical Disclosure Bulletin, 1985, in 3 pages.
Kak, et al., “Principles of Computerized Tomographic Imaging”, Institute of Electrical Engineers, Inc., 1999, in 333 pages.
The Laser Wall, MIT, 1997, http://web.media.mit.edu/~joep/SpectrumWeb/captions/Laser.html.
Liu, J., et al. “Multiple touch points identifying method, involves starting touch screen, driving specific emission tube, and computing and transmitting coordinate of touch points to computer system by direct lines through interface of touch screen,” 2007, in 25 pages.
Machine translation of KR10-2016-0075643 (Year: 2017).
Natterer, F., “The Mathematics of Computerized Tomography”, Society for Industrial and Applied Mathematics, 2001, in 240 pages.
Natterer, F., et al. “Fourier Reconstruction,” Mathematical Methods in Image Reconstruction, Society for Industrial and Applied Mathematics, 2001, in 12 pages.
Paradiso, J.A., “Several Sensor Approaches that Retrofit Large Surfaces for Interactivity,” ACM Ubicomp 2002 Workshop on Collaboration with Interactive Walls and Tables, 2002, in 8 pages.
Tedaldi, M., et al. “Refractive index mapping of layered samples using optical coherence refractometry,” Proceedings of SPIE, vol. 7171, 2009, in 8 pages.
Supplementary European Search Report for European App. No. EP 16759213, dated Oct. 4, 2018, in 9 pages.
Extended European Search Report for European App. No. 16743795.3, dated Sep. 11, 2018, in 5 pages.
International Search Report for International App. No. PCT/SE2017/051224, dated Feb. 23, 2018, in 5 pages.
International Search Report for International App. No. PCT/IB2017/057201, dated Mar. 6, 2018, in 4 pages.
Extended European Search Report in European Application No. 19165019.1, dated Jul. 18, 2019 in 8 pages.
International Preliminary Report on Patentability received in International Application No. PCT/SE2017/051233, dated Jun. 11, 2019, in 6 pages.
International Search Report for International App. No. PCT/SE2018/050070, dated Apr. 25, 2018, in 4 pages.
International Search Report / Written Opinion received in International Application No. PCT/SE2021/051151 dated Jan. 26, 2022, in 13 pages.
Extended European Search Report in European Application No. 17750516.1, dated Jul. 16, 2019 in 5 pages.
Extended European Search Report in European Application No. 16873465.5, dated Jun. 25, 2019 in 9 pages.
Report on the Filing or Determination of an Action Regarding a Patent or Trademark. For U.S. Pat. No. 10,282,035, U.S. District of Delaware, dated Dec. 10, 2019, in 1 page.
Civil Cover Sheet Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., dated Dec. 10, 2019, in 1 page.
Complaint for Patent Infringement, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 19-2246, dated Dec. 10, 2019, in 83 pages.
Executed Summons in a Civil Action to Promethean Inc., C.A. No. 19-2246, dated Dec. 10, 2019 in 2 pages.
Summons in a Civil Action to Promethean Inc., C.A. No. 19-2246, dated Dec. 10, 2019 in 2 pages.
Summons in a Civil Action to Promethean Ltd., C.A. No. 19-2246, dated Dec. 10, 2019 in 2 pages.
Defendants' Answer to Second Amended Complaint and Defendant Promethean Inc.'s Counterclaims Against FlatFrog Laboratories Ab., C.A. No. 19-2246, dated May 22, 2020, in 29 pages.
Extended European Search Report for European App. No. 18772370.5, dated Dec. 9, 2020, in 8 pages.
Extended European Search Report for European App. No. 18772178.2, dated Dec. 10, 2020, in 8 pages.
Extended European Search Report for European App. No. 18774232.5, dated Dec. 21, 2020, in 9 pages.
Defendants' Initial Invalidity Contentions, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Notice of Service, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 2 pages.
Exhibit 1: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on Prior Public Use and/or Commercial Offer for Sale of Defendant Promethean Inc.'s ActivPanel 4.5 Product, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 2: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0235701 to Han et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 3A: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 4,751,379 to Sasaki et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 3B: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 4,751,379 to Sasaki et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 3C: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 4,751,379 to Sasaki et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 4A: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0004668 to Jeong et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 4B: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0004668 to Jeong et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 4C: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0004668 to Jeong et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 5A: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 9,983,626 to Cao et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 5B: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 9,983,626 to Cao et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 5C: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 9,983,626 to Cao et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 6A: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0025984 to Weilbacher et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 6B: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0025984 to Weilbacher et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 6C: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pre-Grant Pub. No. 2019/0025984 to Weilbacher et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 7A: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 9,207,800 to Eriksson et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 7B: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 9,207,800 to Eriksson et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 7C: Invalidity Claim Chart Against U.S. Pat. No. 10,775,935 Based on U.S. Pat. No. 9,207,800 to Eriksson et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 8: Invalidity Claim Chart Against U.S. Pat. No. 10,739,916 Based on Prior Public Use and/or Commercial Offer for Sale of Defendant Promethean Inc.'s ActivPanel 4.5 Product, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 9: Invalidity Claim Chart Against U.S. Pat. No. 10,739,916 Based on Chinese Utility Model No. CN 203786707 U to Chen et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 10: Invalidity Claim Chart Against U.S. Pat. No. 10,739,916 Based on International App. Pub. No. WO2014131221 to Chen et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 11: Invalidity Claim Chart Against U.S. Pat. No. 10,739,916 Based on Chinese Pub. App. No. 104391611 A to Hu et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 12: Invalidity Claim Chart Against U.S. Pat. No. 10,739,916 Based on Chinese Utility Model No. 203786708 U to Cao, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Exhibit 13: Invalidity Claim Chart Against U.S. Pat. No. 10,739,916 Based on Chinese Utility Model No. 204288179 U to Mo et al., Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Apr. 23, 2021, in 26 pages.
Defendants Promethean Ltd. and Promethean Inc.'s Preliminary Proposed Claim Constructions, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated May 24, 2021, in 8 pages.
Defendants' Sur-Reply Claim Construction Brief, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Sep. 13, 2021, in 24 pages.
ASTM International, “Standard Specification for Heat-Treated Flat Glass-Kind HS, Kind FT Coated and Uncoated Glass,” Designation: C 1048-04, in 7 pages.
British Standard, “Glass in building—Thermally toughened soda lime silicate safety glass,” EN 12150-1:2000, ISBN 0 580 36171 3, Aug. 15, 2000, in 28 pages.
Joint Claim Construction Brief, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Sep. 20, 2021, in 92 pages.
Joint Appendix of Exhibits to Joint Claim Construction Brief, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Sep. 20, 2021, in 383 pages, (uploaded in 4 parts, part 1 of 4).
Joint Appendix of Exhibits to Joint Claim Construction Brief, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Sep. 20, 2021, in 383 pages, (uploaded in 4 parts, part 2 of 4).
Joint Appendix of Exhibits to Joint Claim Construction Brief, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Sep. 20, 2021, in 383 pages, (uploaded in 4 parts, part 3 of 4).
Joint Appendix of Exhibits to Joint Claim Construction Brief, Flatfrog Laboratories Ab v. Promethean Ltd. and Promethean Inc., C.A. No. 1:19-cv-02246-MN, dated Sep. 20, 2021, in 383 pages, (uploaded in 4 parts, part 4 of 4).
International Search Report in PCT/SE2019/050189 dated May 29, 2019 in 4 pages.
International Search Report for International App. No. PCT/SE2019/050953, dated Nov. 26, 2019, in 4 pages.
International Search Report for International App. No. PCT/SE2020/050504, dated Apr. 9, 2020, in 4 pages.
International Search Report in App. No. PCT/SE2020/051117 dated Feb. 5, 2021 in 2 pages.
International Search Report in International Application No. PCT/SE2021/050040 dated May 10, 2021 in 3 pages.
International Search Report in International App. No. PCT/SE2021/050086 dated Feb. 26, 2021 in 5 pages.
International Search Report in International Application No. PCT/SE2021/050095 dated Jun. 2, 2021 in 6 pages.
International Search Report / Written Opinion of the International Searching Authority for PCT/SE2021/051018, dated Feb. 1, 2022, in 10 pages.
Related Publications (1)
Number Date Country
20220109809 A1 Apr 2022 US