This disclosure generally relates to privacy control of a directional display apparatus.
Display devices are ubiquitous. There are many situations where viewers of apparatuses including display devices are concerned about privacy issues when using the display device in public or unsecured environments, for example when working on confidential or sensitive documents. Examples of display devices where such concerns exist include computer apparatuses, such as desktop computers, laptop computers and tablets; mobile communication devices, such as smartphones; and display devices in static installations, such as kiosks and ATMs. In such circumstances, the primary viewer is required to be vigilant of other people in their surroundings and to take action to obscure or turn off the display device when unwanted, secondary viewers are present. It would be desirable for the display device to have a privacy control function which assists the primary viewer in preventing unwanted viewing of the displayed image.
There exist privacy functions which may determine that secondary viewers are viewing a display device and, in response, may blur the displayed image. However, typically the displayed image is then blurred for everyone, including the primary viewer. This is less than desirable, as it would be preferable for the primary viewer to be able to continue viewing.
Display devices which are directional are known. Examples of a type of directional display device using a directional backlight are disclosed in U.S. Patent Publ. No. 2012/0127573, and U.S. Patent Publ. No. 2014/0240828. Directional display devices of this and other types may direct the displayed image into a viewing window, which may have a finite width in a viewing plane, being typically much narrower than the viewing width of a conventional display device. Such a directional display device may be operated in a mode in which the displayed image is directed into a viewing window of relatively narrow width in order to provide a privacy function. The privacy function may be used to provide the displayed image with reduced or negligible visibility to a secondary viewer.
An aspect of the present disclosure is concerned with the functionality of a directional display device used to provide a privacy function.
According to a first aspect of the present disclosure, there is provided a privacy control method of controlling a directional display device that is capable of directing a displayed image into a viewing window that is adjustable, the method comprising: directing the displayed image into a viewing window; detecting the presence of one or more secondary viewers in addition to a primary viewer; in the event of detecting the presence of the one or more secondary viewers, deciding whether the one or more secondary viewers is permitted to view the displayed image; and adjusting the viewing window in response to detecting the presence of one or more secondary viewers and deciding that the one or more secondary viewers is not permitted to view the displayed image.
In this aspect of the disclosure, advantage is taken of a directional display device that is capable of directing a displayed image into a viewing window that is adjustable, for example by having variable width and/or variable position. In the privacy control method, while directing the displayed image into a viewing window, there is detected the presence of one or more secondary viewers in addition to a primary viewer. This detection may be used in the control of the viewing window. In particular the viewing window may be adjusted when it is decided that the secondary viewer is not permitted to view the displayed image. Such adjustment may reduce the visibility of the viewing window to the secondary viewer. In a first example, the adjustment may decrease the width of the viewing window. In a second example, the adjustment may shift the position of the viewing window away from the secondary viewer. By way of example, the directional display device may be controlled between (a) a normal mode in which no adjustment is made, which in the first example may cause the viewing window to have a maximum width, and (b) a privacy mode in which the viewing window is adjusted, which in the first example may cause the viewing window to have a width that is sufficient only for viewing by a primary viewer.
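By way of non-limitative illustration only, the decision logic just described may be summarized in the following Python sketch. The identifiers and data structures (viewer identifiers, a set of permitted viewers) are hypothetical placeholders for whatever the identification process produces, not part of this disclosure.

```python
def decide_mode(secondary_ids, permitted_ids):
    """Decide the display mode per the privacy control method above.

    Returns "privacy" (adjust the viewing window, e.g. narrow it or
    shift it away from the secondary viewer) if any detected secondary
    viewer is not permitted, otherwise "normal" (no adjustment).
    """
    for viewer in secondary_ids:
        if viewer not in permitted_ids:
            return "privacy"  # adjust the viewing window
    return "normal"           # no adjustment needed


# "carol" is detected as a secondary viewer but is not permitted,
# so the viewing window is adjusted.
assert decide_mode(["bob", "carol"], {"alice", "bob"}) == "privacy"
assert decide_mode([], {"alice"}) == "normal"
```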
The detection of the presence of one or more secondary viewers in addition to the primary viewer may be performed in various ways, some non-limitative examples being as follows.
In one example, the method may further comprise capturing a viewing region image, in which case detecting the presence of one or more secondary viewers may comprise analyzing the viewing region image.
In another example, detecting the presence of one or more secondary viewers may comprise detecting an electromagnetic tag carried by the one or more secondary viewers.
The decision of whether the one or more secondary viewers is permitted to view the displayed image may be taken in dependence on one or a combination of a variety of factors, some non-limitative examples being as follows. This provides powerful control of the privacy function.
In one example of such a factor, the decision may be taken in dependence on a comparison of image information derived from a captured viewing region image with a database that associates image information of viewers with viewer permission information.
In another example of such a factor, the decision may be taken in dependence on a comparison of the identity of the one or more secondary viewers determined from an electromagnetic tag with a database that associates viewers with viewer permission information.
In another example of such a factor, the decision may be taken in dependence on information about the location of the display device. Such information may comprise the geographical location of the display device based on the output of a location sensor and/or may comprise information derived from a viewing region image.
Further according to the first aspect of the present disclosure, there may be provided a directional display apparatus capable of implementing a similar privacy control method.
According to a second aspect of the present disclosure, there is provided a method of controlling a directional display device that is capable of directing a displayed image into a viewing window of variable width, the method comprising: directing a displayed image into a viewing window; detecting relative movement between a viewer and the display device; and increasing the width of the viewing window in response to detecting said relative movement.
In this aspect of the disclosure, advantage is taken of a directional display device that is capable of directing a displayed image into a viewing window of variable width. In the privacy control method, while directing the displayed image into a viewing window, there is detected the relative movement between a viewer and the display device. The width of the viewing window may be increased in response to detecting said relative movement.
When the viewer is moving, since a viewing window typically has some spatial non-uniformity in brightness, there is a risk of the viewer perceiving fluctuations in the brightness of the displayed image as they move between different portions of the viewing window. This might be perceived in some circumstances as flicker. However, by increasing the width of the viewing window when relative movement of the viewer is detected, the perception of such brightness fluctuations may be reduced.
The relative movement which is detected may be, for example, linear motion of the viewer relative to the display device laterally of the viewing window, and/or vibratory movement of the display device relative to the viewer.
The relative movement may be detected using a motion sensor mounted in the display device or, where an image of the viewing region is captured, may be detected by analyzing the viewing region image to determine the position of the viewer.
Further according to the second aspect of the present disclosure, there may be provided a directional display apparatus capable of implementing a similar privacy control method.
The first and second aspects of the disclosure may be applied in combination. Similarly, the optional features of the first and second aspects may be implemented together in any combination.
Non-limitative embodiments are illustrated by way of example in the accompanying figures, in which like reference numbers indicate similar parts, and in which:
The directional display device 1 is an example of a device that is capable of directing a displayed image into a viewing window that is adjustable, in this example by having variable position and width. In this example, the directional display device 1 is a type disclosed in U.S. Patent Publ. No. 2012/0127573, and U.S. Patent Publ. No. 2014/0240828, which are herein incorporated by reference in their entireties. A general description of the directional display device 1 is given below, but reference is made to U.S. Patent Publ. No. 2012/0127573, and U.S. Patent Publ. No. 2014/0240828 for further details of the construction and operation that may be applied here.
The directional display device 1 includes a directional backlight 2 and a spatial light modulator 3.
The directional backlight 2 directs light into optical windows. In particular, the directional backlight includes an array of light sources 4 and a waveguide 5. The light sources 4 may be light emitting diodes (LEDs). The light sources 4 may alternatively be of other types, for example diode sources, semiconductor sources, laser sources, local field emission sources, organic emitter arrays, and so forth.
The waveguide 5 directs light from each light source 4 into a respective optical window 7.
In general terms, a possible construction of the waveguide 5 is as follows. The waveguide 5 has first and second guide surfaces 11 and 12 and a reflective end 10 which may have positive optical power. Input light from the light sources 4 is guided through the waveguide 5 by the first and second guide surfaces 11 and 12 to the reflective end 10 where it is reflected and directed back through the waveguide 5.
The second guide surface 12 includes extraction features 13 extending in a lateral direction across the waveguide 5 facing the reflective end 10. The extraction features 13 are oriented to reflect light from the light sources, after reflection from the reflective end 10, through the first guide surface 11 as output light. The intermediate regions 14 of the second guide surface 12 intermediate the extraction features 13 guide light through the waveguide without extracting it. The second guide surface 12 may have a stepped shape that provides the extraction features 13 and intermediate regions 14.
The waveguide 5 may provide focusing of the output light in the lateral direction. The focusing may be achieved, at least in part, by the extraction features 13 having positive optical power in the lateral direction. As a result, the output light derived from individual light sources 4 is directed into respective optical windows 7 in a viewing plane. The direction in which the optical windows 7 lie relative to the directional display device 1 is dependent on the input position of the light source 4. Thus, the optical windows 7 produced by the array of light sources 4 are in output directions that are distributed in the lateral direction in dependence on the input positions of the respective light sources 4.
Further details of possible constructions of the waveguide 5 that cause it to direct light into optical windows are disclosed in more detail in U.S. Patent Publ. No. 2012/0127573 and U.S. Patent Publ. No. 2014/0240828.
The spatial light modulator 3 is capable of displaying an image. The spatial light modulator 3 is transmissive and modulates the light passing therethrough. The spatial light modulator 3 may be a liquid crystal display (LCD) but this is merely by way of example, and other spatial light modulators or displays may be used including LCOS, DLP devices, and so forth.
The spatial light modulator 3 extends across the first guide surface 11 of the waveguide 5 and so modulates the light that is output therethrough. Thus, the image displayed on the spatial light modulator 3 is directed into the optical windows 7. The extraction features 13 may be provided across an area of the waveguide 5 corresponding to the entire area of the spatial light modulator 3. In that case, the output light is output into the optical windows across the entire area of the spatial light modulator 3.
As described above, selective operation of the light sources 4 allows light to be directed into selected viewing windows. In principle a single light source 4 may be operated to direct light into a viewing window having a single optical window 7, but typically plural light sources 4 are operated at a time to direct light into a viewing window having plural optical windows 7. By selectively varying the light sources 4 that are operated, the resultant viewing window may be provided with a variable position and width, and in this manner be adjustable.
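By way of non-limitative illustration only, the following Python sketch shows how a desired viewing window might be mapped onto a set of light sources 4 to operate. It assumes a simple linear mapping from source index to optical window direction; a real waveguide's mapping would be characterized by calibration, and the function and parameter names are hypothetical.

```python
def sources_for_window(center, width, n_sources=32):
    """Select which light sources of an array to operate so that the
    viewing window has a given normalised center position and width.

    `center` and `width` are in the range 0..1 across the lateral
    extent of the array; the linear index-to-direction mapping is an
    illustrative assumption.
    """
    lo = max(0, int((center - width / 2) * n_sources))
    hi = min(n_sources, int((center + width / 2) * n_sources) + 1)
    return list(range(lo, hi))


# A central, narrow viewing window uses a small central group of
# sources; widening the window adds sources on either side.
narrow = sources_for_window(0.5, 0.1)
wide = sources_for_window(0.5, 0.5)
assert narrow == [14, 15, 16, 17]
assert set(narrow) <= set(wide)
```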
The waveguide 5 including extraction features 13 may be replaced by a waveguide of the type disclosed for example in U.S. Pat. No. 7,970,246, which is herein incorporated by reference in its entirety, and which may be referred to as a “wedge type directional backlight”.
The directional display device 1 forms part of a directional display apparatus 20 as shown in
The control circuit 22 may be implemented by a processor executing a suitable program, although optionally some functions of the control circuit 22 may be implemented by dedicated hardware.
The driver circuit 23 drives the light sources 4 by supplying a drive signal to each light source 4. In a conventional manner, the driver circuit 23 includes appropriate electronic circuitry to generate drive signals of sufficient power to drive the light sources 4.
The control circuit 22 controls the spatial light modulator 3 to display an image.
The control circuit 22 also controls the driver circuit 23 to drive the light sources 4. The light sources 4 are thus operated to output light with a variable luminous flux in accordance with the respective drive signal. The driver circuit 23 is supplied with a luminous flux profile from the control circuit 22, the luminous flux profile being a control signal that represents the desired luminous flux of each light source 4 across the array. Typically, the luminous flux profile represents the desired luminous fluxes in relative terms, not absolute terms. The driver circuit 23 generates drive signals for each light source 4 in accordance with the luminous flux profile. Thus, the luminous flux profile effectively represents the shape of the viewing window, including its width and position.
The control circuit 22 supplies different luminous flux profiles to the driver circuit 23 in different modes. Changing between modes may occur instantaneously, or occur over several frames of the image display in order to provide a more comfortable viewing performance for a viewer. Examples of luminous flux profiles supplied in different modes will now be described with reference to
As an alternative way of increasing the width of the viewing window compared to the first luminous flux profile, it is possible to increase the number of light sources 4 in the central group of light sources 4 that are operated with maximum luminous flux. In that case, the roll-off of the luminous flux of the light sources 4 on either side of the central group may be the same as in the first luminous flux profile, or may additionally be less steep than in the first luminous flux profile (as in the second luminous flux profile).
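By way of non-limitative illustration only, the following Python sketch generates such luminous flux profiles, with a central group of sources driven at maximum flux and a linear roll-off on either side. The linear roll-off and all numerical values are illustrative assumptions; the disclosure leaves the exact profile shape to the implementation.

```python
def luminous_flux_profile(n_sources, center, plateau, rolloff):
    """Generate a relative luminous flux profile: a central group of
    `plateau` sources at maximum flux, rolling off linearly to zero
    over `rolloff` sources on either side (all in index units).
    """
    profile = []
    for i in range(n_sources):
        d = abs(i - center) - plateau / 2
        if d <= 0:
            profile.append(1.0)                # central group, max flux
        elif d < rolloff:
            profile.append(1.0 - d / rolloff)  # roll-off region
        else:
            profile.append(0.0)                # source not operated
    return profile


# A steeper roll-off gives a narrower viewing window; a shallower
# roll-off (or a larger central group) gives a wider one.
first = luminous_flux_profile(32, center=16, plateau=4, rolloff=2)
second = luminous_flux_profile(32, center=16, plateau=4, rolloff=6)
assert sum(second) > sum(first)
```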
To make the position of the viewing window adjustable, the first and second luminous flux profiles may be changed to move the viewing window from the central position as shown in
The arrangement of the directional display device 1 described above is given as an example, but the directional display device 1 may alternatively be of any other type that is capable of directing a displayed image into a viewing window of variable width and/or of variable position.
The directional display apparatus 20 includes a camera system 24 arranged to capture an image of the viewing region, including the viewing plane. The camera system 24 may include a single camera or plural cameras.
Where the camera system 24 includes plural cameras, the following considerations may apply. The control circuit 22 may find the spatial relationship of the cameras in a calibration step using a reference image, or the spatial relationship may be specified a priori. The cameras face towards the viewing region, their orientation depending on the field of view of each camera. One example of determining the orientation of a set of cameras is to maximize the combined field of view of all the cameras. The cameras may have different fields of view.
The cameras may have different sensing modalities. Examples are RGB data, infrared data, Time-of-Flight data, and push-broom data. Generally, a constraint is that the output of the camera system allows for angular localization of the observations with respect to the directional display device 1. One way to provide angular localization is to use the position and orientation of a camera with respect to the directional display device 1 and to back-project the observed location (image measurement), yielding a line of sight along which the observation is positioned.
One way to estimate spatial localization in front of the directional display device 1 is to use the expected spatial extent of an object, and the extent of the measurement in any of the images of the camera system 24. Using the angular localization described above, the expected spatial extent can be related to the observed extent to yield a distance estimate. Another way to estimate spatial localization is to use the relative position of more than one camera to triangulate two image measurements.
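By way of non-limitative illustration only, the following Python sketch shows both estimation steps under a simple pinhole camera model with a symmetric horizontal field of view: back-projection of an image measurement to a viewing angle, and a distance estimate relating the observed extent of a face to its expected physical extent. The pinhole model and the 0.15 m expected face width are illustrative assumptions.

```python
import math


def angular_localization(x_pixel, image_width, fov_degrees):
    """Back-project a horizontal image measurement to a viewing angle
    relative to the camera's optical axis (pinhole model)."""
    f = (image_width / 2) / math.tan(math.radians(fov_degrees / 2))
    return math.degrees(math.atan((x_pixel - image_width / 2) / f))


def distance_from_extent(observed_pixels, expected_metres,
                         image_width, fov_degrees):
    """Estimate distance by relating an object's observed extent in
    pixels to its expected physical extent in metres."""
    f = (image_width / 2) / math.tan(math.radians(fov_degrees / 2))
    return expected_metres * f / observed_pixels


# A face spanning 100 pixels in a 1280-pixel, 60-degree camera is
# roughly 1.7 m away, assuming a 0.15 m face width.
d = distance_from_extent(100, 0.15, 1280, 60)
assert 1.5 < d < 1.9
```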
The output of the camera system is a set of images (measurements allowing angular localization), together with the orientation and position of the cameras with respect to the directional display device 1, the sensing modalities, and the internal parameters of the sensors (focal length, optical axis) that may be used for angular localization.
The directional display apparatus 20 further includes a tag sensor 25 arranged to detect electromagnetic tags that allow identification of a viewer in the vicinity of the directional display device 1. The tag sensor 25 produces an output signal that may include a unique identifier derived from the detected tag.
The tag sensor 25 may be any sensor capable of providing such identification electromagnetically. One example is a tag sensor 25 using RFID (radio frequency identification) technology. In that case, RFID tags may be provided in an object such as a badge worn by viewers. Another example is a tag sensor 25 using low power Bluetooth or the MAC address of a WiFi device, in which case the tag sensor 25 may sense this data wirelessly, e.g. using an RFID sensor, a Bluetooth device, or a WiFi device.
The directional display apparatus 20 further includes a motion sensor 26 that detects motion of the directional display device 1, and hence effectively detects relative motion between the viewer and the directional display device 1. The motion sensor 26 produces an output signal representing the detected motion.
The motion sensor 26 may be of any suitable type for detecting motion. By way of example, the motion sensor 26 may be a gyroscope, an IMU (inertial measurement unit), or a differential GPS (global positioning system) device. The spatial resolution of the motion sensor 26 is typically less than 10 cm, and may be less than 1 cm.
The directional display apparatus 20 further includes a location sensor 27 that determines the geographical location of the directional display device 1. The location sensor 27 produces an output signal representing the determined location.
The location sensor 27 may be of any suitable type, typically providing an absolute location, allowing localization of the directional display device 1 on Earth. In one example, the location sensor 27 may be a GPS sensor.
The directional display apparatus 20 further includes a proximity sensor 28 that detects the proximity of an object in front of the directional display device 1, typically in the form of a scalar value indicating the distance of the detected object. The proximity sensor 28 may be of any suitable type. By way of example, the proximity sensor 28 may be an IR (infra-red) sensor, a sonar sensor or an ambient light sensor.
Each of the camera system 24, the tag sensor 25, the motion sensor 26, the location sensor 27, and the proximity sensor 28 supplies output signals to the control circuit 22 continuously, although the rates may vary between the different components: typically above 1 Hz for at least the camera system 24 and the motion sensor 26, but perhaps slower for the tag sensor 25, the location sensor 27, and the proximity sensor 28.
There will now be described some functional modules of the control circuit 22. Each functional module provides a particular function and may be implemented in software executed by a processor.
The directional display apparatus 20 further includes a list 29 of system processes being performed by the control circuit 22, the list 29 being stored in a memory of the control circuit 22.
One functional module of the control circuit 22 is a pattern generator 30 which generates a luminous flux profile that is supplied to the driver circuit 23. The generated luminous flux profile is selected in accordance with a mode of operation determined as described further below.
Another functional module of the control circuit 22 is an identification module 31 arranged as shown in
The identification module 31 includes a viewer tracking module 32 arranged as shown in
The viewer tracking module 32 includes a viewer detection module 33 and a tracking module 34 that are each supplied with the output signal from the camera system 24, as well as a data association module 35 that is supplied with outputs from the viewer detection module 33 and the tracking module 34.
The viewer detection module 33 analyzes the image captured by the camera system 24 to detect any viewers in the image. The viewer detection module 33 may use conventional detection algorithms for this purpose, typically detecting faces. In one example, the viewer detection module 33 may perform the detection using Haar feature cascades, for example as disclosed in Viola and Jones, “Rapid object detection using a boosted cascade of simple features”, CVPR 2001, which is herein incorporated by reference in its entirety.
The tracking module 34 analyzes the image captured by the camera system 24 to determine the position of viewers in the image. The tracking module 34 may use conventional detection algorithms for this purpose, typically tracking heads. In one example, the tracking module 34 may use the approach of Active Appearance Models to provide the position of the head of the viewer, for example as disclosed in Cootes, Edwards, and Taylor, “Active appearance models”, ECCV, 2:484-498, 1998, which is herein incorporated by reference in its entirety.
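By way of non-limitative illustration only, the following Python sketch shows face detection with a Haar feature cascade of the kind described by Viola and Jones, using OpenCV's bundled frontal-face cascade. The use of OpenCV and of this particular cascade file are illustrative assumptions; the disclosure does not mandate a particular detector.

```python
import cv2  # OpenCV, assumed available (pip install opencv-python)


def detect_viewer_faces(frame_bgr):
    """Detect candidate viewer faces in a captured frame using a Haar
    feature cascade. Returns a list of (x, y, w, h) bounding boxes.
    """
    # OpenCV ships a pre-trained frontal-face cascade with its data.
    cascade_path = (cv2.data.haarcascades
                    + "haarcascade_frontalface_default.xml")
    detector = cv2.CascadeClassifier(cascade_path)
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1,
                                      minNeighbors=5, minSize=(40, 40))
    return list(faces)
```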
Tracking of a viewer may be stopped if the viewer leaves the field of view of the camera system 24, or becomes occluded for a given time span, or fails to be tracked for a given time span.
The data association module 35 associates the viewers detected by the viewer detection module 33 with the viewers tracked by the tracking module 34. The data association module 35 may start a tracking process for each viewer detected by the viewer detection module 33 that is not a false positive, and that does not overlap more than a given threshold with the currently tracked viewers. In this sense, a false positive is defined as a detection that has not been observed in at least a certain percentage of frames.
In one example of the operation of the data association module 35, only a single viewer, treated as the primary viewer, is tracked. Detections of viewers from the viewer detection module 33 are assigned a position, and assigned the same identifier as previously detected observations of a viewer having sufficient spatial overlap. In this example, the viewer tracking module 32 stores a history of past appearances of a viewer and uses this as a robustness measure in deciding whether to remove a viewer from the list of tracked viewers. In respect of each previously tracked viewer, if the viewer is not successfully tracked in the current frame or if there is not enough overlap of this viewer's most recent bounding box with any current detection, a count of untracked frames for this viewer is incremented. If the count of untracked frames is greater than a certain threshold, this viewer is removed from the list of tracked viewers. Otherwise, the tracking module 34 is supplied with the current detection position from the viewer detection module 33 and tracks the viewer who is in that position.
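By way of non-limitative illustration only, the single-viewer association logic described above may be sketched in Python as follows: a detection is re-associated with the tracked viewer when its bounding box overlaps sufficiently, and otherwise a count of untracked frames is incremented until a removal threshold is exceeded. The overlap measure (intersection-over-union) and the threshold values are illustrative assumptions.

```python
def overlap_ratio(a, b):
    """Intersection-over-union of two (x, y, w, h) bounding boxes."""
    iw = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    ih = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union else 0.0


def update_track(track, detections, iou_threshold=0.3, max_untracked=10):
    """One frame of single-viewer data association.

    `track` is a dict {"box": (x, y, w, h), "untracked": int}.
    Returns the updated track, or None when the viewer is removed
    from the list of tracked viewers.
    """
    for det in detections:
        if overlap_ratio(track["box"], det) >= iou_threshold:
            return {"box": det, "untracked": 0}  # re-associated
    if track["untracked"] + 1 > max_untracked:
        return None                              # remove viewer
    return {"box": track["box"], "untracked": track["untracked"] + 1}
```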
In another example of the operation of the data association module 35, all viewers are tracked, including a primary viewer and secondary viewers. This example may use a Joint Probabilistic Data Association Filter (JPDAF) to predict which detection to assign to which viewer, for example as disclosed in Bar-Shalom and Li “Multitarget-Multisensor Tracking: Principles and Techniques” 1995, which is herein incorporated by reference in its entirety.
The identification module 31 also includes an identity lookup module 36 and an authentication module 37. The viewer tracking module 32 supplies a first output to the identity lookup module 36 that includes, for each viewer detected by the viewer detection module 33 (if any), a unique identifier and image information derived from the captured image. The viewer tracking module 32 also supplies a second output to the authentication module 37 that includes, for each viewer detected by the viewer detection module 33 (if any), a unique identifier and the position of the viewer detected by the tracking module 34.
The identity lookup module 36 receives the first output from the viewer tracking module 32 and the output signal from the tag sensor 25 (or alternatively just one of those). On the basis of these signals, the identity lookup module 36 attempts to identify any viewers and derives viewer permission information in respect of successfully identified viewers.
The identity lookup module 36 uses a database storing viewer permission information related to individual viewers. The viewer permission information may be associated with image information for the viewers and with identifiers for the viewers. The database may be local to the directional display apparatus 20, for example stored in a memory thereof, or may be remote, in which case it may be accessed via a network connection.
The identity lookup module 36 may use the first output from the viewer tracking module 32 that includes image information derived from the captured image by comparing that derived image information with the image information in the database. This comparison may use conventional image comparison techniques. In the event of a match, the viewer permission information associated with the matching image information in the database is retrieved.
The identity lookup module 36 may use the output signal from the tag sensor 25 that includes an identifier (i.e. the determined identity) of each detected viewer derived from the detected tag by comparing the derived identifier with identifiers in the database. In the event of a match, the viewer permission information associated with the matching identifier in the database is retrieved.
Where the identity lookup module 36 uses both the first output from the viewer tracking module 32 and the output signal from the tag sensor 25, the information from each technique is combined to provide a union of the viewer permission information provided by each technique. Alternatively only one of the techniques may be used to provide viewer permission information.
In either case, where the viewer is on the database, the viewer is identified and the identities of the viewers and their viewer permission information are retrieved and supplied to the authentication module 37.
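By way of non-limitative illustration only, the following Python sketch combines the two lookup techniques, taking the union of the viewer permission information provided by each, as described above. The database layout (a mapping from viewer identifiers to sets of permissions) is an illustrative assumption.

```python
def lookup_permissions(face_matches, tag_ids, database):
    """Union of the viewer permission information from image-based
    matching and tag-based identification. Viewers not present in the
    database remain unidentified and are omitted.
    """
    permissions = {}
    for viewer_id in list(face_matches) + list(tag_ids):
        if viewer_id in database:
            permissions.setdefault(viewer_id, set()).update(
                database[viewer_id])
    return permissions


db = {"alice": {"view"}, "bob": set()}
# "alice" matched by image comparison, "bob" identified by an RFID
# badge; "carol" is not in the database and remains unidentified.
assert lookup_permissions(["alice"], ["bob", "carol"], db) == {
    "alice": {"view"}, "bob": set()}
```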
The identities of the viewers and their viewer permission information are also supplied to a context module 38 that is arranged as shown in
The authentication rule may take account of the viewer permission information of the viewers, as determined by the identity lookup module 36. The authentication rule may also take account of other information. Generally speaking, the authentication rule may take account of any information that is available, thus providing significant power in the authentication process. Some non-limitative examples of the authentication rule that may be implemented in the context module 38 are as follows. These and other examples may be used individually, or in any combination.
The authentication rule may also take account of the viewer permission information in various alternative ways.
In one simple example, the viewer permission information may specify that given viewers are authorized or not. In that case, the authentication rule may be to permit viewing in respect of viewers who are both present in the database and authorized (i.e. a “white list”, wherein unknown viewers are forbidden from viewing). Alternatively, the authentication rule may be to permit viewing in respect of viewers unless they are present in the database and not authorized (i.e. a “black list”, wherein unknown viewers are permitted to view).
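By way of non-limitative illustration only, the white list and black list rules just described may be expressed in Python as follows; the database layout is an illustrative assumption.

```python
def permitted(viewer_id, database, rule="white_list"):
    """Apply the simple authentication rules described above.

    "white_list": only viewers present in the database and marked
    authorized may view; unknown viewers are forbidden.
    "black_list": viewing is permitted unless the viewer is present
    in the database and marked unauthorized.
    """
    entry = database.get(viewer_id)  # None if the viewer is unknown
    if rule == "white_list":
        return entry is not None and entry["authorized"]
    if rule == "black_list":
        return entry is None or entry["authorized"]
    raise ValueError("unknown rule")


db = {"alice": {"authorized": True}, "mallory": {"authorized": False}}
assert permitted("alice", db, "white_list")
assert not permitted("carol", db, "white_list")  # unknown: forbidden
assert permitted("carol", db, "black_list")      # unknown: permitted
```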
The authentication rule may be to permit individuals to view or not on the basis only of their own viewer permission information, i.e. applying the viewer permission information on an individual basis. Alternatively, the viewer permission information may take account of the set of viewers who are present, i.e. applying the viewer permission information on a group basis. For example, the viewer permission information for one individual may provide (or forbid) permission for all viewers present, in which case the authentication rule may be to permit (or forbid) viewing by all viewers present on the basis of the presence of an individual having such viewer permission information. By way of example, this may allow a senior individual in an organization to authorize viewing by others.
The viewer permission information may indicate relationships between the viewers. In that case, the authentication rule may take account of those indicated relationships. For example where the primary viewer who is first observed by the viewer tracking module 32 is permitted to view an image, then the authentication rule may be to permit viewing by secondary viewers who are observed later, if they have a predetermined relationship with the primary viewer. By way of example, this may allow teams in an organization to view together.
As an alternative, the authentication rule may provide set permissions on the basis of the order in which the viewers are observed by the viewer tracking module 32. This may occur without reference to the viewer permission information, or in combination with the viewer permission information. In one example, the primary viewer who is first observed by the viewer tracking module 32 may be permitted to view an image and secondary viewers who are subsequently observed by the viewer tracking module 32 may not be permitted to view an image, irrespective of their identity. In another example, the primary viewer who is first observed by the viewer tracking module 32 may be permitted to view an image and secondary viewers who are subsequently observed by the viewer tracking module 32 may be permitted or not on the basis of their viewer permission information.
The authentication rule may take account of image information concerning the viewers that is derived from the image captured by the camera system 24. For example, the width of the faces of any viewers may be determined as indicating the distance of the viewers from the directional display device 1.
In one example, the authentication rule may allow or forbid viewing in dependence on the determined width of the faces of the viewers, for example only permitting viewers for whom the width is above a threshold taken to indicate that the viewers are close to the directional display device 1.
In another example, a primary viewer who is first observed by the viewer tracking module 32 is permitted to view an image irrespective of the output of the identity lookup module 36 and secondary viewers who are subsequently observed by the viewer tracking module 32 are not permitted. If the primary viewer ceases to be observed, but later a viewer reappears in a similar location and having a similar width, then the reappearing viewer is permitted viewing, i.e. on the assumption that it is likely to be the same individual. However, after a predetermined timeout after the primary viewer ceases to be observed, the authentication rule is reset such that the next viewer who is observed is taken as the primary viewer.
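By way of non-limitative illustration only, the reappearance heuristic just described may be sketched in Python as follows. The 20% similarity tolerance and 30 second timeout are illustrative assumptions; the disclosure specifies only that the location and width be similar and that a predetermined timeout apply.

```python
def reappearing_primary(last_seen, candidate, now,
                        tolerance=0.2, timeout=30.0):
    """Decide whether a newly observed viewer is the returning primary
    viewer: similar normalised position and similar face width, within
    the timeout since the primary viewer was last observed.
    """
    if now - last_seen["timestamp"] > timeout:
        return False  # timed out: the next viewer becomes primary
    similar_position = (abs(candidate["position"] - last_seen["position"])
                        <= tolerance)
    similar_width = (abs(candidate["width"] - last_seen["width"])
                     <= tolerance * last_seen["width"])
    return similar_position and similar_width


last = {"position": 0.5, "width": 0.15, "timestamp": 100.0}
assert reappearing_primary(last, {"position": 0.52, "width": 0.14}, now=110.0)
assert not reappearing_primary(last, {"position": 0.9, "width": 0.14}, now=110.0)
assert not reappearing_primary(last, {"position": 0.52, "width": 0.14}, now=200.0)
```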
The context module 38 may also be supplied with any or all of output signals from the camera system 24, output signals from the location sensor 27 and the list 29 of system processes, which may be applied in the authentication rule.
In some examples, the authentication rule may decide whether viewers are permitted viewing in dependence on information about the location of the directional display device 1. The authentication rule may take account of the location alone, for example permitting or forbidding viewing in particular locations, or in combination with the viewer permission information, for example by the viewer permission information being location-specific.
In one type of example using location, the information about the location of the directional display device 1 may be the geographical location of the directional display device 1 represented by the output signal from the location sensor 27.
In another type of example, the information about the location of the directional display device 1 may be derived from the image captured by the camera system 24 in a location identification module 40. Any suitable image analysis technique that provides information about the scene of the captured image may be applied. For example, the captured image may be classified using a suitably trained image classification system, e.g. as disclosed in Karen Simonyan & Andrew Zisserman, “Very Deep Convolutional Networks For Large-Scale Image Recognition”, ICLR 2015, which is herein incorporated by reference in its entirety, or using Bag-of-Words indexing of suitable sparse features. The output may be a label describing the environment in the location of the directional display device 1. Such techniques may for example indicate the location as being in a car, in a domestic home, or a workplace, etc.
In some examples, the authentication rule may decide whether viewers are permitted viewing in dependence on the list 29 of system processes. For example, the label assigned to the viewers can be adjusted according to the use case of the directional display device, including the software being executed and/or the nature of the image being viewed. In one example, where the image is deemed to be non-sensitive, for example a film, all viewers may be permitted viewing. This may be used without reference to the viewer permission information, or in combination with the viewer permission information. In one example, the viewer permission information may be specific to certain software being executed.
The authentication rule may take account of the time and/or date.
The output of the context module 38 that indicates whether the viewers are permitted to view the displayed image is supplied to the authentication module 37 of the identification module 31.
The authentication module 37 uses the second output of the viewer tracking module 32 that includes, for each viewer detected by the viewer detection module 33 (if any), a unique identifier and the position of the viewer detected by the tracking module 34, together with the output of the context module 38, to decide whether viewers are permitted to view the displayed image. On the basis of that decision, the authentication module 37 sets the mode of operation causing selection of the luminous flux profile.
The authentication module 37 may perform this operation of setting the mode of operation in accordance with the flow chart shown in
In step S1, a default mode of operation is set. This may be the wide angle mode, for example as shown in
In step S2, the first viewer to be observed, as identified by the second output of the viewer tracking module 32, is detected and assigned as the primary viewer.
In step S3, it is determined whether the primary viewer is permitted viewing of the displayed image, as indicated by the output of the context module 38. If not, then in step S4, the blank mode is set, and the method pauses until the primary viewer ceases to be observed, after which the method reverts to step S2.
As an alternative, step S3 may be replaced by a step of setting the wide angle mode (if that is not already the default mode) and the method continues to step S6 described below. In this alternative, the primary viewer is always permitted to view the displayed image.
If it is determined in step S3 that the primary viewer is permitted viewing, then in step S5, the wide angle mode is set, if it has not already been set in step S1. Thus, at this stage the wide angle mode is used, providing a wide viewing angle.
In step S6, any further viewers to be observed, as identified by the second output of the viewer tracking module 32, are detected and assigned as secondary viewers.
In step S7, it is determined whether the secondary viewer assigned in step S6 is permitted viewing of the displayed image, as indicated by the output of the context module 38. If not, then in step S8, the privacy mode is set. In this example, which involves adjustment by changing the width of the viewing window, the width of the viewing window is decreased in the privacy mode set in step S8, for example as shown in
If it is determined in step S7 that the secondary viewer is permitted viewing, then the method reverts to step S6 to detect any further secondary viewers. If further secondary viewers are detected, then the method repeats step S7 in case the additional secondary viewer affects the decision whether to permit viewing.
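By way of non-limitative illustration only, the mode-setting logic of steps S1 to S8 may be condensed into the following Python sketch, in which the permission decisions are taken to have been supplied by the context module 38.

```python
def set_mode(primary_permitted, secondary_permissions):
    """Condensed form of steps S1-S8: blank mode if the primary viewer
    is not permitted, privacy mode if any secondary viewer is not
    permitted, wide angle mode otherwise.

    `secondary_permissions` is an iterable of booleans, one per
    detected secondary viewer.
    """
    if not primary_permitted:
        return "blank"        # step S4
    if not all(secondary_permissions):
        return "privacy"      # step S8
    return "wide_angle"       # steps S5/S6


assert set_mode(True, []) == "wide_angle"
assert set_mode(True, [True, False]) == "privacy"
assert set_mode(False, [True]) == "blank"
```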
The authentication module 37 provides an output indicating the set mode of operation to the pattern generator 30 as the basis for generating the luminous flux profile that is supplied to the driver circuit 23. As well as selecting between luminous flux profiles in the wide angle mode and the privacy mode, the pattern generator 30 selects other aspects of the luminous flux profile as follows.
The pattern generator 30 is supplied with the second output of the viewer tracking module 32, via the authentication module 37. In the privacy mode, the pattern generator 30 shifts the luminous flux profile, for example as shown in
Another functional module of the control circuit 22 is a proximity module 41 arranged as shown in
The parameters may include a depth estimate of the primary viewer derived from the output of the proximity sensor 28. In one example, where only the output signal of the proximity sensor 28 is used and the proximity sensor 28 yields a single depth value, the signal filter 42 includes a low-pass filter which removes high-frequency noise, e.g. due to noise in the proximity sensor 28, to derive the depth estimate. Such a depth estimate may alternatively be derived from the second output of the viewer tracking module 32.
The pattern generator 30 controls the luminous flux profile in accordance with the parameters supplied from the proximity module 41. For example, in the case that the parameters include a depth estimate of the primary viewer, the width of the luminous flux profile may be adjusted in accordance with the depth estimate, typically to widen the viewing window with increasing proximity of the primary viewer.
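By way of non-limitative illustration only, the following Python sketch shows a single-pole low-pass filter of the kind the signal filter 42 may apply to the proximity sensor output, together with a rule widening the viewing window with increasing proximity of the primary viewer. The filter coefficient and the inverse-with-depth law are illustrative assumptions.

```python
def low_pass(previous, measurement, alpha=0.2):
    """Exponential moving average: removes high-frequency noise from
    the proximity sensor's depth readings."""
    return previous + alpha * (measurement - previous)


def window_width_for_depth(depth_m, width_at_1m=0.3):
    """Widen the viewing window as the primary viewer approaches; the
    width is capped at the full normalised range."""
    return min(1.0, width_at_1m / max(depth_m, 0.1))


# A noisy stream of depth readings is smoothed before use.
estimate = 1.0
for reading in [1.0, 1.4, 0.9, 1.1, 1.0]:
    estimate = low_pass(estimate, reading)
assert 0.9 < estimate < 1.2
assert window_width_for_depth(0.5) > window_width_for_depth(2.0)
```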
Another functional module of the control circuit 22 is a motion sensing module 43 arranged as shown in
The parameters may include a parameter representing relative movement between the primary viewer and the directional display device 1.
The parameter may be derived from the output signal of the motion sensor 26. As this represents the detected motion of the directional display apparatus, it effectively represents relative movement between the primary viewer and the directional display device 1.
Additionally or instead, the parameter may be derived from the second output of the viewer tracking module 32. As this indicates the position of the viewers determined from analysis of the image captured by the camera system 24, a change in the position so derived represents relative movement between the primary viewer and the directional display device 1.
The relative movement represented by the parameter may include a linear motion of the viewer relative to the directional display device 1 laterally of the viewing window. By way of example, the parameter may represent the velocity of this motion.
Additionally or instead, the relative movement represented by the parameter may include vibratory movement of the directional display device 1 relative to the viewer. By way of example, the parameter may represent a covariance, for example a covariance matrix, of the velocity of movement. In one example where the output signal of the motion sensor 26 is used and where the output signal of the motion sensor 26 represents acceleration, the signal filter 44 may be a high-pass filter which removes low-frequency noise, e.g. due to slow velocity changes.
The pattern generator 30 controls the luminous flux profile in accordance with the parameters supplied from the motion sensing module 43. In particular, in the privacy mode, the width of the luminous flux profile is increased in response to the parameters indicating detection of relative movement between the primary viewer and the directional display device 1. The width may be increased as described above, for example by changing from the first luminous flux profile of
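By way of non-limitative illustration only, the following Python sketch shows a single-pole high-pass filter of the kind the signal filter 44 may apply to an accelerometer signal, together with a rule increasing the viewing window width with the magnitude of the detected relative movement. The filter form and the linear gain are illustrative assumptions.

```python
def high_pass(prev_output, prev_input, measurement, alpha=0.8):
    """First-order high-pass filter: passes vibratory components and
    rejects slow (low-frequency) velocity changes."""
    return alpha * (prev_output + measurement - prev_input)


def widen_for_motion(base_width, motion_magnitude, gain=0.5, max_width=1.0):
    """Increase the viewing window width in response to detected
    relative movement, reducing perceived brightness fluctuations."""
    return min(max_width, base_width * (1.0 + gain * motion_magnitude))


# A constant (DC) input decays towards zero at the filter output.
y, x_prev = 0.0, 0.0
for x in [1.0, 1.0, 1.0, 1.0]:
    y = high_pass(y, x_prev, x)
    x_prev = x
assert abs(y) < 0.6

assert widen_for_motion(0.3, 0.0) == 0.3  # no motion: width unchanged
assert widen_for_motion(0.3, 1.0) > 0.3   # motion: wider window
```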
Thus, the above example relates to a case where the position and width of the viewing window are variable, wherein the position of the viewing window is controlled to track the determined position of the viewer, and wherein the viewing window is adjusted in response to detecting the presence of one or more secondary viewers by reducing the width of the viewing window (in step S8). However, various modifications are possible. Some non-limitative examples of possible modifications are as follows.
A first possible modification is that only the width of the viewing window is variable, not its position. In that case, the position of the viewing window is not controlled to track the determined position of the viewer, but the viewing window may still be adjusted in response to detecting the presence of one or more secondary viewers by decreasing the width of the viewing window.
A second possible modification is that the viewing window may be adjusted in response to detecting the presence of one or more secondary viewers by shifting the position of the viewing window away from the secondary viewer, instead of decreasing the width of the viewing window. This may be achieved by modifying the operation in accordance with the flow chart shown in
Firstly, step S5 is modified so that, instead of wide angle mode being set, the first luminous flux profile of the privacy mode is set, for example as shown in
Secondly, step S8 is modified so that the first luminous flux profile of the privacy mode is set but with a shifted position, so that the position of the viewing window is shifted away from the position of the secondary viewer, as determined by the viewer tracking module 32. This adjustment may be made without changing the width of the viewing window (although optionally the width of the viewing window could additionally be decreased). The shift of position is chosen so that the image is still visible to the primary viewer. However, by shifting the viewing window away from the secondary viewer the visibility of the image to the secondary viewer is reduced.
In this second modification, the position of the viewing window may continue to be controlled to track the determined position of the viewer, as described above.
Also incorporated by reference herein in their entireties are U.S. Patent Publ. No. 2013/0321599, U.S. Patent Publ. No. 2015/0378085, and U.S. patent application Ser. No. 15/165,960.
While various embodiments in accordance with the principles disclosed herein have been described above, it should be understood that they have been presented by way of example only, and not limitation. Thus, the breadth and scope of this disclosure should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with any claims and their equivalents issuing from this disclosure. Furthermore, the above advantages and features are provided in described embodiments, but shall not limit the application of such issued claims to processes and structures accomplishing any or all of the above advantages.
Additionally, the section headings herein are provided for consistency with the suggestions under 37 CFR 1.77 or otherwise to provide organizational cues. These headings shall not limit or characterize the embodiment(s) set out in any claims that may issue from this disclosure. Specifically and by way of example, although the headings refer to a “Technical Field,” the claims should not be limited by the language chosen under this heading to describe the so-called field. Further, a description of a technology in the “Background” is not to be construed as an admission that certain technology is prior art to any embodiment(s) in this disclosure. Neither is the “Summary” to be considered as a characterization of the embodiment(s) set forth in issued claims. Furthermore, any reference in this disclosure to “invention” in the singular should not be used to argue that there is only a single point of novelty in this disclosure. Multiple embodiments may be set forth according to the limitations of the multiple claims issuing from this disclosure, and such claims accordingly define the embodiment(s), and their equivalents, that are protected thereby. In all instances, the scope of such claims shall be considered on their own merits in light of this disclosure, but should not be constrained by the headings set forth herein.
This application claims priority to U.S. Provisional Patent Appl. No. 62/246,584, entitled “Intelligent privacy system, apparatus, and method thereof” filed Oct. 26, 2015 and to U.S. Provisional Patent Appl. No. 62/261,151, entitled “Intelligent privacy system, apparatus, and method thereof” filed Nov. 30, 2015, which are both herein incorporated by reference in their entireties.
Number | Name | Date | Kind |
---|---|---|---|
1128979 | Hess | Feb 1915 | A |
1970311 | Ives | Aug 1934 | A |
2133121 | Stearns | Oct 1938 | A |
2247969 | Lemuel | Jul 1941 | A |
2480178 | Zinberg | Aug 1949 | A |
2810905 | Barlow | Oct 1957 | A |
3409351 | Winnek | Nov 1968 | A |
3715154 | Bestenreiner | Feb 1973 | A |
4057323 | Ward | Nov 1977 | A |
4528617 | Blackington | Jul 1985 | A |
4542958 | Young | Sep 1985 | A |
4621898 | Cohen | Nov 1986 | A |
4804253 | Stewart | Feb 1989 | A |
4807978 | Grinberg et al. | Feb 1989 | A |
4829365 | Eichenlaub | May 1989 | A |
4914553 | Hamada et al. | Apr 1990 | A |
4974941 | Gibbons et al. | Dec 1990 | A |
5005108 | Pristash et al. | Apr 1991 | A |
5035491 | Kawagishi et al. | Jul 1991 | A |
5278608 | Taylor et al. | Jan 1994 | A |
5347644 | Sedlmayr | Sep 1994 | A |
5349419 | Taguchi et al. | Sep 1994 | A |
5459592 | Shibatani et al. | Oct 1995 | A |
5466926 | Sasano et al. | Nov 1995 | A |
5510831 | Mayhew | Apr 1996 | A |
5528720 | Winston et al. | Jun 1996 | A |
5581402 | Taylor | Dec 1996 | A |
5588526 | Fantone et al. | Dec 1996 | A |
5658490 | Sharp et al. | Aug 1997 | A |
5697006 | Taguchi et al. | Dec 1997 | A |
5703667 | Ochiai | Dec 1997 | A |
5715028 | Abileah et al. | Feb 1998 | A |
5727107 | Umemoto et al. | Mar 1998 | A |
5771066 | Barnea | Jun 1998 | A |
5796451 | Kim | Aug 1998 | A |
5808784 | Ando et al. | Sep 1998 | A |
5808792 | Woodgate et al. | Sep 1998 | A |
5835166 | Hall et al. | Nov 1998 | A |
5850580 | Taguchi et al. | Dec 1998 | A |
5875055 | Morishima et al. | Feb 1999 | A |
5894361 | Yamazaki et al. | Apr 1999 | A |
5896225 | Chikazawa | Apr 1999 | A |
5903388 | Sedlmayr | May 1999 | A |
5933276 | Magee | Aug 1999 | A |
5956001 | Sumida et al. | Sep 1999 | A |
5959664 | Woodgate | Sep 1999 | A |
5959702 | Goodman | Sep 1999 | A |
5969850 | Harrold et al. | Oct 1999 | A |
5971559 | Ishikawa et al. | Oct 1999 | A |
6008484 | Woodgate et al. | Dec 1999 | A |
6014164 | Woodgate et al. | Jan 2000 | A |
6023315 | Harrold et al. | Feb 2000 | A |
6044196 | Winston et al. | Mar 2000 | A |
6055013 | Woodgate et al. | Apr 2000 | A |
6055103 | Woodgate et al. | Apr 2000 | A |
6061179 | Inoguchi et al. | May 2000 | A |
6061489 | Ezra et al. | May 2000 | A |
6064424 | Berkel et al. | May 2000 | A |
6075557 | Holliman et al. | Jun 2000 | A |
6094216 | Taniguchi et al. | Jul 2000 | A |
6099758 | Verrall et al. | Aug 2000 | A |
6108059 | Yang | Aug 2000 | A |
6118584 | Berkel et al. | Sep 2000 | A |
6128054 | Schwarzenberger | Oct 2000 | A |
6144118 | Cahill et al. | Nov 2000 | A |
6144433 | Tillin et al. | Nov 2000 | A |
6172723 | Inoue et al. | Jan 2001 | B1 |
6199995 | Umemoto et al. | Mar 2001 | B1 |
6204904 | Tillin et al. | Mar 2001 | B1 |
6219113 | Takahara | Apr 2001 | B1 |
6222672 | Towler et al. | Apr 2001 | B1 |
6224214 | Martin et al. | May 2001 | B1 |
6232592 | Sugiyama | May 2001 | B1 |
6256447 | Laine | Jul 2001 | B1 |
6262786 | Perlo et al. | Jul 2001 | B1 |
6295109 | Kubo et al. | Sep 2001 | B1 |
6302541 | Grossmann | Oct 2001 | B1 |
6305813 | Lekson et al. | Oct 2001 | B1 |
6335999 | Winston et al. | Jan 2002 | B1 |
6373637 | Gulick et al. | Apr 2002 | B1 |
6377295 | Woodgate et al. | Apr 2002 | B1 |
6392727 | Larson et al. | May 2002 | B1 |
6422713 | Fohl et al. | Jul 2002 | B1 |
6437915 | Moseley et al. | Aug 2002 | B2 |
6456340 | Margulis | Sep 2002 | B1 |
6464365 | Gunn et al. | Oct 2002 | B1 |
6476850 | Erbey | Nov 2002 | B1 |
6481849 | Martin et al. | Nov 2002 | B2 |
6654156 | Crossland et al. | Nov 2003 | B1 |
6663254 | Ohsumi | Dec 2003 | B2 |
6724452 | Takeda et al. | Apr 2004 | B1 |
6731355 | Miyashita | May 2004 | B2 |
6736512 | Balogh | May 2004 | B2 |
6801243 | Berkel | Oct 2004 | B1 |
6816158 | Lemelson et al. | Nov 2004 | B1 |
6825985 | Brown et al. | Nov 2004 | B2 |
6847354 | Vranish | Jan 2005 | B2 |
6847488 | Travis | Jan 2005 | B2 |
6859240 | Brown et al. | Feb 2005 | B1 |
6867828 | Taira et al. | Mar 2005 | B2 |
6870671 | Travis | Mar 2005 | B2 |
6883919 | Travis | Apr 2005 | B2 |
7052168 | Epstein et al. | May 2006 | B2 |
7058252 | Woodgate et al. | Jun 2006 | B2 |
7067985 | Adachi | Jun 2006 | B2 |
7073933 | Gotoh et al. | Jul 2006 | B2 |
7091931 | Yoon | Aug 2006 | B2 |
7101048 | Travis | Sep 2006 | B2 |
7136031 | Lee et al. | Nov 2006 | B2 |
7163319 | Kuo et al. | Jan 2007 | B2 |
7215391 | Kuan et al. | May 2007 | B2 |
7215415 | Maehara et al. | May 2007 | B2 |
7215475 | Woodgate et al. | May 2007 | B2 |
7227602 | Jeon et al. | Jun 2007 | B2 |
7239293 | Perlin et al. | Jul 2007 | B2 |
7365908 | Dolgoff | Apr 2008 | B2 |
7375886 | Lipton et al. | May 2008 | B2 |
7410286 | Travis | Aug 2008 | B2 |
7430358 | Qi et al. | Sep 2008 | B2 |
7492346 | Manabe et al. | Feb 2009 | B2 |
7524542 | Kim et al. | Apr 2009 | B2 |
7528893 | Schultz et al. | May 2009 | B2 |
7528913 | Kobayashi | May 2009 | B2 |
7545429 | Travis | Jun 2009 | B2 |
7587117 | Winston et al. | Sep 2009 | B2 |
7614777 | Koganezawa et al. | Nov 2009 | B2 |
7633586 | Winlow et al. | Dec 2009 | B2 |
7660047 | Travis et al. | Feb 2010 | B1 |
7750981 | Shestak et al. | Jul 2010 | B2 |
7750982 | Nelson et al. | Jul 2010 | B2 |
7766534 | Iwasaki | Aug 2010 | B2 |
7771102 | Iwasaki | Aug 2010 | B2 |
7834834 | Takatani et al. | Nov 2010 | B2 |
7944428 | Travis | May 2011 | B2 |
7970246 | Travis et al. | Jun 2011 | B2 |
7976208 | Travis | Jul 2011 | B2 |
7991257 | Coleman | Aug 2011 | B1 |
8016475 | Travis | Sep 2011 | B2 |
8098350 | Sakai et al. | Jan 2012 | B2 |
8154686 | Mather et al. | Apr 2012 | B2 |
8216405 | Emerton et al. | Jul 2012 | B2 |
8223296 | Lee et al. | Jul 2012 | B2 |
8237876 | Tan et al. | Aug 2012 | B2 |
8249408 | Coleman | Aug 2012 | B2 |
8251562 | Kuramitsu et al. | Aug 2012 | B2 |
8262271 | Tillin et al. | Sep 2012 | B2 |
8325295 | Sugita et al. | Dec 2012 | B2 |
8354806 | Travis et al. | Jan 2013 | B2 |
8477261 | Travis et al. | Jul 2013 | B2 |
8502253 | Min | Aug 2013 | B2 |
8534901 | Panagotacos et al. | Sep 2013 | B2 |
8556491 | Lee | Oct 2013 | B2 |
8646931 | Choi et al. | Feb 2014 | B2 |
8651725 | Ie et al. | Feb 2014 | B2 |
8714804 | Kim et al. | May 2014 | B2 |
8752995 | Park | Jun 2014 | B2 |
8801260 | Urano et al. | Aug 2014 | B2 |
8939595 | Choi et al. | Jan 2015 | B2 |
8973149 | Buck | Mar 2015 | B2 |
9195087 | Terashima | Nov 2015 | B2 |
9197884 | Lee et al. | Nov 2015 | B2 |
9274260 | Urano et al. | Mar 2016 | B2 |
9304241 | Wang et al. | Apr 2016 | B2 |
9324234 | Ricci et al. | Apr 2016 | B2 |
9448355 | Urano et al. | Sep 2016 | B2 |
9501036 | Kang et al. | Nov 2016 | B2 |
9519153 | Robinson et al. | Dec 2016 | B2 |
10054732 | Robinson et al. | Aug 2018 | B2 |
10126575 | Robinson et al. | Nov 2018 | B1 |
10228505 | Robinson et al. | Mar 2019 | B2 |
10303030 | Robinson et al. | May 2019 | B2 |
10401638 | Robinson et al. | Sep 2019 | B2 |
10488705 | Xu et al. | Nov 2019 | B2 |
10649248 | Jiang et al. | May 2020 | B1 |
10649259 | Lee et al. | May 2020 | B2 |
20010001566 | Moseley et al. | May 2001 | A1 |
20010050686 | Allen | Dec 2001 | A1 |
20020018299 | Daniell | Feb 2002 | A1 |
20020024529 | Miller et al. | Feb 2002 | A1 |
20020113246 | Nagai et al. | Aug 2002 | A1 |
20020113866 | Taniguchi et al. | Aug 2002 | A1 |
20020171793 | Sharp et al. | Nov 2002 | A1 |
20030046839 | Oda et al. | Mar 2003 | A1 |
20030089956 | Allen et al. | May 2003 | A1 |
20030107686 | Sato et al. | Jun 2003 | A1 |
20030117790 | Lee et al. | Jun 2003 | A1 |
20030133191 | Morita et al. | Jul 2003 | A1 |
20030137738 | Ozawa et al. | Jul 2003 | A1 |
20030137821 | Gotoh et al. | Jul 2003 | A1 |
20040008877 | Leppard et al. | Jan 2004 | A1 |
20040015729 | Elms et al. | Jan 2004 | A1 |
20040021809 | Sumiyoshi et al. | Feb 2004 | A1 |
20040042233 | Suzuki et al. | Mar 2004 | A1 |
20040046709 | Yoshino | Mar 2004 | A1 |
20040100598 | Adachi et al. | May 2004 | A1 |
20040105264 | Spero | Jun 2004 | A1 |
20040108971 | Waldern et al. | Jun 2004 | A1 |
20040109303 | Olczak | Jun 2004 | A1 |
20040125430 | Kasajima et al. | Jul 2004 | A1 |
20040135741 | Tomisawa et al. | Jul 2004 | A1 |
20040145703 | O'Connor et al. | Jul 2004 | A1 |
20040170011 | Kim et al. | Sep 2004 | A1 |
20040240777 | Woodgate et al. | Dec 2004 | A1 |
20040263968 | Kobayashi et al. | Dec 2004 | A1 |
20040263969 | Lipton et al. | Dec 2004 | A1 |
20050007753 | Hees et al. | Jan 2005 | A1 |
20050094295 | Yamashita et al. | May 2005 | A1 |
20050110980 | Maehara et al. | May 2005 | A1 |
20050111100 | Mather et al. | May 2005 | A1 |
20050117186 | Li et al. | Jun 2005 | A1 |
20050135116 | Epstein et al. | Jun 2005 | A1 |
20050157225 | Toyooka et al. | Jul 2005 | A1 |
20050174768 | Conner | Aug 2005 | A1 |
20050180167 | Hoelen et al. | Aug 2005 | A1 |
20050190326 | Jeon et al. | Sep 2005 | A1 |
20050190329 | Okumura | Sep 2005 | A1 |
20050190345 | Dubin et al. | Sep 2005 | A1 |
20050219693 | Hartkop et al. | Oct 2005 | A1 |
20050237488 | Yamasaki et al. | Oct 2005 | A1 |
20050254127 | Evans et al. | Nov 2005 | A1 |
20050264717 | Chien et al. | Dec 2005 | A1 |
20050274956 | Bhat | Dec 2005 | A1 |
20050276071 | Sasagawa et al. | Dec 2005 | A1 |
20050280637 | Ikeda et al. | Dec 2005 | A1 |
20060012845 | Edwards | Jan 2006 | A1 |
20060056166 | Yeo et al. | Mar 2006 | A1 |
20060082702 | Jacobs et al. | Apr 2006 | A1 |
20060114664 | Sakata et al. | Jun 2006 | A1 |
20060132423 | Travis | Jun 2006 | A1 |
20060139447 | Unkrich | Jun 2006 | A1 |
20060158729 | Vissenberg et al. | Jul 2006 | A1 |
20060176912 | Anikitchev | Aug 2006 | A1 |
20060203162 | Ito et al. | Sep 2006 | A1 |
20060203200 | Koide | Sep 2006 | A1 |
20060215129 | Alasaarela et al. | Sep 2006 | A1 |
20060215244 | Yosha et al. | Sep 2006 | A1 |
20060221642 | Daiku | Oct 2006 | A1 |
20060227427 | Dolgoff | Oct 2006 | A1 |
20060244884 | Jeon et al. | Nov 2006 | A1 |
20060244918 | Cossairt et al. | Nov 2006 | A1 |
20060250580 | Silverstein et al. | Nov 2006 | A1 |
20060262258 | Wang et al. | Nov 2006 | A1 |
20060262376 | Mather et al. | Nov 2006 | A1 |
20060262558 | Cornelissen | Nov 2006 | A1 |
20060268207 | Tan et al. | Nov 2006 | A1 |
20060269213 | Hwang et al. | Nov 2006 | A1 |
20060284974 | Lipton et al. | Dec 2006 | A1 |
20060285040 | Kobayashi | Dec 2006 | A1 |
20060291053 | Robinson et al. | Dec 2006 | A1 |
20060291243 | Niioka et al. | Dec 2006 | A1 |
20070008406 | Shestak et al. | Jan 2007 | A1 |
20070013624 | Bourhill | Jan 2007 | A1 |
20070025680 | Winston et al. | Feb 2007 | A1 |
20070035706 | Margulis | Feb 2007 | A1 |
20070035829 | Woodgate et al. | Feb 2007 | A1 |
20070035964 | Olczak | Feb 2007 | A1 |
20070047254 | Schardt et al. | Mar 2007 | A1 |
20070064163 | Tan et al. | Mar 2007 | A1 |
20070081110 | Lee | Apr 2007 | A1 |
20070085105 | Beeson et al. | Apr 2007 | A1 |
20070109401 | Lipton et al. | May 2007 | A1 |
20070115551 | Spilman et al. | May 2007 | A1 |
20070115552 | Robinson et al. | May 2007 | A1 |
20070139772 | Wang | Jun 2007 | A1 |
20070153160 | Lee et al. | Jul 2007 | A1 |
20070183466 | Son et al. | Aug 2007 | A1 |
20070188667 | Schwerdtner | Aug 2007 | A1 |
20070189701 | Chakmakjian et al. | Aug 2007 | A1 |
20070223251 | Liao | Sep 2007 | A1 |
20070223252 | Lee et al. | Sep 2007 | A1 |
20070285775 | Lesage et al. | Dec 2007 | A1 |
20080068329 | Shestak et al. | Mar 2008 | A1 |
20080079662 | Saishu et al. | Apr 2008 | A1 |
20080084519 | Brigham et al. | Apr 2008 | A1 |
20080086289 | Brott | Apr 2008 | A1 |
20080128728 | Nemchuk et al. | Jun 2008 | A1 |
20080158491 | Zhu et al. | Jul 2008 | A1 |
20080225205 | Travis | Sep 2008 | A1 |
20080259012 | Fergason | Oct 2008 | A1 |
20080285310 | Aylward et al. | Nov 2008 | A1 |
20080291359 | Miyashita | Nov 2008 | A1 |
20080297431 | Yuuki et al. | Dec 2008 | A1 |
20080297459 | Sugimoto et al. | Dec 2008 | A1 |
20080304282 | Mi et al. | Dec 2008 | A1 |
20080316198 | Fukushima et al. | Dec 2008 | A1 |
20080316768 | Travis | Dec 2008 | A1 |
20090014700 | Metcalf et al. | Jan 2009 | A1 |
20090016057 | Rinko | Jan 2009 | A1 |
20090040426 | Mather et al. | Feb 2009 | A1 |
20090067156 | Bonnett et al. | Mar 2009 | A1 |
20090085894 | Gandhi et al. | Apr 2009 | A1 |
20090086509 | Omori et al. | Apr 2009 | A1 |
20090128735 | Larson et al. | May 2009 | A1 |
20090128746 | Kean et al. | May 2009 | A1 |
20090135623 | Kunimochi | May 2009 | A1 |
20090140656 | Kohashikawa et al. | Jun 2009 | A1 |
20090160757 | Robinson | Jun 2009 | A1 |
20090167651 | Benitez et al. | Jul 2009 | A1 |
20090174700 | Daiku | Jul 2009 | A1 |
20090174843 | Sakai et al. | Jul 2009 | A1 |
20090190072 | Nagata et al. | Jul 2009 | A1 |
20090190079 | Saitoh | Jul 2009 | A1 |
20090213298 | Mimura et al. | Aug 2009 | A1 |
20090213305 | Ohmuro et al. | Aug 2009 | A1 |
20090225380 | Schwerdtner et al. | Sep 2009 | A1 |
20090244415 | Ide | Oct 2009 | A1 |
20090278936 | Pastoor et al. | Nov 2009 | A1 |
20090290203 | Schwerdtner | Nov 2009 | A1 |
20100002296 | Choi et al. | Jan 2010 | A1 |
20100034987 | Fujii et al. | Feb 2010 | A1 |
20100040280 | McKnight | Feb 2010 | A1 |
20100053771 | Travis et al. | Mar 2010 | A1 |
20100091093 | Robinson | Apr 2010 | A1 |
20100091254 | Travis et al. | Apr 2010 | A1 |
20100128200 | Morishita et al. | May 2010 | A1 |
20100149459 | Yabuta et al. | Jun 2010 | A1 |
20100165598 | Chen et al. | Jul 2010 | A1 |
20100177113 | Gay et al. | Jul 2010 | A1 |
20100177387 | Travis et al. | Jul 2010 | A1 |
20100182542 | Nakamoto et al. | Jul 2010 | A1 |
20100188438 | Kang | Jul 2010 | A1 |
20100188602 | Feng | Jul 2010 | A1 |
20100205667 | Anderson et al. | Aug 2010 | A1 |
20100214135 | Bathiche et al. | Aug 2010 | A1 |
20100220260 | Sugita et al. | Sep 2010 | A1 |
20100231498 | Large et al. | Sep 2010 | A1 |
20100238376 | Sakai et al. | Sep 2010 | A1 |
20100277575 | Ismael et al. | Nov 2010 | A1 |
20100278480 | Vasylyev | Nov 2010 | A1 |
20100283930 | Park et al. | Nov 2010 | A1 |
20100289870 | Leister | Nov 2010 | A1 |
20100289989 | Adachi et al. | Nov 2010 | A1 |
20100295755 | Broughton et al. | Nov 2010 | A1 |
20100295920 | McGowan | Nov 2010 | A1 |
20100295930 | Ezhov | Nov 2010 | A1 |
20100300608 | Emerton et al. | Dec 2010 | A1 |
20100302135 | Larson et al. | Dec 2010 | A1 |
20100309296 | Harrold et al. | Dec 2010 | A1 |
20100321953 | Coleman et al. | Dec 2010 | A1 |
20100328438 | Ohyama et al. | Dec 2010 | A1 |
20110013417 | Saccomanno et al. | Jan 2011 | A1 |
20110018860 | Parry-Jones et al. | Jan 2011 | A1 |
20110019112 | Dolgoff | Jan 2011 | A1 |
20110032483 | Hruska et al. | Feb 2011 | A1 |
20110032724 | Kinoshita | Feb 2011 | A1 |
20110043142 | Travis et al. | Feb 2011 | A1 |
20110043501 | Daniel | Feb 2011 | A1 |
20110044056 | Travis et al. | Feb 2011 | A1 |
20110044579 | Travis et al. | Feb 2011 | A1 |
20110051237 | Hasegawa et al. | Mar 2011 | A1 |
20110187293 | Travis | Aug 2011 | A1 |
20110187635 | Lee et al. | Aug 2011 | A1 |
20110188120 | Tabirian et al. | Aug 2011 | A1 |
20110216266 | Travis | Sep 2011 | A1 |
20110221998 | Adachi et al. | Sep 2011 | A1 |
20110228183 | Hamagishi | Sep 2011 | A1 |
20110235359 | Liu et al. | Sep 2011 | A1 |
20110241983 | Chang | Oct 2011 | A1 |
20110242150 | Song et al. | Oct 2011 | A1 |
20110242277 | Do et al. | Oct 2011 | A1 |
20110242298 | Bathiche et al. | Oct 2011 | A1 |
20110255303 | Nichol et al. | Oct 2011 | A1 |
20110285927 | Schultz et al. | Nov 2011 | A1 |
20110286222 | Coleman | Nov 2011 | A1 |
20110292321 | Travis et al. | Dec 2011 | A1 |
20110310232 | Wilson et al. | Dec 2011 | A1 |
20110321143 | Angaluri | Dec 2011 | A1 |
20120002121 | Pirs et al. | Jan 2012 | A1 |
20120002136 | Nagata et al. | Jan 2012 | A1 |
20120002295 | Dobschal et al. | Jan 2012 | A1 |
20120008067 | Mun et al. | Jan 2012 | A1 |
20120013720 | Kadowaki et al. | Jan 2012 | A1 |
20120062991 | Mich et al. | Mar 2012 | A1 |
20120063166 | Panagotacos et al. | Mar 2012 | A1 |
20120075285 | Oyagi et al. | Mar 2012 | A1 |
20120081920 | Ie et al. | Apr 2012 | A1 |
20120086776 | Lo | Apr 2012 | A1 |
20120086875 | Yokota | Apr 2012 | A1 |
20120106193 | Kim et al. | May 2012 | A1 |
20120127573 | Robinson et al. | May 2012 | A1 |
20120147280 | Osterman et al. | Jun 2012 | A1 |
20120154450 | Aho et al. | Jun 2012 | A1 |
20120162966 | Kim et al. | Jun 2012 | A1 |
20120169838 | Sekine | Jul 2012 | A1 |
20120206050 | Spero | Aug 2012 | A1 |
20120235891 | Nishitani et al. | Sep 2012 | A1 |
20120236484 | Miyake | Sep 2012 | A1 |
20120243204 | Robinson | Sep 2012 | A1 |
20120243261 | Yamamoto et al. | Sep 2012 | A1 |
20120293721 | Ueyama | Nov 2012 | A1 |
20120294037 | Holman et al. | Nov 2012 | A1 |
20120299913 | Robinson et al. | Nov 2012 | A1 |
20120314145 | Robinson | Dec 2012 | A1 |
20120327101 | Blixt et al. | Dec 2012 | A1 |
20130039062 | Vinther et al. | Feb 2013 | A1 |
20130100097 | Martin | Apr 2013 | A1 |
20130101253 | Popovich et al. | Apr 2013 | A1 |
20130107174 | Yun et al. | May 2013 | A1 |
20130107340 | Wong et al. | May 2013 | A1 |
20130128165 | Lee et al. | May 2013 | A1 |
20130135588 | Popovich et al. | May 2013 | A1 |
20130169701 | Whitehead et al. | Jul 2013 | A1 |
20130242231 | Kurata et al. | Sep 2013 | A1 |
20130278544 | Cok | Oct 2013 | A1 |
20130293793 | Lu | Nov 2013 | A1 |
20130294684 | Lipton et al. | Nov 2013 | A1 |
20130300985 | Bulda | Nov 2013 | A1 |
20130307831 | Robinson | Nov 2013 | A1 |
20130307946 | Robinson et al. | Nov 2013 | A1 |
20130321340 | Seo et al. | Dec 2013 | A1 |
20130321599 | Harrold et al. | Dec 2013 | A1 |
20130328866 | Woodgate et al. | Dec 2013 | A1 |
20130335821 | Robinson et al. | Dec 2013 | A1 |
20140009508 | Woodgate et al. | Jan 2014 | A1 |
20140022619 | Woodgate et al. | Jan 2014 | A1 |
20140036361 | Woodgate et al. | Feb 2014 | A1 |
20140071382 | Scardato | Mar 2014 | A1 |
20140098418 | Lin | Apr 2014 | A1 |
20140111760 | Guo et al. | Apr 2014 | A1 |
20140126238 | Kao et al. | May 2014 | A1 |
20140132887 | Kurata | May 2014 | A1 |
20140201844 | Buck | Jul 2014 | A1 |
20140211125 | Kurata | Jul 2014 | A1 |
20140232960 | Schwartz et al. | Aug 2014 | A1 |
20140240344 | Tomono et al. | Aug 2014 | A1 |
20140240828 | Robinson et al. | Aug 2014 | A1 |
20140268358 | Kusaka et al. | Sep 2014 | A1 |
20140286043 | Sykora et al. | Sep 2014 | A1 |
20140289835 | Varshavsky et al. | Sep 2014 | A1 |
20140340728 | Taheri | Nov 2014 | A1 |
20140361990 | Leister | Dec 2014 | A1 |
20140368602 | Woodgate et al. | Dec 2014 | A1 |
20150055366 | Chang et al. | Feb 2015 | A1 |
20150116212 | Freed | Apr 2015 | A1 |
20150177447 | Woodgate et al. | Jun 2015 | A1 |
20150177563 | Cho et al. | Jun 2015 | A1 |
20150185398 | Chang et al. | Jul 2015 | A1 |
20150205157 | Sakai et al. | Jul 2015 | A1 |
20150268479 | Woodgate et al. | Sep 2015 | A1 |
20150286061 | Seo et al. | Oct 2015 | A1 |
20150286817 | Haddad et al. | Oct 2015 | A1 |
20150301400 | Kimura et al. | Oct 2015 | A1 |
20150346417 | Powell | Dec 2015 | A1 |
20150346532 | Do et al. | Dec 2015 | A1 |
20150378085 | Robinson et al. | Dec 2015 | A1 |
20160103264 | Lee et al. | Apr 2016 | A1 |
20160132721 | Bostick | May 2016 | A1 |
20160147074 | Kobayashi et al. | May 2016 | A1 |
20160154259 | Kim et al. | Jun 2016 | A1 |
20160216420 | Gaides et al. | Jul 2016 | A1 |
20160216540 | Cho et al. | Jul 2016 | A1 |
20160224106 | Liu | Aug 2016 | A1 |
20160238869 | Osterman et al. | Aug 2016 | A1 |
20160334898 | Kwak et al. | Nov 2016 | A1 |
20160349444 | Robinson et al. | Dec 2016 | A1 |
20160356943 | Choi et al. | Dec 2016 | A1 |
20160357046 | Choi et al. | Dec 2016 | A1 |
20170003436 | Inoue et al. | Jan 2017 | A1 |
20170031206 | Smith et al. | Feb 2017 | A1 |
20170090103 | Holman | Mar 2017 | A1 |
20170092187 | Bergquist | Mar 2017 | A1 |
20170092229 | Greenebaum et al. | Mar 2017 | A1 |
20170115485 | Saito et al. | Apr 2017 | A1 |
20170123241 | Su et al. | May 2017 | A1 |
20170139110 | Woodgate et al. | May 2017 | A1 |
20170168633 | Kwak et al. | Jun 2017 | A1 |
20170205558 | Hirayama et al. | Jul 2017 | A1 |
20170236494 | Sommerlade et al. | Aug 2017 | A1 |
20170269283 | Wang et al. | Sep 2017 | A1 |
20170269285 | Hirayama et al. | Sep 2017 | A1 |
20170329399 | Azam | Nov 2017 | A1 |
20170336661 | Harrold et al. | Nov 2017 | A1 |
20170339398 | Woodgate et al. | Nov 2017 | A1 |
20170343715 | Fang et al. | Nov 2017 | A1 |
20180014007 | Brown | Jan 2018 | A1 |
20180052346 | Sakai et al. | Feb 2018 | A1 |
20180082068 | Lancioni et al. | Mar 2018 | A1 |
20180095581 | Hwang et al. | Apr 2018 | A1 |
20180113334 | Fang et al. | Apr 2018 | A1 |
20180188576 | Xu et al. | Jul 2018 | A1 |
20180188603 | Fang et al. | Jul 2018 | A1 |
20180196275 | Robinson et al. | Jul 2018 | A1 |
20180210243 | Fang et al. | Jul 2018 | A1 |
20180231811 | Wu | Aug 2018 | A1 |
20180252949 | Klippstein et al. | Sep 2018 | A1 |
20180259799 | Kroon | Sep 2018 | A1 |
20180259812 | Goda et al. | Sep 2018 | A1 |
20180321523 | Robinson et al. | Nov 2018 | A1 |
20180321553 | Robinson et al. | Nov 2018 | A1 |
20180329245 | Robinson et al. | Nov 2018 | A1 |
20180364526 | Finnemeyer et al. | Dec 2018 | A1 |
20190086706 | Robinson et al. | Mar 2019 | A1 |
20190121173 | Robinson et al. | Apr 2019 | A1 |
20190154896 | Yanai | May 2019 | A1 |
20190196236 | Chen et al. | Jun 2019 | A1 |
20190197928 | Schubert et al. | Jun 2019 | A1 |
20190215509 | Woodgate et al. | Jul 2019 | A1 |
20190227366 | Harrold et al. | Jul 2019 | A1 |
20190235304 | Tamada et al. | Aug 2019 | A1 |
20190250458 | Robinson et al. | Aug 2019 | A1 |
20190293858 | Woodgate et al. | Sep 2019 | A1 |
20190293983 | Robinson et al. | Sep 2019 | A1 |
20190353944 | Acreman et al. | Nov 2019 | A1 |
20200159055 | Robinson et al. | May 2020 | A1 |
20200225402 | Ihas et al. | Jul 2020 | A1 |
Number | Date | Country |
---|---|---|
2222313 | Jun 1998 | CA |
1142869 | Feb 1997 | CN |
1377453 | Oct 2002 | CN |
1454329 | Nov 2003 | CN |
1466005 | Jan 2004 | CN |
1487332 | Apr 2004 | CN |
1696788 | Nov 2005 | CN |
1823292 | Aug 2006 | CN |
1826553 | Aug 2006 | CN |
1866112 | Nov 2006 | CN |
2872404 | Feb 2007 | CN |
1307481 | Mar 2007 | CN |
101029975 | Sep 2007 | CN |
101049028 | Oct 2007 | CN |
200983052 | Nov 2007 | CN |
101114080 | Jan 2008 | CN |
101142823 | Mar 2008 | CN |
100449353 | Jan 2009 | CN |
101364004 | Feb 2009 | CN |
101598863 | Dec 2009 | CN |
100591141 | Feb 2010 | CN |
101660689 | Mar 2010 | CN |
102147079 | Aug 2011 | CN |
202486493 | Oct 2012 | CN |
1910399 | May 2013 | CN |
104133292 | Nov 2014 | CN |
204740413 | Nov 2015 | CN |
209171779 | Jul 2019 | CN |
0653891 | May 1995 | EP |
0721131 | Jul 1996 | EP |
0830984 | Mar 1998 | EP |
0833183 | Apr 1998 | EP |
0860729 | Aug 1998 | EP |
0939273 | Sep 1999 | EP |
0656555 | Mar 2003 | EP |
2003394 | Dec 2008 | EP |
1394593 | Jun 2010 | EP |
2451180 | May 2012 | EP |
1634119 | Aug 2012 | EP |
2405542 | Feb 2005 | GB |
2418518 | Mar 2006 | GB |
2428100 | Jan 2007 | GB |
2482065 | Jan 2012 | GB |
2486935 | Sep 2013 | GB |
H01130783 | Sep 1989 | JP |
H08211334 | Aug 1996 | JP |
H08237691 | Sep 1996 | JP |
H08254617 | Oct 1996 | JP |
H08070475 | Dec 1996 | JP |
H08340556 | Dec 1996 | JP |
2000048618 | Feb 2000 | JP |
2000200049 | Jul 2000 | JP |
2001093321 | Apr 2001 | JP |
2001281456 | Oct 2001 | JP |
2002049004 | Feb 2002 | JP |
2003215349 | Jul 2003 | JP |
2003215705 | Jul 2003 | JP |
2004319364 | Nov 2004 | JP |
2005116266 | Apr 2005 | JP |
2005135844 | May 2005 | JP |
2005183030 | Jul 2005 | JP |
2005259361 | Sep 2005 | JP |
2006004877 | Jan 2006 | JP |
2006031941 | Feb 2006 | JP |
2006310269 | Nov 2006 | JP |
3968742 | Aug 2007 | JP |
2007273288 | Oct 2007 | JP |
2007286652 | Nov 2007 | JP |
2008204874 | Sep 2008 | JP |
2010160527 | Jul 2010 | JP |
2011216281 | Oct 2011 | JP |
2013015619 | Jan 2013 | JP |
2013502693 | Jan 2013 | JP |
2013540083 | Oct 2013 | JP |
20030064258 | Jul 2003 | KR |
20090932304 | Dec 2009 | KR |
20110006773 | Jan 2011 | KR |
20110017918 | Feb 2011 | KR |
20110067534 | Jun 2011 | KR |
20120011228 | Feb 2012 | KR |
20120048301 | May 2012 | KR |
20120049890 | May 2012 | KR |
20130002646 | Jan 2013 | KR |
20140139730 | Dec 2014 | KR |
101990286 | Jun 2019 | KR |
200528780 | Sep 2005 | TW |
M537663 | Mar 2017 | TW |
1994006249 | Apr 1994 | WO |
1995020811 | Aug 1995 | WO |
1995027915 | Oct 1995 | WO |
1998021620 | May 1998 | WO |
1999011074 | Mar 1999 | WO |
2001027528 | Apr 2001 | WO |
2001061241 | Aug 2001 | WO |
2001079923 | Oct 2001 | WO |
2005071449 | Aug 2005 | WO |
2010021926 | Feb 2010 | WO |
2011020962 | Feb 2011 | WO |
2011022342 | Feb 2011 | WO |
2011068907 | Jun 2011 | WO |
2011149739 | Dec 2011 | WO |
2012158574 | Nov 2012 | WO |
2014011328 | Jan 2014 | WO |
2015040776 | Mar 2015 | WO |
2015057625 | Apr 2015 | WO |
2015143227 | Sep 2015 | WO |
2015157184 | Oct 2015 | WO |
2015190311 | Dec 2015 | WO |
2018035492 | Feb 2018 | WO |
2018208618 | Nov 2018 | WO |
2019055755 | Mar 2019 | WO |
2019067846 | Apr 2019 | WO |
2019147762 | Aug 2019 | WO |
Entry |
---|
CN-201380026046.4 Chinese 1st Office Action of the State Intellectual Property Office of P.R. China dated Oct. 24, 2016. |
CN-201380026058.7 Chinese 1st Office Action of the State Intellectual Property Office of P.R. China dated Nov. 2, 2016. |
CN-201380063047.6 Chinese Office Action of the State Intellectual Property Office of P.R. China dated Oct. 9, 2016. |
EP-11842021.5 Office Action dated Sep. 2, 2016. |
EP-13790775.4 Office Action dated Aug. 29, 2016. |
EP-13791437.0 European first office action dated Aug. 30, 2016. |
EP-14754859.8 European Extended Search Report of European Patent Office dated Oct. 14, 2016. |
3M™ ePrivacy Filter software professional version; http://www.cdw.com/shop/products/3M-ePrivacy-Filter-software-professional-version/3239412.aspx?cm_mmc=ShoppingFeeds-_-ChannelIntelligence-_-Software-_-3239412_3MT%20ePrivacy%20Filter%20software%20professional%20version_3MF-EPFPRO&cpncode=37-7582919&srccode=cii_10191459#PO; Copyright 2007-2016. |
CN-200980150139.1 1st Office Action dated Nov. 2, 2014. |
CN-200980150139.1 2nd Office Action dated May 4, 2015. |
Brudy et al., “Is Anyone Looking? Mitigating Shoulder Surfing on Public Displays through Awareness and Protection”, Proceedings of the International Symposium on Pervasive Displays, Jun. 4, 2014, XP055511160, pp. 1-6. |
Lipton, “Stereographics: Developers' Handbook”, Stereographic Developers Handbook, Jan. 1, 1997, XP002239311, p. 42-49. |
Marjanovic, M., "Interlace, Interleave, and Field Dominance," http://www.mir.com/DMG/interl.html, pp. 1-5 (2001). |
PCT/DE98/02576 International search report and written opinion of international searching authority dated Mar. 4, 1999 (WO99/11074). |
PCT/US2007/85475 International preliminary report on patentability dated May 26, 2009. |
PCT/US2007/85475 International search report and written opinion dated Apr. 10, 2008. |
PCT/US2009/060686 International preliminary report on patentability dated Apr. 19, 2011. |
PCT/US2009/060686 International search report and written opinion of international searching authority dated Dec. 10, 2009. |
PCT/US2011/061511 International Preliminary Report on Patentability dated May 21, 2013. |
PCT/US2011/061511 International search report and written opinion of international searching authority dated Jun. 29, 2012. |
PCT/US2012/037677 International search report and written opinion of international searching authority dated Jun. 29, 2012. |
PCT/US2012/042279 International search report and written opinion of international searching authority dated Feb. 26, 2013. |
PCT/US2012/052189 International search report and written opinion of the international searching authority dated Jan. 29, 2013. |
PCT/US2013/041192 International search report and written opinion of international searching authority dated Aug. 28, 2013. |
PCT/US2013/041228 International search report and written opinion of international searching authority dated Aug. 23, 2013. |
PCT/US2013/041235 International search report and written opinion of international searching authority dated Aug. 23, 2013. |
PCT/US2013/041237 International search report and written opinion of international searching authority dated May 15, 2013. |
PCT/US2013/041548 International search report and written opinion of international searching authority dated Aug. 27, 2013. |
PCT/US2013/041619 International search report and written opinion of international searching authority dated Aug. 27, 2013. |
PCT/US2013/041655 International search report and written opinion of international searching authority dated Aug. 27, 2013. |
PCT/US2013/041683 International search report and written opinion of international searching authority dated Aug. 27, 2013. |
PCT/US2013/041697 International search report and written opinion of international searching authority dated Aug. 23, 2013. |
PCT/US2013/041703 International search report and written opinion of international searching authority dated Aug. 27, 2013. |
PCT/US2013/049969 International search report and written opinion of international searching authority dated Oct. 23, 2013. |
PCT/US2013/063125 International search report and written opinion of international searching authority dated Jan. 20, 2014. |
PCT/US2013/063133 International search report and written opinion of international searching authority dated Jan. 20, 2014. |
PCT/US2013/077288 International search report and written opinion of international searching authority dated Apr. 18, 2014. |
PCT/US2014/017779 International search report and written opinion of international searching authority dated May 28, 2014. |
PCT/US2014/042721 International search report and written opinion of international searching authority dated Oct. 10, 2014. |
PCT/US2014/057860 International Preliminary Report on Patentability dated Apr. 5, 2016. |
PCT/US2014/057860 International search report and written opinion of international searching authority dated Jan. 5, 2015. |
PCT/US2014/060312 International search report and written opinion of international searching authority dated Jan. 19, 2015. |
PCT/US2014/060368 International search report and written opinion of international searching authority dated Jan. 14, 2015. |
PCT/US2014/065020 International search report and written opinion of international searching authority dated May 21, 2015. |
PCT/US2015/000327 International search report and written opinion of international searching authority dated Apr. 25, 2016. |
PCT/US2015/021583 International search report and written opinion of international searching authority dated Sep. 10, 2015. |
PCT/US2015/038024 International search report and written opinion of international searching authority dated Dec. 30, 2015. |
PCT/US2016/027297 International search report and written opinion of international searching authority dated Jul. 26, 2017. |
PCT/US2016/027350 International search report and written opinion of the international searching authority dated Jul. 25, 2016. |
PCT/US2016/034418 International search report and written opinion of the international searching authority dated Sep. 7, 2016. |
Robinson et al., U.S. Appl. No. 14/751,878 entitled “Directional privacy display” filed Jun. 26, 2015. |
Robinson et al., U.S. Appl. No. 15/097,750 entitled “Wide angle imaging directional backlights” filed Apr. 13, 2016. |
Robinson et al., U.S. Appl. No. 15/098,084 entitled “Wide angle imaging directional backlights” filed Apr. 13, 2016. |
Robinson, U.S. Appl. No. 13/300,293 entitled “Directional flat illuminators” filed Nov. 18, 2011. |
RU-2013122560 First office action dated Jan. 1, 2014. |
RU-2013122560 Second office action dated Apr. 10, 2015. |
Tabiryan et al., “The Promise of Diffractive Waveplates,” Optics and Photonics News, vol. 21, Issue 3, pp. 40-45 (Mar. 2010). |
Travis, et al., "Backlight for view-sequential autostereo 3D", Microsoft E&DD Applied Sciences (date unknown), 25 pages. |
Travis, et al. “Collimated light from a waveguide for a display,” Optics Express, vol. 17, No. 22, pp. 19714-19719 (2009). |
Williams, S. P., et al., "New Computational Control Techniques and Increased Understanding for Stereo 3-D Displays", Proceedings of SPIE, vol. 1256, Jan. 1, 1990, XP000565512, pp. 75, 77, 79. |
Robinson et al., U.S. Appl. No. 14/186,862 entitled “Directional Backlight” filed Feb. 21, 2014. |
Robinson et al., U.S. Appl. No. 62/167,203 entitled “Wide angle imaging directional backlights” filed May 27, 2015. |
Cootes et al., “Active Appearance Models”, IEEE Trans. Pattern Analysis and Machine Intelligence, 23(6):681-685, 2001. |
Bar-Shalom et al., “Multitarget-Multisensor Tracking: Principles and Techniques”, IEEE Aerospace and Electronic Systems Magazine, 1995. |
Simonyan et al., “Very Deep Convolutional Networks for Large-Scale Image Recognition”, ICLR 2015. |
AU-2011329639 Australia Patent Examination Report No. 1 dated Mar. 6, 2014. |
AU-2013262869 Australian Office Action of Australian Patent Office dated Feb. 22, 2016. |
AU-2015258258 Australian Office Action of Australian Patent Office dated Jun. 9, 2016. |
Bahadur, “Liquid crystals applications and uses,” World Scientific, vol. 1, pp. 178 (1990). |
CA-2817044 Canadian office action dated Jul. 14, 2016. |
CN-201180065590.0 First office action dated Dec. 31, 2014. |
CN-201180065590.0 Second office action dated Oct. 21, 2015. |
CN-201180065590.0 Third office action dated Jun. 6, 2016. |
CN-201280034488.9 2d Office Action from the State Intellectual Property Office of P.R. China dated Mar. 22, 2016. |
CN-201280034488.9 1st Office Action from the State Intellectual Property Office of P.R. China dated Jun. 11, 2015. |
CN-201380026045.X Chinese First Office Action of Chinese Patent Office dated Aug. 29, 2016. |
CN-201380026047.9 Chinese 1st Office Action of the State Intellectual Property Office of P.R. China dated Dec. 18, 2015. |
CN-201380026047.9 Chinese 2d Office Action of the State Intellectual Property Office of P.R. China dated Jul. 12, 2016. |
CN-201380026050.0 Chinese 1st Office Action of the State Intellectual Property Office of P.R. China dated Jun. 3, 2016. |
CN-201380026059.1 Chinese 1st Office Action of the State Intellectual Property Office of P.R. China dated Apr. 25, 2016. |
CN-201380026076.5 First office action dated May 11, 2016. |
CN-201380049451.8 Chinese Office Action of the State Intellectual Property Office of P.R. China dated Apr. 5, 2016. |
CN-201380063055.0 Chinese 1st Office Action of the State Intellectual Property Office of P.R. China dated Jun. 23, 2016. |
CN-201480023023.2 Office action dated Aug. 12, 2016. |
EP-07864751.8 European Search Report dated Jun. 1, 2012. |
EP-07864751.8 Supplementary European Search Report dated May 29, 2015. |
EP-09817048.3 European Search Report dated Apr. 29, 2016. |
EP-11842021.5 Office Action dated Dec. 17, 2014. |
EP-11842021.5 Office Action dated Oct. 2, 2015. |
EP-13758536.0 European Extended Search Report of European Patent Office dated Feb. 4, 2016. |
EP-13790013.0 European Extended Search Report of European Patent Office dated Jan. 26, 2016. |
EP-13790141.9 European Extended Search Report of European Patent Office dated Feb. 11, 2016. |
EP-13790195.5 European Extended Search Report of European Patent Office dated Mar. 2, 2016. |
EP-13790267.2 European Extended Search Report of European Patent Office dated Feb. 25, 2016. |
EP-13790274.8 European Extended Search Report of European Patent Office dated Feb. 8, 2016. |
EP-13790775.4 European Extended Search Report of European Patent Office dated Oct. 9, 2015. |
EP-13790809.1 European Extended Search Report of European Patent Office dated Feb. 16, 2016. |
EP-13790942.0 European Extended Search Report of European Patent Office dated May 23, 2016. |
EP-13791332.3 European Extended Search Report of European Patent Office dated Feb. 1, 2016. |
EP-13791437.0 European Extended Search Report of European Patent Office dated Oct. 14, 2015. |
EP-13822472.0 European Extended Search Report of European Patent Office dated Mar. 2, 2016. |
EP-13843659.7 European Extended Search Report of European Patent Office dated May 10, 2016. |
EP-13844510.1 European Extended Search Report of European Patent Office dated May 13, 2016. |
EP-13865893.5 European Extended Search Report of European Patent Office dated Oct. 4, 2016. |
EP-16150248.9 European Extended Search Report of European Patent Office dated Jun. 16, 2016. |
Sexton et al., "Stereoscopic and autostereoscopic display-systems", IEEE Signal Processing Magazine, May 1, 1999, pp. 85-99, XP055305471. Retrieved from the Internet: URL:http://ieeexplore.ieee.org/iel5/79/16655/00768575.pdf [retrieved on Sep. 26, 2016]. |
JP-2009538527 Reasons for rejection dated Jul. 17, 2012 with translation. |
JP-200980150139.1 1st Office Action dated Feb. 11, 2014. |
JP-200980150139.1 2d Office Action dated Apr. 5, 2015. |
JP-2013540083 Notice of reasons for rejection dated Jun. 30, 2015. |
JP-2013540083 Notice of reasons for rejection with translation dated Jun. 21, 2016. |
Kalantar, et al. “Backlight Unit With Double Surface Light Emission,” J. Soc. Inf. Display, vol. 12, Issue 4, pp. 379-387 (Dec. 2004). |
KR-20117010839 1st Office action (translated) dated Aug. 28, 2015. |
KR-20117010839 2d Office action (translated) dated Apr. 28, 2016. |
Languy et al., "Performance comparison of four kinds of flat nonimaging Fresnel lenses made of polycarbonates and polymethyl methacrylate for concentrated photovoltaics", Optics Letters, vol. 36, pp. 2743-2745 (2011). |
Adachi, et al., "P-228L: Late-News Poster: Controllable Viewing-Angle Displays using a Hybrid Aligned Nematic Liquid Crystal Cell", SID 2006 Digest, pp. 705-708. |
CN-201780030715.3 Notification of the First Office Action dated Jan. 21, 2020. |
EP-16860628.3 Extended European Search Report of European Patent Office dated Apr. 26, 2019. |
EP-17799963.8 Extended European Search Report of European Patent Office dated Oct. 9, 2019. |
Gass, et al. “Privacy LCD Technology for Cellular Phones”, Sharp Laboratories of Europe Ltd, Mobile LCD Group, Feb. 2007, pp. 45-49. |
Ishikawa, T., “New Design for a Highly Collimating Turning Film”, SID 06 Digest, pp. 514-517. |
PCT/US2016/058695 International search report and written opinion of the international searching authority dated Feb. 28, 2017. |
PCT/US2017/032734 International search report and written opinion of the international searching authority dated Jul. 27, 2017. |
PCT/US2018/031206 International search report and written opinion of the international searching authority dated Jul. 20, 2018. |
PCT/US2018/031218 International Preliminary Report on Patentability dated Nov. 21, 2019. |
PCT/US2018/031218 International search report and written opinion of the international searching authority dated Jul. 19, 2018. |
PCT/US2018/051021 International search report and written opinion of the international searching authority dated Nov. 21, 2018. |
PCT/US2018/051027 International search report and written opinion of the international searching authority dated Nov. 30, 2018. |
PCT/US2018/053328 International search report and written opinion of the international searching authority dated Nov. 30, 2018. |
PCT/US2018/059249 International search report and written opinion of the international searching authority dated Jan. 3, 2019. |
PCT/US2018/059256 International search report and written opinion of the international searching authority dated Jan. 3, 2019. |
PCT/US2019/014889 International search report and written opinion of the international searching authority dated May 24, 2019. |
PCT/US2019/014902 International search report and written opinion of the international searching authority dated Jun. 25, 2019. |
PCT/US2019/023659 International search report and written opinion of the international searching authority dated Jun. 10, 2019. |
PCT/US2019/038409 International search report and written opinion of the international searching authority dated Sep. 19, 2019. |
PCT/US2019/038466 International search report and written opinion of the international searching authority dated Nov. 5, 2019. |
PCT/US2019/042027 International search report and written opinion of the international searching authority dated Oct. 15, 2019. |
PCT/US2019/054291 International search report and written opinion of the international searching authority dated Jan. 6, 2020. |
PCT/US2019/059990 International search report and written opinion of the international searching authority dated Feb. 28, 2020. |
PCT/US2019/066208 International search report and written opinion of the international searching authority dated Feb. 27, 2020. |
PCT/US2020/017537 International search report and written opinion of the international searching authority dated Apr. 29, 2020. |
PCT/US2020/040686 International search report and written opinion of the international searching authority dated Nov. 20, 2020. |
PCT/US2020/044574 International search report and written opinion of the international searching authority dated Oct. 21, 2020. |
Weindorf et al., “Active Circular Polarizer OLED E-Mirror”, Proceedings of the Society for Information Display 25th Annual Symposium of Vehicle Displays, Livonia, MI, pp. 225-237, Sep. 25-26, 2018. |
PCT/US2020/053863 International search report and written opinion of the international searching authority dated Mar. 12, 2021. |
PCT/US2020/060155 International search report and written opinion of the international searching authority dated Feb. 5, 2021. |
PCT/US2020/060191 International search report and written opinion of the international searching authority dated Feb. 8, 2021. |
PCT/US2020/063638 International search report and written opinion of the international searching authority dated Mar. 2, 2021. |
PCT/US2020/064633 International search report and written opinion of the international searching authority dated Mar. 15, 2021. |
Robson, et al. “Spatial and temporal contrast-sensitivity functions of the visual system”, J. Opt. Soc. Amer., vol. 56, pp. 1141-1142 (1966). |
Number | Date | Country
---|---|---|
20200098342 A1 | Mar 2020 | US |
Number | Date | Country
---|---|---|
62246584 | Oct 2015 | US |
62261151 | Nov 2015 | US |
 | Number | Date | Country
---|---|---|---|
Parent | 15334023 | Oct 2016 | US |
Child | 16596957 | | US |