The present invention generally relates to communication technologies. More specifically, the present invention concerns a projection system, a projector and a projection method for providing a distributed manifestation in an environment.
Projectors are used for a wide variety of applications, such as light shows or animations for music concerts and other live events, corporate presentations, video conferences, home theaters, etc. Typically, a video projector receives a video signal and projects an image corresponding to the signal onto a surface, using a lens system.
Video projector technologies include LCD (Liquid Crystal Display), DLP (Digital Light Processing), LCoS (Liquid Crystal on Silicon), LED (Light Emitting Diode) and Laser Diode.
It is further known to create light animations by modifying the color of a plurality of modular elements in response to IR signals sent by a remote control. It is also known in the art to change the state of a plurality of modular elements using a distribution panel to which the module elements are connected.
An example of such a system was developed by the Responsive Environments Group at the MIT Media Lab and is known as “push pin computing”. This system includes a hardware and software platform for experimenting with and prototyping algorithms for distributed sensor networks. The platform consists of approximately 100 nodes inhabiting a substrate of predetermined dimensions. The system is described in more detail at http://web.media.mit.edu/~lifton/research/pushpin/index.html.
Another known system is shown on the web site of the design studio of Zigelbaum and Coelho (http://zigelbaumcoelho.com/six-forty-by-four-eighty/). Modular elements provided with LEDs react when touched by lighting up, changing color, blinking, etc. They can also be activated by an IR remote control.
Yet another known way of creating a light animation consists of using balloons linked together in a giant mesh. Each balloon is provided with electronic components and LEDs. The LEDs of each balloon are controlled via a console located on a handlebar. The handlebar includes several consoles, each console allowing control of a group of balloons to whose electronic components it is linked. The web site http://www.haque.co.uk/openburble.php provides details on this type of lighted animation.
A similar light animation consists of animating a giant mesh of balloons using cell phones. The mesh includes one thousand helium balloons and several dozen mobile phones. The balloons contain miniature sensor circuits and LEDs that respond to electromagnetic fields, such as those of mobile phones. When activated, the sensor circuits of each balloon communicate with one another, causing the LEDs of the entire mesh to illuminate. More information on this type of animation can be found on “http://www.haque.co.uk/skyear/information.html”.
These systems provide striking and spectacular animations. However, the complexity of the resulting animations is limited, since the lighted elements are not centrally controlled or activated. In addition, the synchronization of all elements requires them to be physically linked to one another, for example via a panel, as in the “six-forty by four-eighty” installation of Zigelbaum and Coelho, or via a mesh, as in the SkyEar installation of Haque. This limits the mobility of the lighted elements within a given environment.
In light of the above, there remains a need for systems and methods for providing a distributed manifestation in an environment which alleviate at least some of the drawbacks of the prior art.
In accordance with a first aspect of the present invention, a projection system is provided. The projection system is for providing a distributed manifestation within an environment. The projection system comprises a data generator, a projector and a plurality of receiving units distributed within the environment.
The data generator is for generating a plurality of data sets of associated state data and spatial coordinate data. The projector is in communication with the data generator for receiving the data sets therefrom. It comprises a signal generating module for generating a plurality of electromagnetic signals, each one of the electromagnetic signals being representative of the state data from one of the data sets. The projector also includes a projecting module for projecting each of the electromagnetic signals towards a target location within the environment. Each target location corresponds to the spatial coordinate data associated with the state data transmitted by the electromagnetic signal.
The plurality of receiving units is distributed within the environment. Each receiving unit is provided with a receiver for receiving one of the electromagnetic signals when the receiving unit is positioned in the corresponding target location. Each of the receiving units is also adapted to perform a change of state in response to the state data.
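The pairing of state data with spatial coordinate data described above can be pictured as a small data structure. The following is a minimal illustrative sketch only; the field names and the grid-based coordinate form are assumptions, since the invention admits many coordinate representations.

```python
from dataclasses import dataclass

# Hypothetical sketch of one "data set" pairing state data with the
# spatial coordinate data of its target location (names are illustrative).
@dataclass(frozen=True)
class DataSet:
    x: int          # spatial coordinate data: column of the target location
    y: int          # spatial coordinate data: row of the target location
    state: tuple    # state data, e.g. an (R, G, B) color to display

def generate_data_sets(frame):
    """Data generator: turn a 2-D grid of states into a list of data sets."""
    return [DataSet(x, y, state)
            for y, row in enumerate(frame)
            for x, state in enumerate(row)]

# A 2x2 grid of color states, one per target location.
frame = [[(255, 0, 0), (0, 0, 255)],
         [(0, 255, 0), (255, 255, 0)]]
sets = generate_data_sets(frame)
```

Each element of `sets` carries everything the projector needs: what to display (state data) and where to aim the signal (spatial coordinate data).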
In accordance with another aspect of the invention, there is provided a projector for providing a distributed manifestation within an environment through a plurality of receiving units. The receiving units are adapted to perform a change of state and are positioned at target locations within the environment. The distributed manifestation is based on a plurality of data sets of associated state data and spatial coordinate data.
The projector first includes a signal generating module for generating a plurality of electromagnetic signals, and encoding each one of these electromagnetic signals with the state data from one of the data sets. Encoded electromagnetic signals are thereby obtained. The projector further includes a projecting module for projecting each of the encoded electromagnetic signals towards one of the target locations within the environment corresponding to the spatial coordinate data associated with the state data encoded within the electromagnetic signal.
Preferably, the projector is provided with an encoder and the receiving units are each provided with a decoder. Preferably, the encoder is a modulator and the decoders are demodulators. Still preferably, the state data is representative of a video stream and the receiving elements are provided with LEDs.
In accordance with yet another aspect of the present invention, a method is provided. The method comprises the steps of:
a) generating a plurality of data sets of associated state data and spatial coordinate data;
b) generating a plurality of electromagnetic signals, each one of the electromagnetic signals being representative of the state data from one of the data sets;
c) projecting each of the electromagnetic signals towards a target location within the environment corresponding to the spatial coordinate data associated with the state data transmitted by the electromagnetic signal; and
d) receiving, at each target location where one of a plurality of receiving units distributed within the environment is positioned, the corresponding electromagnetic signal and performing a change of state in response to the state data.
Advantageously, the present invention allows individually updating a plurality of receiving units with a wireless technology in order to create a manifestation, for example a visual animation. Embodiments of the invention may advantageously provide systems for displaying or animating elements by controlling or animating them from at least one centralized source. Control of these elements as a function of their locations within a given space may also be provided, without limiting their displacement within this space. Embodiments may also provide the capability of wirelessly updating the modular elements dispersed within the given space.
Other objects, advantages and features of the present invention will become more apparent upon reading the following non-restrictive description of preferred embodiments thereof, given for the purpose of exemplification only, with reference to the accompanying drawings in which:
In the following description, similar features in the drawings have been given similar reference numerals. In order to preserve clarity, certain elements may not be identified in some figures if they are already identified in a previous figure.
In accordance with a first aspect thereof, the present invention generally concerns a projecting system for creating a manifestation using a projector and several receiving units distributed within a given environment. Electromagnetic signals are sent by the projector and may vary as a function of the specific locations targeted by the projector. In other words, receiving units located within a target location of the environment will receive specific electromagnetic signals. These signals include state data instructing the receiving units on the change of state they need to perform. The change of state can be, for example, a change of color. The combined effect of the receiving units will provide a manifestation, each unit displaying a given state according to its location.
The expression “manifestation” is used herein to refer to any physical phenomenon which could take place within the environment. In the illustrated embodiments, the manifestation is a visual animation, such as a change in color, a video, or simply the presence or absence of light or an image. The present invention is however not limited to visual animations and could be used to provide other types of manifestations such as sound, shape or odor.
The environment could be embodied by any physical space in which the manifestation takes place. Examples of such environments are infinite: the architectural surface of a public space, a theatre, a hall, a museum, a field, a forest, a city street or even the ocean or the sky. The environment need not be bound by physical structures and may only be limited by the range of propagation of the electromagnetic signals generated by the system, as will be explained in detail further below.
The receiving units can be dispersed in any appropriate manner within the environment. At any given time, the receiving units may define a 2D or a 3D manifestation. The manifestation within the environment may be fixed for any given period of time, or dynamic, changing in real-time or being perceived to do so. The distribution of receiving elements within the environment may also be either fixed or dynamic, as will be apparent from the examples given further below.
Referring to
Components of projection systems according to embodiments of the invention will be described in the following section.
Data Generator
The data generator 14 can be a computer, a data server or any type of device provided with memory 60 and a processor 62, able to store and transmit data to the projector 22. In operation, the data generator 14 generates a plurality of data sets 16, for example taking the form of data structures, such as the one illustrated in
The term “state” refers to a mode or a condition which can be displayed or expressed by a receiving unit. For example, a state can take the form of a visual manifestation, such as a color, a level of intensity and/or opacity. The state can also relate to a sound, an odor or a shape. It can be a sequence of state changes in time. For example, the state data can be representative of a video stream, the distributed manifestation displayed by the receiving units 32 being a video, each receiving unit 32 thus becoming a pixel within a giant screen formed by the plurality of units 32.
In order for the projector 22 to address specific receiving units 32 within the plurality of units, the state data 18 is associated with spatial coordinate data 20. The term spatial coordinate refers to a coordinate which may take various forms such as for example a position in an array of data, a location within a table, the position of a switch in a matrix addresser, a physical location, etc.
Projector
Now with reference to
The projector 22 is in communication with the data generator 14 and receives the data sets therefrom. While in
The projector first includes a signal generating module 24 for generating electromagnetic signals 26 encoding the state data contained in the received data sets. In other words, each electromagnetic signal 26 generated by the module 24 is representative of a specific state data 18 contained in a corresponding data set 16.
In this one embodiment, the electromagnetic signals 26 have a wavelength within the infrared spectrum. Other wavelengths may be considered without departing from the scope of the present invention.
The signal generating module 24 preferably includes one or more light emitters 40. Each light emitter 40 generates corresponding electromagnetic signals 26. The wavelength of the electromagnetic signals may be in the infrared, the visible or the ultraviolet spectrum, and the signal generating module 24 can include light emitters 40 generating electromagnetic signals at different wavelengths. The electromagnetic signals 26 may be monochromatic or quasi-monochromatic or have a more complex spectrum. For example, the light emitters may be embodied by lamps, lasers, LEDs or any other device apt to generate light having the desired wavelength or spectrum.
Referring more specifically to
With reference to the embodiment of
The modulation can be either an analog or a digital modulation. The modulator 42 preferably generates a modulation signal having an amplitude, a frequency and a phase, each of these parameters being controllable to perform the desired modulation. The projector 22 can include modulators 42 to modulate the signal of a light emitter 40 in one or in a combination of modulation methods. Modulation techniques such as amplitude modulation, frequency modulation, phase modulation, phase-shift keying, frequency-shift keying, on-off keying, spread-spectrum and combinations of these techniques are well known to those skilled in the art.
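To make one of the listed techniques concrete, the following sketch shows on-off keying, the simplest digital modulation named above: each bit of the state data gates the emitter on or off for one symbol period. This is an illustrative simplification, not the patented encoder; the helper names are assumptions.

```python
# Illustrative on-off keying (OOK): each bit of a state-data byte maps
# to one emitter on/off symbol, MSB first.
def ook_modulate(byte):
    """Modulator side: the on/off pattern driving the light emitter."""
    return [(byte >> (7 - i)) & 1 for i in range(8)]

def ook_demodulate(symbols):
    """Demodulator side: rebuild the byte from the detected symbols."""
    value = 0
    for bit in symbols:
        value = (value << 1) | bit
    return value

pattern = ook_modulate(0xA5)            # 0xA5 = 0b10100101
assert ook_demodulate(pattern) == 0xA5  # round trip recovers the state data
```

Amplitude-, frequency- or phase-based schemes follow the same modulate/demodulate symmetry, differing only in which carrier parameter encodes the bits.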
In other embodiments, the encoder may be embodied by other types of devices which act on the electromagnetic signals 26, such as a filter and/or shutter 22 placed in front of the radiation emitters 40. Both encoding methods may be used in conjunction.
As explained previously, a projector 22 can be provided with a single light emitter 40 or combinations of emitters 40 in order to communicate with receiving units 32 using many wavelengths concurrently. Similarly, the signal of a light emitter 40 can be modulated sequentially or concurrently in different ways to communicate different information. Both methods can be used concurrently. In other words, the projector 22 can project the electromagnetic signals 26 successively or in parallel, at least for some of the signals.
For example, a projector 22 can modulate the signal 26 of an infrared emitter 40 at three different frequencies in order to transmit state data 18 on three independent channels. Receiving units 32 equipped with amplifiers and/or demodulators tuned to these three frequencies may then change state according to the signals they receive on the three independent channels. For example, using red, green and blue LEDs each coupled to one of these three channels allows the units 32 to display full-color video in real-time.
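The three-channel scheme just described can be sketched abstractly as follows. The carrier frequencies and helper names are pure assumptions for illustration; a real signal chain would demodulate actual carriers rather than dictionary entries.

```python
# Sketch of three-channel transmission: each color component travels on
# its own carrier frequency; a unit with demodulators tuned to the three
# carriers recovers a full (R, G, B) state. Frequencies are assumed.
CHANNELS = {"red": 38_000, "green": 40_000, "blue": 56_000}  # Hz, illustrative

def encode_rgb(r, g, b):
    """Projector side: one intensity per carrier frequency."""
    return {CHANNELS["red"]: r, CHANNELS["green"]: g, CHANNELS["blue"]: b}

def decode_rgb(signal):
    """Receiving-unit side: read each tuned channel, defaulting to 0."""
    return tuple(signal.get(CHANNELS[c], 0) for c in ("red", "green", "blue"))

state = decode_rgb(encode_rgb(10, 20, 30))  # (10, 20, 30) drives the RGB LEDs
```

A unit that hears nothing on a given carrier simply leaves that LED dark, which is what makes the channels independent.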
Referring still to
Various devices and configurations may be used in the projecting module to spatially direct electromagnetic signals in various directions.
Referring to
Referring to
The processing unit 48 thus controls the operation of the matrix addresser 44 according to the spatial coordinate data 20 sent by the data generator 14. This spatial coordinate data 20 can correspond, for example, to the location of a specific micro-mirror within a Digital Micromirror Device (DMD). The spatial coordinate data 20 can also refer to a specific group of micro-mirrors within such a projector 22. In other embodiments of the invention where the projector is moved by a controllable motorized unit, the spatial coordinate data would correspond to a position of the projector within a given frame of reference.
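The mapping from spatial coordinate data to a specific micro-mirror can be sketched as simple row-major indexing. The DMD resolution below is an assumption for illustration; any mirror-array geometry would use an analogous mapping.

```python
# Sketch: mapping spatial coordinate data (x, y) to a micro-mirror index
# in a hypothetical WIDTH x HEIGHT DMD, addressed row-major.
WIDTH, HEIGHT = 1920, 1080  # assumed mirror-array resolution

def mirror_index(x, y):
    """Linear index of the micro-mirror steering a signal toward (x, y)."""
    if not (0 <= x < WIDTH and 0 <= y < HEIGHT):
        raise ValueError("coordinate outside the addressable matrix")
    return y * WIDTH + x

assert mirror_index(0, 0) == 0                       # top-left mirror
assert mirror_index(1919, 1079) == 1920 * 1080 - 1   # bottom-right mirror
```

Addressing a group of micro-mirrors, as the text allows, would simply map a set of coordinates to a set of indices.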
Using a matrix addresser 44 provides the advantage of being extremely precise for addressing units 32 independently. The micro-mirror array and liquid-crystal matrix systems also have the benefit of being able to address the receiving units 32 concurrently, while a scanning mirror system is limited to sequential addressing. Alternatively, both types of control, successive or parallel, may be combined in a single system, as for example shown in
The state data may be encoded in the electromagnetic signals projected by the projector through modulation. In other variants, the state data may be defined by the absence or presence of an electromagnetic signal. For example, in the embodiment shown in
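The presence-or-absence encoding described in this paragraph reduces, at the addresser, to a binary mask over the target locations. The sketch below is a hypothetical illustration of that idea; the helper name and data shapes are assumptions.

```python
# Sketch of state data conveyed by mere presence or absence of the
# signal: a matrix-addresser cell directs light toward its target
# location only when the associated state is "on".
def addresser_mask(data_sets):
    """Map each spatial coordinate to True (project) or False (divert)."""
    return {coord: bool(state) for coord, state in data_sets}

mask = addresser_mask([((0, 0), 1), ((1, 0), 0), ((0, 1), 1)])
```

Units at masked-off locations receive no signal at all, which is itself the "off" state command.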
It will be readily understood that the projector may further include any number of optical components as required by the particular design of the device in order to further shape, direct or focus the electromagnetic light signals. A light directing element 66, or a beam shaper, such as a lens or a collimator, may be placed frontward of the matrix addresser 44. In some embodiments, the projector may for example include one or more collimators and/or a beam shaper for shaping the electromagnetic signals 26 into a desired pattern in order to address a specific group of units 32 dispersed in the environment, in a similar fashion as when a gobo lamp projector is used.
The projector 22 may also include a wireless receiver 58 in order to receive feedback signals 27 from a wireless transmitter 56 of the receiving units 32 (shown in
One skilled in the art will readily understand that the various illustrated combinations of the components of the projectors illustrated in
Receiving Units
Best shown in
Referring back to
While the plurality of receiving units shown in
With reference to
Various types of components can be envisaged. The state changing component 54 can include light emitting, light reflecting or light filtering members such as, but not limited to, light-emitting diodes (LEDs), organic LEDs, quantum dots, incandescent lights, neon lamps, liquid crystal displays (LCD), plasma displays, electronic paper displays, electrochromic displays, thermochromic displays, electromechanically-actuated light filters, electroluminescent elements and phosphorescent members. In other types of manifestations the change of state may also take the form of sound, shape, motion, odor or texture, and is therefore not restricted to visible changes.
The receiving unit 32 may be embodied by any device able to receive a data signal 26 from a projector 22 and to change state in response to the received signal 26. For example, and without limitation, the receiving unit 32 can consist of a mobile phone, a digital media player or a watch provided with an electromagnetic detector such as a camera, a light sensor or an infrared receiver, usable in conjunction with a projector and adapted to change state. In another example, the receiving units may be an array of speakers distributed within a given space and wirelessly updated using the projecting system 10 of the invention. In another embodiment of the invention, the receiving units can be motorized puppets distributed to visitors in a park, each puppet taking on different facial expressions or emitting different fragrances depending on where the visitor is located in the park. Yet another example includes a board game where each puck on the board changes shape depending on its location on the board, using electrostrictive actuators or electromagnets.
The receiving units 32 are preferably further adapted to receive, decode and express state commands which do not necessarily induce a physical manifestation. A receiving unit 32 may be sent other types of data, in addition to state data, from the projector 22. Non-restrictive examples include commands to switch the receiving units 32 into a low-power consumption (sleep) mode, grouping data for grouping units into various sub-groups, program update data for programs saved within a receiving unit, the current geo-location of the receiving units, and the like.
With reference to
In addition, in yet other embodiments, the units 32 can include more than one of these signal processing elements. For example, a unit may be provided with two receivers 34, each able to receive a signal at a specific wavelength. Associated with each of these receivers 34, a state changing component 54 can be provided. When detecting a signal at a first given wavelength, the unit 32 can blink, and when detecting a signal at another given wavelength, the unit 32 can vibrate. The digital and sensor inputs 80, 82 can be used to modify the state of the state changing component 54. For the sake of clarity not all elements are linked to the power source 64, although it is understood that elements requiring power are linked to a power source.
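The receiver-decoder-state-changer chain described above can be sketched as a small class. The command byte layout below is a pure assumption for illustration; only the chain itself (receive, decode, then act) reflects the description.

```python
# Sketch of a receiving unit's signal chain: the receiver captures
# symbols, the decoder rebuilds a command byte, and the state-changing
# component interprets it. The command values are hypothetical.
class ReceivingUnit:
    def __init__(self):
        self.state = "off"

    def receive(self, symbols):
        """Receiver + demodulator: rebuild a command byte (MSB first)."""
        command = 0
        for bit in symbols:
            command = (command << 1) | bit
        self._change_state(command)

    def _change_state(self, command):
        """State-changing component: map decoded commands to states."""
        self.state = {0x00: "off", 0x01: "blue", 0x02: "red"}.get(command, "off")

unit = ReceivingUnit()
unit.receive([0, 0, 0, 0, 0, 0, 1, 0])  # symbols decode to command 0x02
```

A unit with two receivers, as in the paragraph above, would simply run two such chains, each feeding its own state-changing component.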
Still referring to
With reference to
Thus, a cluster of such units 32 can form an LCD array which can be used to display static images, animations or video over an area larger than that of an individual LCD.
Of course, the shape and size of the individual receiving units 32 can vary within the plurality of units 32. In other words, each receiving unit 32 can have a shape different from the rest of the units 32. The units 32 can take different shapes, be made of different materials and have different types of physical and/or digital manifestations mechanisms within a group of units 32.
Still other examples of receiving units can include different types of digital components such as: memory card readers; USB ports; discrete sensors; momentary push buttons; tilt switches; continuous sensors such as microphones and accelerometers. These types of components advantageously allow users to interact with the projection system, and thus with other units of the system. For example, some of the receiving units 32 can be provided with microphones, allowing the units 32 to autonomously control their state, in addition to change state in response to signals sent by the projector 22.
Projection Method
According to another aspect of the invention, there is also provided a method for providing a distributed manifestation within an environment.
With reference to
In the following step, the signal generating module 24 generates a plurality of electromagnetic signals, each being representative of the state data from one of the data sets.
Next, each of the electromagnetic signals is projected by the projecting module 28 towards a target location within an environment, the target location corresponding to the spatial coordinate data associated with the state data transmitted by the electromagnetic signal.
A plurality of receiving units is distributed within the environment. At each target location where a receiving unit 32 is positioned, the corresponding electromagnetic signal is received by the receiver 34 of the unit 32, the state changing component 54 changing state in response to the state data of the received signal.
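The method steps above can be tied together in a single end-to-end sketch: generate the data sets, deliver each state to its target location, and let whichever unit occupies that location change state. Modelling the environment as a dictionary is of course a gross simplification, and all names are assumptions.

```python
# End-to-end sketch of the projection method. A "frame" gives the state
# per target location; units elsewhere receive nothing and stay dark.
def run_projection(frame, unit_positions):
    """Return the state adopted by the unit at each occupied location."""
    # Step 1: data generator - associate state data with coordinates.
    data_sets = [((x, y), state)
                 for y, row in enumerate(frame)
                 for x, state in enumerate(row)]
    # Step 2-3: projector - one signal per target location.
    environment = dict(data_sets)
    # Step 4: each positioned unit receives its signal and changes state.
    return {pos: environment[pos]
            for pos in unit_positions if pos in environment}

# Two units inside the projected area, one outside it.
states = run_projection([["blue", "red"]], [(0, 0), (1, 0), (5, 5)])
```

Note that the unit at (5, 5) simply appears in no result: a unit outside every target location receives no signal and performs no change of state.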
Preferably, and with reference to
Now referring to
According to the data received, the projectors 22a, 22b send different signals 26 depending on the location towards which their beam is directed. In this example, some of the micro-mirrors of the projector 22a project signals 26a towards target location 30a, the state data of these signals 26a instructing the receiving units to light up a blue LED. Simultaneously, other micro-mirrors of projector 22a project signals 26b towards target location 30b, the state data of these signals 26b instructing the receiving units 32 to turn on their red LED. The receivers 34 on the clothing detect the signals 26, decode the state change command embedded in the signal and transmit a command signal to the state changing component 54, triggering the LEDs to light up or turn off.
The other projector 22b can also simultaneously receive sets of data 16 from the laptop 14, converting the electrical signal into an electromagnetic signal 26c. The signals created will include specific state data, for example a blinking instruction for yellow LEDs. Each electromagnetic signal will then be directed towards a specific group of micro-mirrors, according to the coordinate information with which the state data was associated. The projector 22b will then transmit the signals 26c to the target location 30c. As can be appreciated, a projector 22 can simultaneously send different signals to different portions (or target locations) of the crowd. The clothing of a given individual will behave differently depending on its location within the auditorium.
The combined effects of the lighted clothing will create a visual display within the crowd. In other words, each individual in the crowd becomes a pixel, the crowd forming a giant display on which images can be projected, thanks to the receiving elements 32 the individuals are wearing. The receiving units 32 being independent from each other, an individual can move across the room without affecting the display. The LEDs on the clothing of the individual will be lit up or not as a function of the signals received, these signals being projected to specific portions of the room, without having to geographically localize the receiving units.
Of course, the projection system of the invention can have various applications, not only directed to crowd displays, but it can also be used for large area displays, dynamic camouflage, security systems, object tracking, games, etc.
Advantageously, the projection system of the invention is scalable in size and in resolution. The system is easy to deploy in different types of environments. The receiving units are mechanically and electrically autonomous, rendering them mobile. The projection system is simple, and the projector mechanism allows precisely addressing each receiving unit within a group of units.
Of course, numerous modifications could be made to the embodiments above without departing from the scope of the present invention.
This application is a continuation of commonly assigned, co-pending U.S. patent application Ser. No. 15/261,122, entitled “Devices and Methods For Providing A Distributed Manifestation In An Environment,” filed Sep. 9, 2016, which is a continuation of commonly assigned, U.S. patent application Ser. No. 14/743,706, entitled “Devices and Methods For Providing A Distributed Manifestation In An Environment,” filed Jun. 18, 2015, now U.S. Pat. No. 9,648,707, which is a continuation of commonly assigned U.S. patent application Ser. No. 14/271,825, entitled “Devices and Methods For Providing A Distributed Manifestation In An Environment,” filed May 7, 2014, now U.S. Pat. No. 9,286,028, which is a continuation of commonly assigned U.S. patent application Ser. No. 13/801,775, entitled “Devices and Methods For Providing A Distributed Manifestation In An Environment,” filed Mar. 13, 2013, now U.S. Pat. No. 8,740,391, which is a continuation of International Application No. PCT/CA2011/000700, filed Jun. 14, 2011, entitled “Devices And Methods For Providing A Distributed Manifestation In An Environment,” which claims priority to U.S. Provisional Patent Application Ser. No. 61/449,290, filed Mar. 4, 2011, entitled “A Projecting System And A Projecting Method.” The entirety of each of the documents listed above is incorporated herein by reference.