Embodiments hereof relate generally to multimedia content processing and presentation, and in particular, to enhancing audio-video content with haptic effects.
Haptic effects, commonly used in the video gaming industry, can provide tactile cues that enhance a user experience and make a virtual environment more interactive for users. For example, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users of electronic devices to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment. A user's experience in viewing a live event, such as a sporting event, can be enhanced by adding haptic effects to the audio and video components of the live event. Embodiments hereof relate to architectures for transferring haptic data from a live event to an end user device.
Embodiments hereof relate to a system for transferring haptic data from a live event to an end user device. The system includes a recorder, a sensor, a transformer, and an end user device. The recorder is configured to record video data of a live event. The sensor is configured to sense a parameter associated with the live event and output the sensed parameter as sensor data. The transformer is configured to receive the sensor data from the sensor and transform the sensor data into a signal including haptic data that is associated with the live event. The end user device is configured to receive the haptic data from the transformer and configured to receive the video data of the live event. The end user device includes a haptic output device configured to generate a haptic effect to a user based on the haptic data and includes a display configured to display the video data.
Embodiments hereof relate to a system including a recorder, a sensor, an onsite transmission station, a transformer, and an end user device. The recorder is configured to record video data of a live event. The sensor is configured to sense a parameter associated with the live event and output the sensed parameter as sensor data. The onsite transmission station is located onsite to the live event. The onsite transmission station is configured to receive the video data of the live event and includes a processor for processing the video data of the live event. The transformer is configured to receive the sensor data from the sensor and transform the sensor data into a signal including haptic data that is associated with the live event. The end user device is configured to receive the haptic data from the transformer and configured to receive the processed video data from the processor of the onsite transmission station. The end user device includes a haptic output device configured to generate a haptic effect to a user based on the haptic data and includes a display configured to display the video data.
Embodiments hereof also relate to a method of transferring haptic data from a live event to an end user device. Video data of a live event is recorded. A parameter associated with the live event is sensed and the sensed parameter is output as sensor data. The sensor data is received from the sensor and transformed into a signal including haptic data that is associated with the live event. The haptic data and the video data of the live event are received at an end user device, which includes a haptic output device configured to generate a haptic effect to a user based on the haptic data and a display configured to display the video data. At least one haptic effect is generated with the haptic output device based on the haptic data, and the video data is displayed on the display.
The foregoing and other features and advantages of the invention will be apparent from the following description of embodiments hereof as illustrated in the accompanying drawings. The accompanying drawings, which are incorporated herein and form a part of the specification, further serve to explain the principles of the invention and to enable a person skilled in the pertinent art to make and use the invention. The drawings are not to scale.
Specific embodiments of the present invention are now described with reference to the figures, wherein like reference numbers indicate identical or functionally similar elements. The following detailed description is merely exemplary in nature and is not intended to limit the invention or the application and uses of the invention. Furthermore, there is no intention to be bound by any expressed or implied theory presented in the preceding technical field, background, brief summary or the following detailed description.
Embodiments hereof relate to architectures for transferring haptic data from a live event to an end user device in order to enable viewing of live video along with live haptic effects. Stated another way, architectures described herein are suitable to embed or include the haptic data with the live video and/or audio broadcast to enhance the user's experience. It is desirable to be able to record a video of a live event while at the same time recording real-time aspects of the live event so that the real-time aspects of the event may be played back with the live video as haptic effects or sensations to provide an even more realistic and immersive experience for the user watching the live video. Recording real-time data from an object in a scene, and reproducing its experience haptically, is a more intuitive way of capturing or re-creating an experience when compared to offline artistic editing with programming tools and writing synchronization code to attempt to synchronize the haptic effects with the video. Further, recording and transforming real-world sensor data to produce haptic output is a more cost-efficient way to generate haptic content when compared to hand-authored synthesis of haptic signals.
More particularly, with reference to flow chart 100 of
In the embodiment of
In step 106 of
More particularly, in an embodiment, a single end user device may include speaker and/or display 124 as well as haptic output device 126 and may be utilized to output the processed audio and/or video data as well as the haptic data. For example, the remote user may be watching the sporting event on a mobile device such as a phone or tablet. In this embodiment, cellular or Wi-Fi technologies could be used for transmission and the visual, audio, and haptic content would be output via the same device. In another embodiment, the user may be watching the sporting event live in the stadium or arena. In this embodiment, the mobile device would provide additional immersion in the same location and would enhance the general atmosphere and setting of the game. The mobile device could be held in the user's hand, or kept in the user's pocket while still outputting haptic content.
More particularly,
As shown on the block diagram of
In operation, receiver 434 of end user device 430 receives signal 122 including haptic data or receives composite signal 328 and recognizes the haptic component thereof. Signal 122 including haptic data or the haptic component of composite signal 328 is then routed or transmitted to processor 432. Signal 122 including haptic data or the haptic component of composite signal 328 may, for example, include a direct haptic effect stream or set of commands indicating which haptic effects must be performed. In response to receiving signal 122 including haptic data or the haptic component of composite signal 328, processor 432 instructs haptic output device 426 to provide or output one or more haptic effects to a user. Processor 432 can decide what haptic effects to send to haptic output device 426 and in what order to send the haptic effects. For example, signal 122 including haptic data or the haptic component of composite signal 328 may include voltage magnitudes and durations. In another example, signal 122 including haptic data or the haptic component of composite signal 328 may provide high level commands to processor 432 such as the type of haptic effect to be output (e.g. vibration, jolt, detent, pop, etc.) by haptic output device 426, whereby the processor 432 instructs haptic output device 426 as to particular characteristics of the haptic effect which is to be output (e.g. magnitude, frequency, duration, etc.). Processor 432 may retrieve the type, magnitude, frequency, duration, or other characteristics of the haptic effect from memory 436 coupled thereto. In another embodiment, signal 122 including haptic data or the haptic component of composite signal 328 may provide a code or an identification number to processor 432 which corresponds to a haptic effect that is stored previously in the memory of processor 432.
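By way of a non-limiting illustration, the sketch below shows one way a processor such as processor 432 might interpret the different forms of haptic data described above: a direct stream of voltage magnitudes and durations, a high-level command naming an effect type, or a code referring to a previously stored effect. The names HapticCommand, EFFECT_LIBRARY, and drive_actuator() are hypothetical and do not correspond to any particular implementation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical lookup table standing in for effects previously stored in memory.
EFFECT_LIBRARY = {
    7: {"type": "vibration", "magnitude": 0.8, "frequency_hz": 150.0, "duration_ms": 120},
    9: {"type": "jolt", "magnitude": 1.0, "frequency_hz": 60.0, "duration_ms": 40},
}

@dataclass
class HapticCommand:
    voltage: Optional[float] = None     # direct drive value, if a raw stream is used
    duration_ms: Optional[int] = None
    effect_type: Optional[str] = None   # high-level command, e.g. "vibration"
    effect_id: Optional[int] = None     # code referring to a previously stored effect

def drive_actuator(magnitude, frequency_hz, duration_ms):
    """Placeholder for the call that would energize the haptic output device."""
    print(f"actuator: mag={magnitude} freq={frequency_hz} Hz dur={duration_ms} ms")

def handle_command(cmd):
    if cmd.voltage is not None:
        # Direct haptic effect stream: voltage magnitudes and durations.
        drive_actuator(cmd.voltage, 0.0, cmd.duration_ms or 0)
    elif cmd.effect_id is not None:
        # Code/identification number resolved against previously stored effects.
        effect = EFFECT_LIBRARY[cmd.effect_id]
        drive_actuator(effect["magnitude"], effect["frequency_hz"], effect["duration_ms"])
    elif cmd.effect_type is not None:
        # High-level command: the processor chooses particular characteristics.
        drive_actuator(0.5, 175.0, 80)

handle_command(HapticCommand(effect_id=7))
handle_command(HapticCommand(effect_type="vibration"))
```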
Haptic feedback enhances the user experience. As used herein, kinesthetic effects (such as active and resistive haptic feedback) and/or tactile effects (such as vibration, texture, and heat) are known collectively as “haptic feedback” or “haptic effects.” The collective haptic effects provide the user with a greater sense of immersion in the audio-video content as multiple modalities are being simultaneously engaged, e.g., video, audio, and haptics. For example, vibration effects, or vibrotactile haptic effects, may be useful in providing cues to users to alert the user to specific events, or provide realistic feedback to create greater sensory immersion within a simulated or virtual environment. Processor 432 of end user device 430 may be running software algorithms that further tailor or tune the haptic data to optimize for the specific type of end user device 430 that is rendering the haptic data.
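The device-specific tuning mentioned above could, for example, take the form of the following sketch, in which a generic effect is clamped to a capability profile for the rendering device; the profile names and numeric limits are illustrative assumptions rather than values taken from any actual device.

```python
# Hypothetical capability profiles for two kinds of end user devices.
DEVICE_PROFILES = {
    "phone_lra":  {"min_hz": 150.0, "max_hz": 250.0, "max_magnitude": 0.7},
    "tablet_erm": {"min_hz": 40.0,  "max_hz": 90.0,  "max_magnitude": 1.0},
}

def tune_effect(effect, device_type):
    """Clamp a generic effect to what the rendering device can actually reproduce."""
    profile = DEVICE_PROFILES[device_type]
    return {
        "magnitude": min(effect["magnitude"], profile["max_magnitude"]),
        "frequency_hz": max(profile["min_hz"], min(effect["frequency_hz"], profile["max_hz"])),
        "duration_ms": effect["duration_ms"],
    }

# The same generic vibration rendered on two different end user devices.
generic = {"magnitude": 0.9, "frequency_hz": 120.0, "duration_ms": 100}
print(tune_effect(generic, "phone_lra"))
print(tune_effect(generic, "tablet_erm"))
```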
Haptic effects provided by haptic output device 426 may include but are not limited to transient effects such as detents or vibrations. Haptic output device 426 may be a physical and/or a non-physical actuator. Possible physical actuators include but are not limited to eccentric rotating mass (“ERM”) actuators in which an eccentric mass is moved by a motor, linear resonant actuators (“LRAs”) in which a mass attached to a spring is driven back and forth, piezoelectric actuators, electromagnetic motors, vibrotactile actuators, inertial actuators, or other suitable types of actuating devices. Possible non-physical actuators include but are not limited to electrostatic friction (ESF), ultrasonic surface friction (USF), and other non-physical actuators. In another embodiment, haptic output device(s) 426 may use kinesthetic haptic feedback including, for example, solenoids to change the stiffness/damping of the housing of end user device 430, small air bags that change size in the housing, or shape changing materials.
The haptic effects that are output by haptic output device 426 can include but are not limited to varying degrees of vibrations, varying degrees of detents, or other types of haptic effects. If the end user device includes multiple haptic actuators, processor 432 may determine at which haptic actuator each haptic effect is to be executed and provided to the user. In such an embodiment, high level haptic parameters or streaming values are generated in the software code stored in the memory of the end user device. The parameters or values are processed by the processor and the appropriate voltage level for each haptic actuator is thereby generated. This allows the end user device to provide the appropriate haptic effects to the user and vary the amount or type of haptic effects through the different voltage levels that are generated for each haptic output device 426.
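As a simplified sketch of the multi-actuator case, the following example converts a high-level, normalized magnitude into a drive voltage for a chosen actuator; the actuator names and the linear 0-to-1 voltage scaling are assumptions made for illustration only.

```python
# Hypothetical actuator table; maximum drive voltages are illustrative values.
ACTUATORS = {
    "left":  {"max_voltage": 3.0},
    "right": {"max_voltage": 3.0},
    "back":  {"max_voltage": 5.0},
}

def route_effect(target, normalized_magnitude):
    """Convert a high-level magnitude (0..1) into a drive voltage for one actuator."""
    max_v = ACTUATORS[target]["max_voltage"]
    voltage = max(0.0, min(normalized_magnitude, 1.0)) * max_v
    # In a real device this value would be handed to the actuator's driver circuit.
    return voltage

print(route_effect("back", 0.6))   # stronger effect routed to the back actuator
print(route_effect("left", 0.2))   # lighter effect routed to the left actuator
```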
In another embodiment, end user device 430 may be a wearable haptic peripheral that is configured to be coupled to or positioned adjacent to a user. For example, end user device 430 is not required to include speaker 424A or display screen 424B but rather may be configured to be attached to a user's body or attached to clothes or furniture that are positioned adjacent to a user such that the haptic output device(s) of the wearable haptic peripheral can deliver haptic effects to the user's relevant body part (e.g., the body part where a punch has landed).
In another embodiment depicted in
Primary end user device 540 is an audio-video display device such as a television or TV that is configured to output signal 120 including the processed audio and/or video data or the audio and/or video components of composite signal 328. As shown on the block diagram of
As shown on the block diagram of
Multi-device system 538 may include means to ensure synchronization of the haptic content being output on secondary end user device 530 and the audio-video content being output on primary end user device 540. For example, primary end user device 540 and secondary end user device 530 may communicate directly with each other in order to ensure synchronization of the content being output on the respective devices. A playback locator signal 548 may be communicated between primary end user device 540 and secondary end user device 530 and function to determine a temporal reference point for signals 120, 122 or composite signal 328. The temporal reference point may be, for example, time-code, a counter, number of seconds of media playback consumed, current temporal playback position in the media file, or any other indication of playback position. A playback locator 546 of primary end user device 540 communicates a playback position on occasion to a playback locator 544 of secondary end user device 530, or vice versa, to ensure that the signal 120 or the audio and/or video component of signal 328 being output on a primary end user device 540 is rendered in a sufficiently synchronized fashion with signal 122 or the haptic component of signal 328 being output on a secondary end user device 530.
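A minimal sketch of such synchronization, assuming the temporal reference point is expressed in seconds of playback and assuming a hypothetical drift tolerance, might look as follows; the class and method names are illustrative only.

```python
class SecondaryDevicePlayback:
    """Keeps haptic playback on the secondary device anchored to the primary device."""

    def __init__(self, drift_tolerance_s=0.050):
        self.position_s = 0.0                    # current haptic playback position
        self.drift_tolerance_s = drift_tolerance_s

    def on_playback_locator(self, primary_position_s):
        """Handle a playback locator update received from the primary end user device."""
        drift = primary_position_s - self.position_s
        if abs(drift) > self.drift_tolerance_s:
            # Re-anchor haptic playback to the primary device's reported position.
            self.position_s = primary_position_s

playback = SecondaryDevicePlayback()
playback.on_playback_locator(12.375)   # locator reports audio-video playback at 12.375 s
print(playback.position_s)
```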
As previously mentioned, architectures disclosed herein with respect to the embodiments of
In broadcast haptics architecture 650, an audio-video recorder 608 records live audio and video data at the live event. More particularly, audio-video recorder 608 is a recording device configured to record or capture both images as video data and record or capture sound as audio data such as but not limited to a video camcorder, a smart phone, or the like. In any architecture embodiment described herein, video and audio of a scene or event may be separately captured or recorded (i.e., the audio and video capabilities may be on separate or different devices), or only video may be captured or recorded, or only audio may be captured or recorded. Further, in any architecture embodiment described herein, the audio-video recorder may include a system of multiple video and/or audio recording devices for multiple video and/or audio feeds. Live signal 612 including the raw audio and/or video data is transmitted to onsite processing or transmission station 652. In order to review or process the raw audio and/or video data to determine that the content thereof is appropriate or suitable for broadcasting, there is a buffer period, e.g., seven seconds, before live signal 612 is broadcast from onsite processing or transmission station 652. In this embodiment, a human operator watches live signal 612 at onsite processing or transmission station 652 and inserts haptic effects manually within the buffer period, as represented by insertion 654. Stated another way, the human operator authors haptic effects with known haptic effect authoring tools during the buffer period or mandated broadcast delay. With manual insertion of haptic effects, there is no need for smart equipment or sensors at the live event. Rather, the human operator inserts or embeds predetermined or associated haptic effects with corresponding events. As an example, the live event may be a boxing match and an operator is watching the match from onsite processing or transmission station 652. When the operator witnesses a head punch, she inserts the predetermined or associated head punch haptic effects in the transmission stream. As another example, the live event may be a tennis match and an operator is watching the match from onsite processing or transmission station 652. When the operator witnesses an athlete hit the ball, she inserts the predetermined or associated ball contact haptic effects in the transmission stream.
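For illustration only, manual insertion during the broadcast delay could be modeled roughly as follows; the event names, preset effect values, and list-based stream representation are hypothetical.

```python
# Hypothetical presets associating operator-witnessed events with haptic effects.
PRESET_EFFECTS = {
    "head_punch":   {"type": "jolt", "magnitude": 1.0, "duration_ms": 60},
    "ball_contact": {"type": "vibration", "magnitude": 0.5, "duration_ms": 30},
}

def insert_effect(transmission_stream, timestamp_s, event_name):
    """Embed the predetermined effect for an operator-witnessed event into the stream."""
    effect = dict(PRESET_EFFECTS[event_name], timestamp_s=timestamp_s)
    transmission_stream.append(effect)

stream = []                                 # stands in for the buffered transmission stream
insert_effect(stream, 93.2, "head_punch")   # operator sees a head punch at t = 93.2 s
print(stream)
```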
After live signal 612 including the raw audio and/or video data has been processed or reviewed at onsite processing or transmission station 652, and insertion 654 has taken place such that haptic effects are embedded into the transmission stream, composite signal 628 having both the processed audio and/or video data and embedded haptic data is transmitted to a broadcasting station 656. Broadcasting station 656 then transmits composite signal 628 having both the processed audio and/or video data and embedded haptic data to one or more end user devices 430. As shown on
As described above, broadcast haptics architecture 650 does not require smart equipment or sensors at the live event due to the manual insertion of haptic effects. However, in another embodiment hereof, one or more sensors (not shown) record sensor data at the live event and a human operator receives the sensor data at onsite processing or transmission station 652. The manually inserted haptic effects, as represented by insertion 654, may be based on the sensor data and/or live signal 612 including the raw audio and/or video data. Thus, in this embodiment, sensor data is captured and transmitted to onsite processing or transmission station 652 along with live signal 612 including the raw audio and/or video data. Exemplary sensors and sensed data to be utilized in this embodiment are described herein with respect to broadcast haptics architecture 750 and sensor 710.
In broadcast haptics architecture 750, an audio-video recorder 708 records live audio and video data at the live event and a sensor 710 records sensor data at the live event. In any architecture embodiment described herein, the sensor may be a system of multiple sensors for multiple sensor feeds (i.e., although described as a single sensor, multiple sensors may be utilized). Broadly stated, sensor 710 is a sensor configured to sense a parameter of an object, equipment, or person associated with the live event and to convert the sensed parameter into sensor data. In an embodiment, sensor 710 is coupled or attached to a piece of equipment 760 that is utilized at the live event, such as a boxing glove, a soccer ball, a tennis racket, a helmet, or other types of sporting event related items, and sensor 710 is configured to sense movement or speed of an object and convert the sensed movement into sensor data. Sensor 710 may be an accelerometer, a gyroscope, a contact pressure sensor, a Global Positioning System (“GPS”) sensor, a rotary velocity sensor, or some other type of sensor that is configured to detect changes in acceleration, inclination, inertia, movement, or location. In another embodiment hereof, sensor 710 is associated with one or more athletes participating in the live event and may be a physiological signals sensor (e.g., a plethysmograph). Physiological signals as used herein include any signals describing the physical or biological state of an athlete that might also be translated into his or her emotional state. For example, physiological signals include but are not limited to signals relating to galvanic skin response, blood pressure, body temperature, ECG signals, EMG signals, and/or EEG signals. In another embodiment, sensor 710 is configured to sense another parameter of an object or equipment associated with the live event and convert the sensed parameter into sensor data but is not required to be coupled to the object or equipment. For example, sensor 710 may be configured to record a specific audio and/or video signal or feed that is to be converted into haptic effects. For instance, sensor 710 may be a dedicated microphone for capturing specific noises (e.g., engine noises) that can be translated into haptic effects. Unlike signals from audio-video recorder 708, audio and/or video signals from sensor 710 are not broadcast and are not used to deliver visual and auditory feedback but are solely used for haptic purposes.
In this embodiment, all of the raw data (i.e., raw video, audio, and sensor data) is simultaneously transferred to a remote or offsite location for processing or transformation thereof. More particularly, live signal 712 including the raw audio and/or video data is transmitted to onsite processing or transmission station 752, and live signal 714 including the raw sensor data is also transmitted to onsite processing or transmission station 752. Without performing any processing thereon, onsite processing or transmission station 752 transmits live signals 712 and 714 having raw audio and/or video data and raw sensor data, respectively, to a broadcasting station 756. Broadcasting station 756 then transmits live signals 712 and 714 having raw audio and/or video data and raw sensor data, respectively, to a local or regional treatment station 758. Local or regional treatment station 758 includes a processor, i.e., processor or encoder 116 of
In an embodiment, raw data from sensor 710 is transformed into haptic data that includes haptic event detections or identifications depending on the type of event and the accompanying sensor information such as force and intensity. Raw data from sensor 710 is converted from n inputs to m outputs, i.e., haptic event detections or identifications, as described in more detail in U.S. Patent Application Publication 2015/0054727 to Saboune et al. (Attorney Docket No. IMM450), herein incorporated by reference in its entirety. The location, intensity, and nature of the haptic effect to be played can be relative to the characteristics of the haptic event identifications. For example, a knockout hit can be translated into a strong bump while a missed kick can be represented by a slight vibration. Other processing techniques may be utilized for converting or transforming raw sensor data into haptic data or commands, which may include but are not limited to vibrations, surface friction modulation, skin pinch, skin squeeze, and the like. For example, U.S. Patent Application Publication 2014/0205260 to Lacroix et al. (Attorney Docket No. IMM439) and U.S. Provisional Patent Application No. 61/968,799 to Saboune et al. (Attorney Docket No. IMM520P), each of which is herein incorporated by reference in its entirety, describe systems and methods for converting sensory data to haptic effects.
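A minimal, hypothetical sketch of one such n-input to m-output transformation is shown below, mapping accelerometer peaks to haptic event identifications with an intensity; the thresholds and scaling are illustrative assumptions and are not taken from the publications cited above.

```python
def detect_haptic_events(accel_samples_g):
    """Map raw acceleration samples (in g) to haptic event identifications."""
    events = []
    for i, g in enumerate(accel_samples_g):
        if g > 8.0:
            # A strong impact (e.g., a landed hit) becomes a strong bump.
            events.append({"sample": i, "effect": "strong_bump",
                           "intensity": min(g / 16.0, 1.0)})
        elif g > 2.0:
            # A lighter motion (e.g., a glancing or missed contact) becomes a slight vibration.
            events.append({"sample": i, "effect": "slight_vibration",
                           "intensity": g / 16.0})
    return events

print(detect_haptic_events([0.3, 2.5, 12.0, 0.8]))
```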
After local or regional treatment station 758 reviews or processes all of the raw data from signals 712, 714, local or regional treatment station 758 transmits the haptified live event to its local subscribers having end user devices 430. As shown on
As an example of broadcast haptics architecture 750, the live event is a soccer game which utilizes soccer balls equipped with sensors 710 that transmit signals including the raw sensor data through amplifiers (not shown) to onsite processing or transmission station 752. Onsite processing or transmission station 752 transmits the raw audio/video/sensor signals to numerous local or regional treatment stations 758. Local or regional treatment stations 758 process all the information and transmit the haptified live event to end user devices 430.
In broadcast haptics architecture 850, an audio-video recorder 808 records live audio and video data at the live event and a sensor 810 records sensor data at the live event. Similar to sensor 710, sensor 810 is a sensor configured to sense a parameter of an object, equipment, or person associated with the live event and to convert the sensed parameter into sensor data. In an embodiment, sensor 810 is coupled or attached to a piece of equipment 860 that is utilized at the live event, such as a boxing glove, a soccer ball, a tennis racket, a helmet, or other types of sporting event related items, and sensor 810 is configured to sense movement or speed of an object and convert the sensed movement into sensor data. Sensor 810 may be an accelerometer, a gyroscope, a contact pressure sensor, a Global Positioning System (“GPS”) sensor, a rotary velocity sensor, or some other type of sensor that is configured to detect changes in acceleration, inclination, inertia, movement, or location. In another embodiment hereof, sensor 810 is associated with one or more athletes participating in the live event and may be a physiological signals sensor (e.g., a plethysmograph). In another embodiment, sensor 810 is configured to sense another parameter of an object or equipment associated with the live event and convert the sensed parameter into sensor data but is not required to be coupled to the object or equipment as described above with respect to sensor 710.
In this embodiment, signal 814 having raw data from sensor 810 is first locally treated before or at onsite processing or transmission station 852 and then transmitted through onsite processing or transmission station 852 to a central processing server 864 responsible for distribution to different end user devices. More particularly, live signal 812 including the raw audio and/or video data is transmitted to onsite processing or transmission station 852, and live signal 814 including the raw sensor data is transmitted to an onsite transformer or processor 862. Onsite transformer or processor 862, i.e., transformer 118 of
Onsite processing or transmission station 852 includes a processor, i.e., processor or encoder 116 of
In broadcast haptics architecture 950, an audio-video recorder 908 records live audio and video data at the live event and a sensor 910 records sensor data at the live event. Similar to sensor 710, sensor 910 is a sensor configured to sense a parameter of an object, equipment, or person associated with the live event and to convert the sensed parameter into sensor data. In an embodiment, sensor 910 is coupled or attached to a piece of equipment 960 that is utilized at the live event, such as a boxing glove, a soccer ball, a tennis racket, a helmet, or other types of sporting event related items, and sensor 910 is configured to sense movement or speed of an object and convert the sensed movement into sensor data. Sensor 910 may be an accelerometer, a gyroscope, a contact pressure sensor, a Global Positioning System (“GPS”) sensor, a rotary velocity sensor, or some other type of sensor that is configured to detect changes in acceleration, inclination, inertia, movement, or location. In another embodiment hereof, sensor 910 is associated with one or more athletes participating in the live event and may be a physiological signals sensor (e.g., a plethysmograph). In another embodiment, sensor 910 is configured to sense another parameter of an object or equipment associated with the live event and convert the sensed parameter into sensor data but is not required to be coupled to the object or equipment as described above with respect to sensor 710.
In this embodiment, signal 914 having raw data from sensor 910 is first locally treated before or at onsite processing or transmission station 952 and then transmitted through onsite processing or transmission station 952 directly to different end user devices, without a central processing server as described in the embodiment of
More particularly, live signal 912 including the raw audio and/or video data is transmitted to onsite processing or transmission station 952, and live signal 914 including the raw sensor data is transmitted to onsite transformer or processor 962. Onsite transformer or processor 962, i.e., transformer 118 of
Onsite processing or transmission station 952 includes a processor, i.e., processor or encoder 116 of
In broadcast haptics architecture 1050, an audio-video recorder 1008 records live audio and video data at the live event and a sensor 1010A records sensor data at the live event. Sensor 1010A is similar to sensor 710 except that sensor 1010A includes a microprocessor or microcontroller, i.e., transformer 118 of
In this embodiment, transformation of sensor signals takes place inside the sensor itself. Sensor 1010A includes a microprocessor or microcontroller configured to process the raw sensor data and transform the raw sensor data into haptic data. An advantage of broadcast haptics architecture 1050 is that local sensor processing may result in improved haptic event detections or identifications due to the higher local loop rate. As described with respect to broadcast haptics architecture 750, raw data from sensor 1010A may be transformed into haptic data that includes haptic event detections or identifications depending on the type of event and the accompanying sensor information such as force and intensity, or other processing techniques for converting or transforming raw sensor data into haptic data or commands may be utilized.
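The in-sensor transformation could be sketched roughly as follows, assuming the sensor's microcontroller exposes a hypothetical read_accelerometer_g() call; the 1 kHz loop rate, threshold, and intensity scaling are illustrative assumptions.

```python
def read_accelerometer_g():
    """Placeholder for the microcontroller's accelerometer register read (value in g)."""
    return 0.0

def run_detection(num_samples=1000, sample_rate_hz=1000, impact_threshold_g=8.0):
    """Detect impacts at the sensor's high local loop rate; only compact haptic
    events, not the raw sample stream, leave the sensor."""
    events = []
    for i in range(num_samples):
        g = read_accelerometer_g()
        if g > impact_threshold_g:
            events.append({"t_s": i / sample_rate_hz,
                           "effect": "impact",
                           "intensity": min(g / 16.0, 1.0)})
    return events

print(run_detection(num_samples=10))
```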
More particularly, live signal 1012 including the raw audio and/or video data is transmitted to onsite processing or transmission station 1052, and signal 1022 including the haptic data is transmitted to onsite processing or transmission station 1052. Onsite processing or transmission station 1052 includes a processor, i.e., processor or encoder 116 of
Although not shown, broadcast haptics architecture 1050 may further include a broadcasting station and/or a central processing server for distribution to the end user devices. More particularly, onsite processing or transmission station 1052 may simultaneously transmit two separate or distinct signals including signal 1020 having processed audio and/or video data and signal 1022 having haptic data to the broadcasting station, and then the broadcasting station may transmit signals 1020, 1022 to the central processing server that is responsible for distribution to different end user devices 430. Stated another way, the method of delivery of signals 1020, 1022 to different end user devices 430 may be achieved as described in any of the previous embodiments, i.e., as described with respect to broadcast haptics architecture 850 of
In broadcast haptics architecture 1150, an audio-video recorder 1108 records live audio and video data at the live event and a sensor 1110 records sensor data at the live event. Sensor 1110 is similar to sensor 710 and is configured to sense a parameter of an object, equipment, or person associated with the live event and to convert the sensed parameter into sensor data. In an embodiment, sensor 1110 is coupled or attached to a piece of equipment 1160 that is utilized at the live event, such as a boxing glove, a soccer ball, a tennis racket, a helmet, or other types of sporting event related items, and sensor 1110 is configured to sense movement or speed of an object and convert the sensed movement into sensor data. Sensor 1110 may be an accelerometer, a gyroscope, a contact pressure sensor, a Global Positioning System (“GPS”) sensor, a rotary velocity sensor, or some other type of sensor that is configured to detect changes in acceleration, inclination, inertia, movement, or location. In another embodiment hereof, sensor 1110 is associated with one or more athletes participating in the live event and may be a physiological signals sensor (e.g., a plethysmograph). In another embodiment, sensor 1110 is configured to sense another parameter of an object or equipment associated with the live event and convert the sensed parameter into sensor data but is not required to be coupled to the object or equipment as described above with respect to sensor 710.
Live signal 1112 including the raw audio and/or video data is transmitted to onsite processing or transmission station 1152, and signal 1114 including the raw data from sensor(s) 1110 is also transmitted to onsite processing or transmission station 1152. Onsite processing or transmission station 1152 includes a processor, i.e., processor or encoder 116 of
Although not shown, broadcast haptics architecture 1150 may further include a broadcasting station and/or a central processing server for distribution to the end user devices. More particularly, onsite processing or transmission station 1152 may simultaneously transmit two separate or distinct signals including signal 1120 having processed audio and/or video data and signal 1114 including the raw data from sensor(s) 1110 to the broadcasting station, and then the broadcasting station may transmit signals 1120, 1114 to the central processing server that is responsible for distribution to different end user devices 430. Stated another way, the method of delivery of signals 1120, 1122 to different end user devices 430 may be achieved as described in any of the previous embodiments, i.e., as described with respect to broadcast haptics architecture 850 of
While various embodiments according to the present invention have been described above, it should be understood that they have been presented by way of illustration and example only, and not limitation. It will be apparent to persons skilled in the relevant art that various changes in form and detail can be made therein without departing from the spirit and scope of the invention. For example, although architectures described herein include transmission or broadcasting of video and/or audio data along with haptic data for a live event, architectures described herein may be modified to broadcast only haptic data from a live event similar to the way radios broadcast only audio data. For example, broadcasting only haptic data may be utilized for a sports game in which the score is haptically broadcast every minute or at another predetermined interval, or every time the score changes. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the appended claims and their equivalents. It will also be understood that each feature of each embodiment discussed herein, and of each reference cited herein, can be used in combination with the features of any other embodiment. All patents and publications discussed herein are incorporated by reference herein in their entirety.