Personalized sound management and method

Information

  • Patent Grant
  • Patent Number
    12,183,341
  • Date Filed
    Tuesday, February 7, 2023
  • Date Issued
    Tuesday, December 31, 2024
Abstract
A personalized sound management system for an acoustic space includes at least one transducer, a data communication system, one or more processors operatively coupled to the data communication system and the at least one transducer, and a medium coupled to the one or more processors. The processors access a database of sonic signatures and display a plurality of personalized sound management applications that perform at least one or more tasks among identifying a sonic signature, calculating a sound pressure level, storing metadata related to a sonic signature, monitoring sound pressure level dosage levels, switching to an ear canal microphone in a noisy environment, recording a user's voice, storing the user's voice in a memory of an earpiece device, storing the user's voice in a memory of a server system, or converting text received in texts or emails to voice using text-to-speech conversion. Other embodiments are disclosed.
Description
FIELD OF THE INVENTION

The invention relates in general to methods of managing sound, and particularly though not exclusively, is related to personalized sound management.


BACKGROUND OF THE INVENTION

The world of two hundred years ago was substantially different from the present-day Earth. Similarly, the acoustic environment that surrounds us is also changing. For example, the sounds of a large city have changed as the mode of transportation transitioned from walking and horse and buggy to cars, subways, and airplanes.


In general, humans are continuously inundated by a diversity of sounds. Many of the sounds are not critical to our lives but our brain processes these sounds and tries to distinguish between them. Background sound levels can also make it difficult to hear sounds that are important. Too much acoustic information can cause an auditory overload that can impact both the health and safety of an individual.


SUMMARY

The invention relates in general to methods and systems for implementing a suite of personalized sound applications for modifying a user's acoustic environment and more particularly, though not exclusively, to facilitating the adoption of the technology, ensuring the technology functions properly, protecting both manufacturers and consumers, and providing user selection and control over the management of sound.


At least one exemplary embodiment is directed to a method of personalized sound management comprising the steps of: selecting at least one of a plurality of personalized sound management applications through a client system where the user selects the at least one of the plurality of personalized sound management applications from a website; accepting a subscription contract for using the at least one of the personalized sound management applications; and loading the selected at least one of the plurality of applications from a server system to a device where the device has at least one microphone, at least one speaker, and a processor configured to identify sonic signatures where each sonic signature is identified using a Gaussian mixture model.


At least one exemplary embodiment is directed to a method of implementing personalized sound management comprising the steps of: recording sound with a microphone of a communication device; analyzing the sound for acoustic information relevant for personalized sound management applications; storing a sonic signature in a memory of the communication device; calculating a sound pressure level of the sonic signature; and attaching and storing metadata related to the sonic signature and sound pressure level including a time stamp and geocode.
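The record-analyze-tag flow of this embodiment can be sketched as follows. The record layout, the full-scale SPL reference, and the sample values are illustrative assumptions, not the patented implementation.

```python
import math
import time
from dataclasses import dataclass, field

@dataclass
class SonicSignatureRecord:
    """A captured sonic signature plus the metadata the method attaches."""
    samples: list                 # PCM samples, normalized to [-1.0, 1.0]
    sample_rate_hz: int
    spl_db: float = 0.0
    timestamp: float = field(default_factory=time.time)  # time stamp
    geocode: tuple = (0.0, 0.0)                          # (latitude, longitude)

def sound_pressure_level(samples, ref=1.0):
    """Estimate a level in dB (relative to full scale) from RMS amplitude."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20.0 * math.log10(max(rms, 1e-12) / ref)

# Store a signature and attach the calculated level and metadata.
record = SonicSignatureRecord(samples=[0.1, -0.2, 0.15, -0.05],
                              sample_rate_hz=16000)
record.spl_db = sound_pressure_level(record.samples)
```

A real device would convert the full-scale level to calibrated SPL using the microphone's sensitivity, which is omitted here.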





BRIEF DESCRIPTION OF THE DRAWINGS

Exemplary embodiments of the present invention will become more fully understood from the detailed description and the accompanying drawings, wherein:



FIG. 1 illustrates a block diagram of the interaction of personalized sound management in accordance with at least one exemplary embodiment;



FIG. 2 illustrates a block diagram of a partial list of applications for personalized sound management in accordance with at least one exemplary embodiment;



FIG. 3 illustrates a block diagram of the interaction of a personalized sound management enabled device with other devices in accordance with at least one exemplary embodiment;



FIG. 4 illustrates a flow chart of a process of providing personalized sound management in accordance with at least one exemplary embodiment;



FIG. 5 illustrates a flow chart of application testing in accordance with at least one exemplary embodiment;



FIG. 6 illustrates a flow chart of the testing of a personal sound management device in accordance with at least one exemplary embodiment;



FIGS. 7a and 7b are diagrams illustrating a consumer purchase process in accordance with at least one exemplary embodiment;



FIG. 8 illustrates a flow chart of registering a new device that includes personalized sound management applications in accordance with at least one exemplary embodiment;



FIG. 9 illustrates a flow chart of enabling the new device in accordance with at least one exemplary embodiment;



FIG. 10 illustrates a flow chart of updating a unit or device in accordance with at least one exemplary embodiment;



FIG. 11 illustrates a diagram of a device for implementing personalized sound management in accordance with at least one exemplary embodiment;



FIG. 12 illustrates a block diagram of a device for implementing personalized sound management in accordance with at least one exemplary embodiment;



FIG. 13 illustrates a diagram of a communication device or earpiece configured to provide sonic signatures to a sonic signature database in accordance with at least one exemplary embodiment; and



FIG. 14 illustrates a block diagram of a cell phone capturing a sonic signature and providing the sonic signature to a database of sounds in accordance with at least one exemplary embodiment.





DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS OF THE PRESENT INVENTION

The following description of exemplary embodiment(s) is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.


Processes, techniques, apparatus, and materials as known by one of ordinary skill in the art may not be discussed in detail but are intended to be part of the enabling description where appropriate. For example, specific computer code may not be listed for achieving each of the steps discussed; however, one of ordinary skill would be able, without undue experimentation, to write such code given the enabling disclosure herein. Such code is intended to fall within the scope of at least one exemplary embodiment.


Additionally, the sizes of structures used in exemplary embodiments are not limited by any discussion herein (e.g., the sizes of structures can be macro (centimeter, meter, and millimeter), micro (micrometer), nanometer size, and smaller).


Notice that similar reference numerals and letters refer to similar items in the following figures, and thus once an item is defined in one figure, it may not be discussed or further defined in the following figures.


In all of the examples illustrated and discussed herein, any specific values should be interpreted to be illustrative only and non-limiting. Thus, other examples of the exemplary embodiments could have different values.



FIG. 1 is a diagram of personal sound management in accordance with at least one exemplary embodiment. In our global community 100, each person is unique. The differences between individuals can be genetic, cultural, environmental, personal, or physical, to name just a few. The combination of these traits is what makes us unique. One of the main senses we rely on is our auditory sense. Hearing impacts every aspect of our life from communication to safety. How each individual perceives and uses sound is also unique.


From an acoustic perspective, sound pressure levels have been rising steadily. The diversity of sounds is also increasing. The human brain continuously processes the acoustic information provided by the ears. Both the sound pressure levels and the sound diversity put increasing strain on a person to determine what they need to hear versus what they are hearing. Ultimately, this “acoustic overload” can manifest itself in physical ailments and health risks such as stress, sleeplessness, and depression.


A person 102 has an acoustic space 104 from which a majority of sounds they receive emanate. Some of the sounds are useful to person 102 and some sounds may have no use. Acoustic space 104 can be large or small. Acoustic space 104 will change with time and location. For example, acoustic space 104 can be a room, a stadium, a forest, an automobile, a plane, or the ear canal of person 102. Personalized sound management (PSM™) 106 is the ability to modify what is received from acoustic space 104 thereby tailoring or personalizing the received acoustic information to meet the needs of person 102.


Devices 108 are a source of acoustic information within acoustic space 104. In general, device(s) 108 have a speaker, a microphone, or one or more of both. In general, application hardware and software 110 is incorporated in, although can be external also, device(s) 108 to allow personalization of acoustic space 104. Person 102 selects the applications and controls device(s) 108 to modify acoustic space 104 to meet their personal needs and wants. The benefit and utility of personalizing and managing sound received by person 102 will be disclosed in more detail.



FIG. 2 is a diagram illustrating a partial list of applications for personalized sound management in accordance with at least one exemplary embodiment. A user 202 has an acoustic space 204 that includes both real sounds and potential sounds. In many cases, acoustic space 204 can be inundated with a variety of sounds not under control of user 202. This lack of control over acoustic information and sound within acoustic space 204 can reduce quality of life and efficiency or, more seriously, affect the health and safety of user 202.


In at least one exemplary embodiment, a user 202 selects 201 at least one personalized sound management application from a personalized sound management applications list 206. Although only a partial list, applications such as Safe Space™ 216, EarGuard™ 218, Quiet Call™ 220, Ear Sticky™ 230, Hearable™ 222, Always On-Recording™ 224, Earbliss™ 226, Hear & Tell™ 232, and Ear Mail™ 228 will be used to illustrate how acoustic space 204 is personalized and managed for the benefit of user 202. A more detailed explanation of personalized sound management applications 206 will be described.


The selected applications of user 202 are loaded into ROM 212. An operating system 208 is configured operably with device 210 and ROM 212. Operating system 208, in conjunction with personalized sound management applications 206, provides an interface to the user for personalized sound management 214, meeting the needs of user 202 in managing acoustic space 204 and controlling device 210.


ROM 212 can be read-only memory such that the selected applications from personalized sound management list 206 cannot be tampered with or rewritten. However, in at least one exemplary embodiment, ROM 212 can also be memory that can be read and written to so that a user can change settings. Alternately, ROM 212 can be other types of memory such as FeRAM, phase-change memory, magneto-resistive memory, a hard drive, SRAM, DRAM, EPROM, EEPROM, and other non-volatile or read-only memories where the selected applications are secure from any type of downloading, tampering, or modification through physical or software protection. Monitoring of ROM 212 and the selected applications can also be added, where notification is sent or device 210 is disabled when an improper action is detected. ROM 212 ensures that an application will operate as disclosed for user 202.
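One way the monitoring described above could be realized is a cryptographic digest check over the stored application image. This is a minimal sketch; the image contents and the choice of SHA-256 are assumptions, not details from the specification.

```python
import hashlib

def rom_digest(rom_bytes: bytes) -> str:
    """Digest of the application image as recorded at provisioning time."""
    return hashlib.sha256(rom_bytes).hexdigest()

def verify_rom(rom_bytes: bytes, expected_digest: str) -> bool:
    """Return True if the stored applications are unmodified."""
    return rom_digest(rom_bytes) == expected_digest

# At provisioning: record the digest of the installed applications.
rom_image = b"personalized sound management application image"
expected = rom_digest(rom_image)

# At boot or periodically: verify; on a mismatch the device could send a
# notification or disable itself, as described above.
ok = verify_rom(rom_image, expected)
compromised = verify_rom(rom_image + b" patched", expected)
```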


Components of device 210 can be built in a single unit or operatively coupled units. For example, multiple devices of 210 can be wired, optically connected, wirelessly connected or a combination thereof. Operating system 208 can be run remotely to operate device 210 or reside within device 210. Similarly, ROM 212 can be located in device 210 or remote to device 210. In at least one exemplary embodiment, operating system 208 resides in ROM 212. Device 210 typically has at least one microphone or at least one speaker, or both. The at least one microphone provides acoustic information for use in conjunction with personalized sound management applications. The at least one speaker provides acoustic information to user 202 and acoustic space 204. Device 210 has a microprocessor (not shown) for running applications 206. In at least one exemplary embodiment the microprocessor is dedicated to running personalized sound management applications 206.


In general, personalized sound management applications are customizable point solutions that allow user 202 to handle a variety of tasks associated with managing acoustic space 204. The selection of personalized sound management applications will depend on the person, the device, and the acoustic space being managed. Non-limiting examples of devices that can use personalized sound management applications with the appropriate hardware and operating system, are earpieces, media devices, and vehicles.


Described hereinbelow are brief overviews of the user selected personalized sound management applications:

    • Safe Space™ 216 is an “intelligent hearing” application that detects and outputs a response based on the identification of a recognized sound. The concept of recognizing sounds is described as Sonic Signature Detection. Safe Space™ 216 also includes a user defined hierarchy and can provide responses based on the hierarchy.


A brief example is provided of how Safe Space™ 216 manages the acoustic space corresponding to the interior of an automobile. As is well known, automobiles are designed to have a quiet interior space. Sound insulation is deliberately placed around the car interior to attenuate sounds coming from the car exterior. Furthermore, automobile drivers often listen to high-volume music while driving. This combination makes it difficult to hear sounds such as emergency vehicles. People often get seriously injured because they first hear the emergency vehicle only when it is in close proximity, and the resulting panic causes an accident. All descriptions below of personalized sound management applications assume that the device using the application incorporates the hardware and software required to run the application and perform its function.


In this example, Safe Space™ 216 detects emergency vehicle horns when they are a substantial distance away. In at least one exemplary embodiment, a microphone is exterior to the car and can pick up sounds in the ambient or car exterior. Sonic signatures related to emergency vehicles such as fire trucks, ambulances, and police cars are stored in the system. Safe Space™ 216 analyzes sounds from the microphone. A response is provided if one of the stored sonic signatures is detected. In at least one exemplary embodiment, upon detecting a fire truck siren, Safe Space™ 216 can initiate playing the identified signal through the car stereo system for the car driver to hear and respond to. In at least one exemplary embodiment, Safe Space™ 216 can calculate the direction, distance, and street (through GPS) of the approaching emergency vehicle. The information is then provided visually or vocally to the driver. For example, the car stereo automatically turns down the music and states through the speaker system that an ambulance is coming eastbound on 3rd Street or that an ambulance is approaching from the right.
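The detect-and-respond flow above can be sketched as follows. The feature templates, the toy similarity score, and the threshold are placeholders; the specification's actual detection uses Gaussian mixture models, described later in this section.

```python
# Illustrative stored sonic signatures as short feature templates.
SIGNATURES = {
    "fire_truck_siren": [0.9, 0.1, 0.9, 0.1],
    "ambulance_siren":  [0.8, 0.2, 0.8, 0.2],
}

def match_score(features, template):
    """Toy similarity: negative mean absolute difference (higher is closer)."""
    return -sum(abs(f - t) for f, t in zip(features, template)) / len(template)

def respond(frame_features, threshold=-0.1):
    """Find the best-matching stored signature and choose a response."""
    name, score = max(
        ((n, match_score(frame_features, t)) for n, t in SIGNATURES.items()),
        key=lambda pair: pair[1],
    )
    if score >= threshold:
        # e.g. duck the stereo and announce the detected vehicle
        return f"alert: {name} detected, lowering stereo volume"
    return "no action"
```

For example, `respond([0.9, 0.1, 0.9, 0.1])` matches the fire truck template, while a featureless frame such as `[0.5, 0.5, 0.5, 0.5]` falls below the threshold and triggers no action.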


EarGuard™ 218 is a personalized sound management application that improves listening quality and safety. Hearing disorders are increasing at a very high rate. Many people live in an environment such as a noisy urban environment or an industrial manufacturing area where sound pressure levels are consistently high. Moreover, people are subjecting themselves to loud and sustained sounds. Examples of extreme or sustained sound exposure are portable media players using earbuds, sport shooting, and rock concerts. The trend is a growing population of people who have or will have hearing problems.


The ear is a very sensitive instrument having a large dynamic range that allows us to hear sounds ranging from a whisper to a shout. Subjecting the ear to man-made cacophony is now known to metabolically exhaust the highly specialized sensory cells in the inner ear, causing them to die and be replaced by scar tissue in the hearing organ. EarGuard™ 218 is an application that monitors the sound exposure of the user and protects the ear from damage.


Briefly, EarGuard™ 218 improves listening quality and safety by employing a personal hearing zone that measures sound levels within the ear and, in one exemplary embodiment, an intelligent level control to protect against over-amplification. EarGuard™ 218 includes a sound pressure level (SPL) dose management system. The SPL dose management system takes into account both transient and long-term sound. In at least one exemplary embodiment, when sound at the user's eardrum is above a certain threshold known to begin the process of metabolically exhausting the ear, the metric known as SPL dose increases. The SPL dose decreases when sound is below that threshold according to a recovery function that mirrors the recovery of the ear from excessive sound exposure. In at least one exemplary embodiment, EarGuard™ 218 will indicate to the user that damage to the ear is likely if the sound exposure is continued. In this scenario, the user has the control to take the appropriate action to protect his/her ears.


In at least one exemplary embodiment, an intelligent level adjustment system is a personalized sound management application that automatically estimates the preferred listening level of an audio content signal (e.g. speech or music audio from a device) depending on an analysis of the level, other acoustic features of the ambient environment, and an analysis of the audio content signal. Thus, a human-machine relationship is nurtured with this bi-directional control flow from human to the intelligent level adjustment system and from intelligent level adjustment system to the human user.


A substantial benefit of an application such as EarGuard™ 218 is the protection of the ear from damage. Furthermore, this safeguard, if propagated, will prevent hearing loss, thereby reducing the financial burden on the economy in years to come for the individual, business, and government for hearing-loss remedies. Thus, using EarGuard™ 218 could not only stem the growth in hearing-related problems but greatly reduce them over the years to come.


Quiet Call™ 220 is an application that allows user 202 having heard a voice message to respond to the remote caller through a non-verbal means. An example of a non-verbal response is a key-pad entered text message. The entered text message is converted to a speech audio message and is sent to the remote caller. The caller then receives the speech audio message.


An example of the utility of Quiet Call™ 220 is illustrated when user 202 is in an important business meeting but is required to provide input to another remote meeting. User 202 receives the voice message sent from someone in the remote meeting. User 202 responds by entering the response through his/her phone keypad; the text is converted to voice and sent to the person at the remote meeting who sent the original message. User 202 does not have to interrupt the meeting to listen to the voice message nor to reply. Thus, the meeting attended by user 202 can move forward with little or no loss of momentum that would occur if the meeting was disrupted by the call.


Hearable™ 222 is a speech enhancement application that improves voice communication. For example, Hearable™ 222 can be used with an earpiece device having at least two microphones, an ambient sound microphone for receiving sounds in the ambient environment and an ear canal microphone for receiving sounds in the ear canal. A common use for the earpiece would be with a communication device such as a cell phone or other phone system. The earpiece using the Hearable™ 222 application would normally use the ambient sound microphone for receiving and transmitting the user's spoken voice. In this mode, the user's voice will be natural sounding and easily recognizable on the receiving end.


In a noisy environment it can be difficult to carry on a telephone conversation. The ambient sound microphone will pick up the voice and the noise in the ambient environment. The earpiece will switch to the ear canal microphone when the ambient sound microphone detects a high background noise level. The user's voice is readily picked up by the ear canal microphone but the noise in the ambient environment is substantially reduced. Switching to the ear canal microphone allows the receiving party to clearly hear the user's voice. The problem with using the ear canal microphone is that the user's voice received in the ear canal sounds different because of frequency roll-off in the upper spectrum of the voice range. Although the user can be heard clearly, the user's voice may not sound right to the receiving end.


Hearable™ 222 is a personalized sound management application that improves the sound quality of the user's voice. Hearable™ 222 uses a combination of the sound received by the ambient sound microphone and the ear canal microphone to create a more natural sounding voice. The combination of the two signals is a function of the background noise level. Explaining further, the signal from the ambient sound microphone is used less as the background noise level increases. Hearable™ 222 allows a user to have a conversation in a noisy environment while providing a high quality voice signal that is intelligible and recognizable at the receiving end.
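The noise-dependent combination of the two microphone signals can be sketched as a crossfade. The linear weighting and the quiet/loud thresholds are illustrative choices; the specification does not fix a particular blending function.

```python
def mix_microphones(ambient, ear_canal, noise_db,
                    quiet_db=50.0, loud_db=90.0):
    """Blend the two microphone signals as a function of background noise.

    At or below quiet_db the ambient microphone is used alone (natural
    voice); at or above loud_db the ear canal microphone dominates.
    """
    w = (noise_db - quiet_db) / (loud_db - quiet_db)
    w = min(1.0, max(0.0, w))  # weight of the ear canal microphone
    return [(1.0 - w) * a + w * e for a, e in zip(ambient, ear_canal)]

# Quiet room: the output follows the ambient microphone.
quiet_mix = mix_microphones([1.0, 1.0], [0.0, 0.0], noise_db=40.0)

# Very noisy room: the output follows the ear canal microphone.
loud_mix = mix_microphones([1.0, 1.0], [0.0, 0.0], noise_db=95.0)
```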


Always On-Recording™ 224 is a personalized sound management application that acts as its name implies. A device using the Always On-Recording™ 224 application continuously records the most current audio information for user recall. The recording is stored in a buffer that allows the user to immediately access the audio information. The buffer has a finite amount of storage, so the recording time is a function of the buffer or memory available to the Always On-Recording™ 224 application.


Always On-Recording™ 224 provides utility for short durations of recording, for example, where a user is receiving information such as a phone number or driving directions. The user knows that the device employing Always On-Recording™ 224 has stored the acoustic information and can immediately listen again to the buffer contents, thereby repeating the phone number or driving directions. Similarly, if the user was discussing a contract term and wanted to know exactly what the other person said, the user could immediately re-listen to make sure that what they thought they heard is exactly what they heard.
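The finite rolling buffer described above maps naturally onto a fixed-capacity ring buffer; the class name and the tiny capacity below are illustrative.

```python
from collections import deque

class AlwaysOnRecorder:
    """Rolling audio buffer: recording time is bounded by available memory."""

    def __init__(self, sample_rate_hz, seconds):
        # deque with maxlen silently discards the oldest samples when full.
        self.buffer = deque(maxlen=sample_rate_hz * seconds)

    def record(self, samples):
        """Append new audio; the oldest samples fall off when full."""
        self.buffer.extend(samples)

    def recall(self):
        """Return the most recent audio for immediate listen-again."""
        return list(self.buffer)

rec = AlwaysOnRecorder(sample_rate_hz=4, seconds=2)  # tiny 8-sample buffer
rec.record(range(12))                                # 12 samples in, 8 kept
```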


Earbliss™ 226 is a personalized sound management application to provide acoustic isolation from someone who snores while allowing other sounds to be heard. A large percentage of the population suffers through sleepless nights because of the sounds generated by people with sleep apnea and, more generally, due to loud snoring. Moreover, sleep deprivation can have serious consequences related to health and disposition.


Earbliss™ 226 is a sleep zone technology that utilizes sonic signature detection to insulate sleep partners against intrusion from snoring while still providing awareness of priority sounds. In this example, a sonic signature is acoustic information related to the sounds of a person snoring. In at least one exemplary embodiment, Earbliss™ 226 is an application that is used in conjunction with earpieces.


Sealing the ear canal with ear plugs will attenuate the snoring but also blocks out all sounds. Hearing is one of the most vital senses that we have. Under normal conditions we cannot turn off our hearing which allows us to wake in a critical situation. Ear plugs will block out sounds of consequence. For example, the user may not hear their baby crying or a child leaving their room.


Earbliss™ 226 enables a user to attenuate snoring while hearing other sounds around them. The earpieces fit in the ear canal of the user to seal or partially seal the ear canal. The earpieces have an ambient sound microphone for receiving acoustic information from the ambient environment and an ear canal receiver for providing sound to the user's ear canal. As mentioned previously, the earpieces have a sonic signature related to the snorer's snoring stored in memory in the device.


The ambient sound microphone picks up all sounds in the ambient environment including the snorer's snoring. The earpiece processes all the acoustic information coming from the ambient sound microphone and looks for signals similar to stored sonic signatures. Pattern recognition approaches are applied based on the known sonic signatures to detect the snoring sounds from their corresponding sonic signatures. More specifically, sonic signatures can then be compared to learned models to identify a corresponding snoring sound. Once identified, the snoring sound is suppressed and not output to the ear canal by the ear canal receiver. Thus, the wearer of the earpieces does not hear the snoring.


Conversely, sounds in the ambient environment that are not recognized through the processing of acoustic information by the earpieces can be passed transparently to the ear canal receiver for reproduction within the ear canal. In this mode, the sound produced in the ear canal sufficiently matches the ambient sound outside the ear canal, thereby providing a “transparency” effect for all sound except the suppressed sonic signature (snoring). The earpieces can also enhance sound. For example, the earpieces having sonic signatures related to a fire truck siren or a baby crying can detect either signal and then amplify the signal (fire truck siren, baby crying) so as to make the wearer of the earpieces aware of the signal's detection. Thus, Earbliss™ 226 modifies the acoustic environment of the user to eliminate what does not need to be heard while allowing the user to be aware of other sounds in a normal context.
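The suppress/boost/pass-through routing described above can be sketched as a per-frame gain stage. The label is assumed to come from an upstream sonic-signature classifier (not shown), and the gain table is an illustrative assumption.

```python
def route_frame(frame, label, gains=None):
    """Apply the per-signature action to one audio frame.

    gains maps a detected signature to the gain applied before the frame
    is sent to the ear canal receiver: 0.0 suppresses (snoring), >1.0
    boosts (siren, baby crying), and unrecognized sound passes through
    at unity gain for the "transparency" effect.
    """
    if gains is None:
        gains = {"snoring": 0.0, "fire_truck_siren": 2.0, "baby_crying": 2.0}
    gain = gains.get(label, 1.0)  # transparency for unknown sounds
    return [gain * s for s in frame]
```

For example, a frame labeled `"snoring"` is zeroed before reaching the ear canal receiver, while an unlabeled speech frame passes through unchanged.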


Ear Mail™ 228 is a personalized sound management application for converting text to voice. In particular, Ear Mail™ 228 provides great utility to users of email and text messaging, although it is not limited to these examples. Email and text messaging are becoming a very popular form of communication among a large portion of the population. It is not always convenient, or in some cases prudent, to review written text depending on the situation.


Ear Mail™ 228 converts the text of a message or email to a speech audio message using a text-to-speech algorithm. The converted speech audio message is played through a loudspeaker coupled to a device using the Ear Mail™ 228 application. For example, Ear Mail™ 228 used in a smart phone coupled to the stereo system of a car through a Bluetooth connection could play back the texts or emails through the car speakers. The user could hear their messages while driving safely down the road.


Ear Sticky™ 230 is a personalized sound management application for recording information that can be saved for future use. In at least one exemplary embodiment, Ear Sticky™ 230 is a mobile communication application that can be used in a device such as a cell phone or an earpiece that is operably coupled to other devices. Ear Sticky™ 230 can record communication through a device or sounds in the ambient environment. Sounds in the ambient environment are recorded by an ambient sound microphone.


In a first example of an Ear Sticky™ 230 application, a conversation between a husband and wife occurs and a list of items to pick up at several stores is disclosed. The user of Ear Sticky™ 230 does not need to write down or remember this list. The conversation is being recorded and stored in a buffer. The user activates Ear Sticky™ 230 to store the recorded conversation to be reviewed at a later time. Thus, the user of Ear Sticky™ 230 could recall and listen to the list on the way home to ensure that the right items are picked up at the appropriate store.


In a second example of an Ear Sticky™ 230 application, a device records with an ambient sound microphone. For example, the user of Ear Sticky™ 230 comes up with a great concept to solve a problem he or she has been working on for some time. The user can enable Ear Sticky™ 230 and use the ambient sound microphone to record his or her voice to convey the idea, concept, or thoughts and store it for review at a later time. In general, Ear Sticky™ 230 provides utility and convenience in storing and recalling sounds in one's acoustic space.


Hear & Tell™ 232 is a personalized sound management application for recording a sound, training a Gaussian mixture model to learn features of the sound, and then storing the Gaussian mixture model in memory of the device. A user of Hear & Tell™ 232 can record a sound or provide a sound, referred to herein as a sonic signature. The device is operably coupled to at least one microphone to compare sounds received from the microphone against the stored sonic signature. The device can perform an operation that modifies the user's acoustic space 204 once a sound is identified as being similar to a sonic signature. Examples of several operations the device can perform are passing the detected signal through to the user, boosting the sound such that the user is made aware of the detected sound, rejecting the sound so the user does not hear the detected sound, attenuating the detected sound, and replacing the detected sound with an alternate sound, to name a few.


The Hear & Tell™ 232 application will store a Gaussian mixture model (GMM) for every sonic signature that it has been trained to recognize. Each GMM is completely specified by a mixture of mean vectors, a mixture of covariance matrices, and a mixture of weights.


An example of a warning sound (e.g. siren of emergency vehicle) will be used to further illustrate the Hear & Tell™ 232 learning application. Each GMM provides a model for the distribution of the feature statistics for each warning sound in a multi-dimensional space. Upon presentation of a new feature vector, the likelihood of the presence of each warning sound can be calculated. In at least one exemplary embodiment, each warning sound's GMM is evaluated relative to its anti-model, and a score related to the likelihood of that warning sound is computed in order to determine if a sound is detected. A threshold can be applied directly to this score to decide whether the warning sound is present or absent. Similarly, a sequence of scores can be relayed and used in a more complex rule set to determine absence or presence of the sound. Thus, Hear & Tell™ 232 allows a user to store, model, and train a device for sonic signatures under user control and selection thereby allowing a user to modify their acoustic space through the detection of a sound and response by the device.
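The model/anti-model scoring described above can be sketched with a diagonal-covariance GMM. The toy one-dimensional parameters and the zero threshold are illustrative assumptions; real models would be trained on multi-dimensional feature vectors.

```python
import math

def gmm_log_likelihood(x, weights, means, variances):
    """Log-likelihood of feature vector x under a diagonal-covariance GMM.

    As described above, each GMM is specified by mixture weights, mean
    vectors, and (here, diagonal) covariances.
    """
    total = 0.0
    for w, mu, var in zip(weights, means, variances):
        log_comp = 0.0
        for xi, mi, vi in zip(x, mu, var):
            log_comp += -0.5 * (math.log(2 * math.pi * vi) + (xi - mi) ** 2 / vi)
        total += w * math.exp(log_comp)
    return math.log(max(total, 1e-300))

def detect(x, model, anti_model, threshold=0.0):
    """Score a warning sound's GMM against its anti-model and threshold it."""
    score = gmm_log_likelihood(x, *model) - gmm_log_likelihood(x, *anti_model)
    return score, score > threshold

# Toy two-component "siren" model and a broad background anti-model,
# each a (weights, means, variances) triple over 1-D features.
siren = ([0.5, 0.5], [[3.0], [5.0]], [[0.5], [0.5]])
background = ([1.0], [[0.0]], [[4.0]])

score, present = detect([3.1], siren, background)   # near a siren mean
```

A sequence of such scores, rather than a single thresholded score, could feed the more complex rule set mentioned above.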




As described hereinabove, a suite of personalized sound management applications 206 are provided. The user 202 can select the desired applications and have control over their acoustic space 204. The selected applications are stored in ROM 212. Under user control, user 202 selects the parameters of personalized sound management applications 206 using operating system 208 for implementing applications 206 in device 210 for managing acoustic space 204 based on the individual's needs and wants.



FIG. 3 is a diagram illustrating a module 300 for implementing personalized sound management in accordance with at least one exemplary embodiment. Module 300 comprises an H-Chip™ 302, a ROM 304, an operating system 306, and user selected personalized sound management applications 308. In at least one exemplary embodiment, module 300 comprises the H-Chip™ 302 with ROM 304 built into H-Chip™ 302. Both operating system 306 and user selected personalized sound management applications 308 are stored on ROM 304. Alternatively, H-Chip™ 302 and ROM 304 can be separate chips, allowing for a larger block of memory.


H-Chip™ 302 is a microprocessor, DSP, or logic circuit that implements personalized sound management applications 308. H-Chip™ 302 is optimized for low power dissipation while running user selected personalized sound management applications 308. In at least one exemplary embodiment, H-Chip™ 302 can be a dedicated engine for running applications 308.


Module 300 comprises an application engine (firmware) and a dedicated processor that simplifies the integration of user selected and user controlled personalized sound management applications 308 into a device. In an exemplary embodiment, module 300 is integrated into each device in devices 314. Devices 314 typically, although not always, have at least one microphone and at least one speaker. User 310, in conjunction with module 300, can personalize how each device manages sound for the user.


An additional aspect of module 300 is that a third party manufacturer merely builds the acoustic portion of its device around module 300. In an exemplary embodiment, a manufacturer does not and cannot modify operating system 306 and user selected personalized sound management applications 308. Thus, the manufacturer saves time, effort, and money through the seamless integration of module 300 into a single hardware solution for maximum transportability into third-party form factors.



FIG. 4 is a diagram illustrating a process of providing personalized sound management in accordance with at least one exemplary embodiment. In general, the technology related to providing personalized sound management comprises hardware components and software. In at least one exemplary embodiment, the hardware 402 comprises transducers 404 and H-Chip™ 406.


Transducers 404 are speakers and microphones for respectively providing and receiving sound in an acoustic space. Depending on the application, the size and form factor of transducers 404 may be a critical design parameter. Transducers 404 can include a speaker for providing high fidelity sound to enhance a user's experience. Similarly, transducers 404 can include a microphone for receiving acoustic information and, in some cases, picking up sounds inaudible to the user.


H-Chip™ 406 can be a microprocessor, digital signal processor, logic circuit, or application-specific integrated circuit for implementing the software applications listed hereinabove and other programs related to personalized sound management. H-Chip™ 406 includes operating system 408, specific for managing the device as it relates to personalized sound management, and allows user control to adjust the parameters of each application for the user's specific needs.


Hardware 402 can be integrated into existing devices or next generation devices for adding utility, providing safety, and allowing personalization of a user's acoustic space. Providing hardware 402 enables device manufacturers 410 to rapidly integrate the technology into their products. This will quicken the adoption cycle of the technology for the benefit of the general public in health and safety, and for the individual through personalization. Hardware 402 can be a circuit board to which transducers 404 and H-Chip™ 406 are operatively attached.


Device manufacturers 410 can provide their own hardware 414 or use transducers 404 and H-Chip™ 406. The operating system and application software are incorporated 412 and stored in read only memory (ROM). In at least one exemplary embodiment, operating system 408 and the application software can be stored in ROM. Using read only memory for storage prevents device manufacturers 410 or consumers from tampering with the software code, thereby maintaining the integrity of the system and how it performs for the user. Note that herein, when referring to ROM storage, at least one exemplary embodiment can include RAM or other read/write storage methods.


People in the consumer product industry have stated that any new product, and even old products in the hands of consumers, may produce liability issues that the companies associated with the product must address. This often delays the introduction of a product or adds substantial cost to a product launch. In at least one exemplary embodiment, device manufacturers 410 provide their product for certification 416. In at least one exemplary embodiment, certification 416 is performed by a company independent of device manufacturers 410. Certification 416 is a process whereby a new product is exhaustively tested to ensure the device will perform to specification under a variety of conditions and tolerances. In particular, certification 416 tests the operating system and the device performance in implementing the applications related to personal sound management. Should the device not pass certification testing, it can be repaired or redesigned. The repaired or redesigned device can then undergo certification 416 to determine if it meets the test specifications.


In at least one exemplary embodiment, an insurance policy 418 may be provided covering the device manufacturers 410 using operating system 408, hardware 402, and the application software. This provides substantial benefits in that the device manufacturers 410 can deploy the technology with less risk. Moreover, each company providing devices will know that the personalized sound management system is similar or identical to others in the market. As mentioned hereinabove, the operating system and application software can be stored in ROM, ensuring they cannot be rewritten. Additional measures can be taken to determine if the software is being used out of context and, if so, to shut down the operation of the device.


Once tested and certified, device manufacturers 410 can manufacture, market, and sell devices with personalized sound management 424 and be covered from a liability perspective under provisions in the insurance policy 418. A consumer purchases 420 a certified device and may also be covered under the same insurance policy 418 or may have a second policy directed to the consumer. A consumer protected 422 under insurance policy 418 and holding a device with certification 416 will have confidence in the quality and reliability of the device. This will be discussed in greater detail hereinbelow. Thus, this process is very efficient in creating manufacturer adoption of the personalized sound management technology while protecting both device manufacturers 410 and the consumer.



FIG. 5 is a diagram illustrating application testing in accordance with at least one exemplary embodiment. Device manufacturers 502 develop, prototype, and manufacture products 506. Products 506 include hardware and operating system 504 for running and performing Personalized Sound Management (PSM™) applications 508.


In at least one exemplary embodiment, hardware and operating system 504 is capable of running all personalized sound management applications 508. Products 506 may or may not use all of personalized sound management applications 508. Similarly, an end user or customer of products 506 may or may not use all of personalized sound management applications 508.


Hardware and operating system 504 of products 506 are loaded with all personalized sound management applications 508. The hardware and operating system 504 are exercised in testing 510 to ensure that they can run personalized sound management applications 508 per specification. A product 506 may or may not be tested for all personalized sound management applications 508, depending on the capability of the device. Products 506 will be tested 510 for the implementation of all personalized sound management applications they are capable of running to meet user specifications for the product (i.e., product certification 512). In at least one exemplary embodiment, should a product run only a subset of personalized sound management applications, the product can be designed and tested to lock out the running of applications not specified for the particular product (even though hardware and operating system 504 can run personalized sound management applications 508).


An economy of scale for developing and manufacturing hardware and operating system 504 is achieved by having a number of different device manufacturers 502 standardize on the engine (hardware and operating system 504) for implementing personalized sound management. Consistency of product, the ability to expand the scope of a product, and lower cost through volume manufacturing are all achieved by this methodology. In the example where liability insurance may be purchased by the product certification company (or another company), testing 510 all personalized sound management applications 508 in products 506 ensures that hardware and operating system 504 perform per specification independent of the product in which they are placed. Furthermore, having little or no variation in the operation of hardware and operating system 504 and personalized sound management applications 508 minimizes risk. An added benefit is that a large statistical database of diversified products is generated that will allow improvements in managing personalized sound management devices, improved device performance, and lower liability risk. The consumer benefits by having a highly exercised and stable product that improves their health, safety, and quality of life.



FIG. 6 is a diagram illustrating testing of a personalized sound management product 602 in accordance with at least one exemplary embodiment. In at least one exemplary embodiment, all products incorporating personalized sound management applications are tested by a single entity. Personalized sound management product 602 can be developed and manufactured by a number of different companies. As previously mentioned, this allows the single entity, should it so desire, to purchase a liability insurance policy that covers the manufacturers making personalized sound management product 602 and consumers using personalized sound management product 602.


Personalized sound management product 602 is tested by certification laboratory 604. In at least one exemplary embodiment, certification laboratory 604 is independent from the different manufacturers that develop personalized sound management product 602, thereby providing unbiased testing. Testing 614 of hardware 606, software 608, and manufacturing 610 is performed on personalized sound management product 602. Testing 614 is related to the operation and performance of personalized sound management applications. Hardware testing includes the processor, transducers, and other elements that implement the personalized sound management of the device. Testing of software 608 includes the operating system and personalized sound management applications. Testing 614 includes reliability 616 and quality 618 testing. Testing 614 can also include human testing because, in some embodiments, the personalized sound management product may be worn in an orifice of the human body.


Upon passing testing 614, the personalized sound management product 602 is certified by certification laboratory 604. A product certified 620 can include some form of warranty, indemnification, and liability insurance for the device. Problems or issues with personalized sound management product 602 are reported to the company should it fail for any reason. Corrective action can be taken and the device retested to determine if the issue has been resolved.


In at least one exemplary embodiment, certification laboratory 604 or its parent company will brand the product by placing a trademark or company logo on the approved personalized sound management product 602. The trademark or logo represents the incorporation of its personalized sound management technology in the product (hardware, software, and intellectual property) as well as the certification indicating the quality and reliability of the device.



FIG. 7a is a diagram illustrating a consumer purchase process for a personalized sound management product in accordance with at least one exemplary embodiment. A consumer reviews a number of different devices, such as earpieces, in a display area. The display area allows the consumer to look at, feel, and touch the hardware. The consumer focuses on an earpiece 701 in the display area and wants to learn more about the device.


A sales person can demonstrate earpiece 701 allowing the consumer to test the functions and performance of the device. A demonstration room or test environment can be used to illustrate performance differentiation when compared to other earpieces. In at least one exemplary embodiment, earpiece 701 is an in-ear device that has a removable and disposable sealing section. This allows consumer testing using earpiece 701 while maintaining a sanitary condition by disposing of the sealing section after each test session.


The consumer indicates to the sales person that he or she would like to purchase earpiece 701. The sales person accesses a computer that is coupled to a server system for completing a purchase transaction. The consumer provides registration information that is entered into a database in a step 703. The consumer or the sales person can enter the information into the database. The registration information 705 comprises information such as personal, financial, business, and preference information. Personal information relates to data such as age, sex, home address, telephone number, and email address, which identifies the consumer. Financial information relates to the consumer's ability to pay, such as credit history, credit card for billing, job information, and banking institutions. Business information relates to a business purchase of the component. Similar to above, business identification and business financial information would be entered with the purchase. Preferences relate to things such as how the consumer wants to be contacted (email, phone), whether they want to be made aware of upgrades for their device, and other products they might have interest in. Other information or user experiences could also be collected during the registration process.


The unit selected by the consumer is then entered into the system in a step 707. In at least one exemplary embodiment, the system identifies the unit and provides a listing of the hardware, software, and subsystems that work with the earpiece 701. The consumer can look at the options available in a step 709 and select among these options. In at least one exemplary embodiment, the selection process can be facilitated by topic, occupation, or other characteristics by providing a list of typical user features applied to earpiece 701. For example, suppose the consumer is purchasing earpiece 701 for a multimedia player. Earpiece 701 provides a substantial improvement in noise attenuation, musical clarity, definition, and dynamic range. The consumer is interested in the safety aspect provided by earpiece 701 in alerting the user to an emergency event, and selects this topic on the kiosk. For example, earpiece 701 can detect and make the user aware of sirens, emergency sounds, alarms, etc. that would be difficult to hear when listening to music. By looking at the "safety" topic, the consumer can select appropriate application software and sonic signatures to alert the user to a safety situation for their specific device. The consumer can continue to look through other topics and select hardware and software that supplements and implements the desired user experience.


As mentioned previously, the options can be provided in a step 709 in many different formats. Another format of providing options is by occupation. For example, a dentist may be provided a package to outfit the entire office and patients. The earpieces substantially reduce noise related to drilling, which is both a psychological barrier to dentistry and a hearing loss mechanism for people working in a dentist's office. The occupation package can provide wireless communication between devices, allowing dentists, assistants, and patients to talk or listen to music even in a noisy environment. Replaceable in-ear sealing sections would allow the staff to maintain sanitary conditions with each new patient. After reviewing the options, the consumer selects the appropriate hardware and software to personalize their sound space (on the kiosk or computer) in a step 711.


In at least one exemplary embodiment, the selected application software provided to earpiece 701 is a subscription whereby the consumer pays a monthly fee to use the applications. Alternatively, the selected application software can be a license or purchase associated with the earpiece 701 whereby a one-time fee is paid. The consumer can discuss the terms of the agreement with the sales person or review the contract on the computer. If the consumer approves, the contract is signed or accepted and an initial payment is made for the hardware and software in a step 713. The consumer can have the subscription payment automatically billed to a credit card or other automatic payment method to ensure uninterrupted service.


The purchased earpiece is then connected via a wired or wireless connection to download the selected software, and is enabled in a step 715. A wired connection 717, such as a mini USB cable, is shown in the illustration. Wired connection 717 connects to a server system having the appropriate software. In at least one exemplary embodiment, the selections from step 711 can be stored in memory and used to direct the downloading of software to the device, verify that the software can be used in the device, and confirm that the appropriate software has been loaded.



FIG. 7b is a diagram illustrating a consumer purchase process for a personalized sound management product in accordance with at least one exemplary embodiment. A consumer 702 can purchase personalized sound management product hardware 704 and personalized sound management application software 706. Software 706 works with hardware 704 to personalize and manage sound for consumer 702 for providing utility, safety, and health benefits.


Hardware 704 can range from a complete product, such as a communication device, sound device, or earpiece, to ancillary components for enhancing the look of the device or adding further hardware features. In the event that consumer 702 purchases hardware 704, customer information 708 is required to continue with the purchase. In general, customer name, address, personal information, and financial information are typically required to purchase hardware 704. Other information can also be collected through a questionnaire or some form of incentive. Should customer information 708 exist and consumer 702 be in good standing, then generating customer information 708 is not required and the order process continues. Any issue that occurs in checking customer information 708 will be brought to consumer 702 for rectification. The hardware 710 that consumer 702 selected is verified and place order 712 is performed.


Personalized sound management application software 706 is a suite of applications that consumer 702 selects for their specific needs. Once customer information 708 has been established, consumer 702 is provided subscription information 714. Use of personalized sound management application software 706 is based on a periodic fee or subscription (for example, a monthly fee). Subscription information 714 informs consumer 702 as to the periodic subscription fees associated with the software 706, terms of use, bundled packages of applications, and other pertinent information to complete the decision process. Should consumer 702 approve of the terms, a contract 716 is provided to the customer to accept or decline. A bill 718 is generated when an order is placed, either through purchasing hardware 710, accepting a subscription to personalized sound management application software 706, or both. In at least one exemplary embodiment, the purchase process is web based, allowing consumer 702 to purchase from a client system via a communication path such as the internet or a communication network.



FIG. 8 is a diagram illustrating registering a new device that includes personalized sound management applications in accordance with at least one exemplary embodiment. A new device or unit is received by a customer 802. The manufacturer has loaded the preordered personalized sound management applications chosen by customer 802 into the unit. Typically, customer 802 is going to want to try the unit shortly after receiving it (e.g., the unit is operational for a predetermined period of time 804, for example 1 week). In at least one exemplary embodiment, customer 802 can try the unit for a predetermined time period 804. In general, the time period 804 (for example, 15-30 minutes) is long enough for customer 802 to get a feel for the unit but not long enough for sustained use. Furthermore, the unit will notify customer 802 through a transducer on the unit that they must register the device and that the unit will not be usable after the predetermined time period. Note that in at least one exemplary embodiment the customer would have the opportunity to demo certain features (806) not purchased.


In at least one exemplary embodiment, a website 810 is a portal to register the unit purchased by customer 802. Website 810 can have a web page devoted to registering new devices to simplify the process. Customer 802 may log in (if an account already exists) or provide personal information related to the registration process. A database 812 is accessed that includes both customer and order information in one or more databases. In at least one exemplary embodiment, the unit is in communication with the device manufacturer server 814 through website 810 or coupled thereto through a customer system.


Verification 816 verifies that all information is correct and that there are no pending issues 818 with a corrective action identified. The verification process 816 can check things such as the device serial number, features and applications ordered by customer 802, and personal/financial information. Once verification 816 is complete, it is determined at step 817 whether to register the device. Step 817 proceeds to step 820 if there are no pending issues 818. Registration 820 registers the unit to the owner after verification 816 is complete.



FIG. 9 is a diagram illustrating enabling a device in accordance with at least one exemplary embodiment. In general, registration 902 links customer information to a device as disclosed in FIG. 8. The registration process includes a comprehensive, legally compliant, state-specific Informed Consent system for collecting and storing the Informed Consent of users.


In at least one exemplary embodiment, the device will have at least one microphone. In the process of registration 902, the customer's voice is recorded by the device. The customer's voice can be stored on the device and in the customer database. The customer's voice can be used for a number of applications, including voice verification to use the device, thereby acting as a deterrent to others using or stealing the device.


As mentioned in FIG. 8, the unit or device can be enabled for a predetermined time period. Prior to enabling the new device, a contract 904 is provided for customer review. The contract outlines the terms of use for the device and notes that the product has been certified through a rigorous testing process. In at least one exemplary embodiment, liability coverage may be provided to the customer if the contract is accepted. Having the liability coverage in conjunction with the certification process is powerful in building consumer confidence in personalized sound management.


At step 905, it is determined whether the contract is accepted. The new device is not enabled 906 if the customer does not accept contract 904. The new device is enabled 908 if the customer accepts contract 904. In at least one exemplary embodiment, the registration and enabling of the unit is through a client system coupled to the business server. Thus, the unit is enabled remotely.



FIG. 10 is a diagram illustrating updating a unit or device in accordance with at least one exemplary embodiment. The device is enabled for use. In general, instructions or an operational video may be provided for a user to get acquainted with the device. Even with this help, a device may have so many features and applications that the user could get overwhelmed or frustrated trying to understand and operate the unit. In some cases, a user may not even want to try to learn how to operate all the features and may initially start out with the application that provides the most utility. The user may never get around to learning how to use all the capabilities of the device. Moreover, misuse of a device can lead to dangerous operating conditions that could cause a serious or harmful situation.


Acclimation 1002 is a process to ensure that a user of a device appreciates the entire feature set but more particularly, enables the features in a way that ensures proper usage in a safe and structured manner. In at least one exemplary embodiment, acclimation 1002 comprises two components, a learning module and a reminder module. The learning module is a system that “teaches” the user of the device the functional components of the device. In at least one exemplary embodiment, the learning module may incorporate a learning system that evaluates the user in the domain of a particular competency before certain functions or features of the device are enabled. For example, features of the device are enabled sequentially, either automatically after the user has utilized a particular function for a predetermined time (thereby showing competency in using the feature) or following a user competency evaluation, which can be invoked either manually (when the user is confident in using the feature) or automatically.


The reminder module may or may not be used in acclimation 1002. The reminder serves to remind the user of the correct usage of features and also of other features on the device which the user may not have used in a given time period. In at least one exemplary embodiment, an evaluation is invoked based on a predetermined time period. The evaluation may be for a single function or feature or a combination of different functions. The predetermined time period before evaluating can vary depending on the function. Alternatively, the evaluation may be invoked based on a particular recent event, such as a software upgrade, or a detected change in user usage. If the user fails the evaluation, the evaluation may be repeated, the relevant learning mode for the corresponding function to which the user's competency was evaluated may be repeated, or the corresponding function to which the user's competency was evaluated may be disabled. Thus, acclimation 1002 allows a user to learn the device at a friendly pace, ensures that the user knows how to use a feature, and protects the user from using the device in a manner that could be harmful.
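The sequential enabling described for the learning module can be sketched as a simple gating structure: a feature becomes available only once every preceding feature shows competency, either through accumulated usage time or an explicit evaluation. The feature names, the usage threshold, and the competency criteria below are illustrative assumptions, not the certified implementation.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    name: str
    usage_seconds: float = 0.0    # accumulated time spent using the feature
    unlock_after: float = 3600.0  # usage needed before auto-unlock (assumed value)
    passed_evaluation: bool = False

class Acclimation:
    """Sketch of sequential feature enabling for the learning module."""
    def __init__(self, features):
        self.features = features  # ordered: earlier features gate later ones

    def record_usage(self, index, seconds):
        self.features[index].usage_seconds += seconds

    def pass_evaluation(self, index):
        self.features[index].passed_evaluation = True

    def is_enabled(self, index):
        # A feature is enabled only when every preceding feature shows
        # competency: enough accumulated usage or a passed evaluation.
        for f in self.features[:index]:
            if not (f.usage_seconds >= f.unlock_after or f.passed_evaluation):
                return False
        return True

# Hypothetical feature ordering: only the first is available out of the box.
acc = Acclimation([Feature("volume control"),
                   Feature("ambient filtering"),
                   Feature("sonic signatures")])
```

A reminder module could reuse the same usage counters, re-invoking an evaluation when a feature's accumulated usage has not advanced within a predetermined period.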


The device can be coupled 1004 from a client system to the business server to update software on the device or purchase new hardware and software. As disclosed hereinabove, the user information is checked to determine if there are any issues that need to be rectified (e.g., missed payment, user does not match the device, etc.). In the event that the issues cannot be rectified, the device is disabled in a step 1006. In at least one exemplary embodiment, the user information, device, and personalized sound management applications correlate. The server checks for updates 1008 for the device and downloads updates to the device. The user goes through a similar process as described in FIGS. 7A-7B for a hardware or software purchase 1010, including agreeing to a contract to subscribe and to terms of use as described in FIG. 9.



FIG. 11 is a diagram illustrating a device for implementing personalized sound management in accordance with at least one exemplary embodiment. The device is generally indicated as an earpiece that partially seals or seals a user's ear canal 1124 and is constructed and operates in accordance with at least one exemplary embodiment of the invention. As illustrated, the earpiece comprises an electronic housing unit 1100 and a sealing unit 1108. The earpiece depicts an electro-acoustical assembly for an in-the-ear acoustic assembly, as it would typically be placed in an ear canal 1124 of a user 1130. The earpiece can be an in-the-ear earpiece, a behind-the-ear earpiece, a receiver-in-the-ear, a partial-fit device, or any other suitable earpiece type. The earpiece can partially or fully occlude ear canal 1124, and is suitable for use with users having healthy or abnormal auditory functioning.


The earpiece includes an Ambient Sound Microphone (ASM) 1120 to capture ambient sound, an Ear Canal Receiver (ECR) 1114 to deliver audio to an ear canal 1124, and an Ear Canal Microphone (ECM) 1106 to capture and assess a sound exposure level within the ear canal 1124. The earpiece can partially or fully occlude the ear canal 1124 to provide various degrees of acoustic isolation. In at least one exemplary embodiment, the assembly is designed to be inserted into the user's ear canal 1124, and to form an acoustic seal with the walls of the ear canal 1124 at a location between the entrance to the ear canal 1124 and the tympanic membrane 1126 (or ear drum). In general, such a seal is typically achieved by means of a soft and compliant housing of sealing unit 1108.


Sealing unit 1108 is an acoustic barrier having a first side corresponding to ear canal 1124 and a second side corresponding to the ambient environment. In at least one exemplary embodiment, sealing unit 1108 includes an ear canal microphone tube 1112 and an ear canal receiver tube 1110. Sealing unit 1108 creates a closed cavity of approximately 5 cc between the first side of sealing unit 1108 and the tympanic membrane 1126 in ear canal 1124. As a result of this sealing, the ECR (speaker) 1114 is able to generate a full range bass response when reproducing sounds for the user. This seal also serves to significantly reduce the sound pressure level at the user's eardrum 1126 resulting from the sound field at the entrance to the ear canal 1124. This seal is also a basis for a sound isolating performance of the electro-acoustic assembly.


In at least one exemplary embodiment and in broader context, the second side of sealing unit 1108 corresponds to the earpiece, electronic housing unit 1100, and ambient sound microphone 1120 that is exposed to the ambient environment. Ambient sound microphone 1120 receives ambient sound from the ambient environment around the user.


Electronic housing unit 1100 houses system components such as a microprocessor 1116, memory 1104, battery 1102, ECM 1106, ASM 1120, ECR 1114, and user interface 1122. Microprocessor 1116 (or processor 1116) can be a logic circuit, a digital signal processor, controller, or the like for performing calculations and operations for the earpiece. Microprocessor 1116 is operatively coupled to memory 1104, ECM 1106, ASM 1120, ECR 1114, and user interface 1122. A wire 1118 provides an external connection to the earpiece. Battery 1102 powers the circuits and transducers of the earpiece. Battery 1102 can be a rechargeable or replaceable battery.


In at least one exemplary embodiment, electronic housing unit 1100 is adjacent to sealing unit 1108. Openings in electronic housing unit 1100 receive ECM tube 1112 and ECR tube 1110 to respectively couple to ECM 1106 and ECR 1114. ECR tube 1110 and ECM tube 1112 acoustically couple signals to and from ear canal 1124. For example, ECR 1114 outputs an acoustic signal through ECR tube 1110 and into ear canal 1124, where it is received by the tympanic membrane 1126 of the user of the earpiece. Conversely, ECM 1106 receives an acoustic signal present in ear canal 1124 through ECM tube 1112. All transducers shown can receive or transmit audio signals to processor 1116, which undertakes audio signal processing and provides a transceiver for audio via a wired (wire 1118) or wireless communication path.


The earpiece can actively monitor the sound pressure level both inside and outside the ear canal 1124 and enhance spatial and timbral sound quality while maintaining supervision to ensure safe sound reproduction levels. The earpiece in various embodiments can conduct listening tests, filter sounds in the environment, monitor warning sounds in the environment, present notifications based on identified warning sounds, maintain audio content at a constant level relative to ambient sound levels, and filter sound in accordance with a Personalized Hearing Level (PHL).
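The supervision of safe reproduction levels described above amounts to tracking a noise dose over time. As an illustrative sketch only (the disclosure does not specify a dose formula), the following assumes a NIOSH-style criterion of 85 dBA for 8 hours with a 3 dB exchange rate; the function names are hypothetical:

```python
def allowed_hours(level_dba, criterion=85.0, exchange=3.0):
    """Permissible exposure time in hours under a NIOSH-style rule:
    8 h at the criterion level, halved for every `exchange` dB above it."""
    return 8.0 / (2.0 ** ((level_dba - criterion) / exchange))

def dose_percent(exposures):
    """Accumulate dose from (level_dBA, hours) pairs; 100% is the daily limit."""
    return 100.0 * sum(hours / allowed_hours(level) for level, hours in exposures)

# Example: 4 h at 85 dBA plus 1 h at 94 dBA
d = dose_percent([(85.0, 4.0), (94.0, 1.0)])   # → 150.0, i.e. over the daily limit
```

A dosimeter built this way would also let the earpiece estimate recovery time by projecting how long the remaining dose budget lasts at the current level.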


The earpiece can generate an Ear Canal Transfer Function (ECTF) to model the ear canal 1124 using ECR 1114 and ECM 1106, as well as an Outer Ear Canal Transfer Function (OETF) using ASM 1120. For instance, the ECR 1114 can deliver an impulse within the ear canal 1124 and generate the ECTF via cross correlation of the impulse with the impulse response of the ear canal 1124. The earpiece can also determine a sealing profile with the user's ear to compensate for any leakage. The earpiece also includes a Sound Pressure Level Dosimeter to estimate sound exposure and recovery times, which permits the earpiece to safely administer and monitor sound exposure to the ear.
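One conventional way to realize the cross-correlation step is sketched below. The white-noise probe signal and the function name `estimate_ectf` are illustrative assumptions, not elements of the disclosure; with a white probe, cross-correlating the ECM capture against the known probe recovers the canal's impulse response up to a scale factor:

```python
import numpy as np

def estimate_ectf(probe, captured, taps=64):
    """Estimate an ear-canal impulse response by cross-correlating the
    known probe signal (sent to the ECR) with the ECM capture.
    Assumes a white-noise probe so its autocorrelation is impulse-like."""
    r = np.correlate(captured, probe, mode="full")
    start = len(probe) - 1              # index of the zero-lag term
    return r[start:start + taps] / np.dot(probe, probe)

# Toy check: a known 3-tap "canal" filter is recovered from a noise probe
rng = np.random.default_rng(0)
x = rng.standard_normal(8192)           # probe delivered by the ECR
h = np.array([1.0, 0.5, -0.25])         # unknown canal response
y = np.convolve(x, h)                   # what the ECM would capture
h_hat = estimate_ectf(x, y, taps=3)     # close to h within noise error
```

In practice an earpiece would use a calibrated excitation and average several probes, but the estimator structure is the same.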


In at least one exemplary embodiment, the earpiece has a number of sonic signatures stored in memory. ASM 1120 provides acoustic information from the ambient environment to processor 1116. Processor 1116 analyzes the acoustic information for a sound similar to a stored sonic signature. Once a sound is identified, the earpiece provides a response based on the application. In a first exemplary embodiment, the earpiece reduces the level of music or a telephone call (or whatever dominant sound the earpiece is providing) and amplifies the identified signal (e.g., an ambulance or police car siren), thereby notifying the user of the approaching vehicle. In a second exemplary embodiment, the earpiece tells the user (through a synthesized voice) that an ambulance or police car is approaching, including the direction of the vehicle. The earpiece can also provide the identified signal along with the voice warning. Other variations are possible.
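A minimal sketch of this detect-and-notify behavior might score ambient frames against a stored template by normalized correlation and duck the media stream on a match. The threshold and duck gain below are illustrative assumptions, and real detectors would typically match spectral features rather than raw waveforms:

```python
import numpy as np

def signature_score(frame, template):
    """Normalized correlation between an ambient frame and a stored
    sonic-signature template (1.0 = perfect match)."""
    f = frame - frame.mean()
    t = template - template.mean()
    denom = np.linalg.norm(f) * np.linalg.norm(t)
    return float(np.dot(f, t) / denom) if denom else 0.0

def mix_for_playback(media, ambient, score, threshold=0.6, duck_gain=0.2):
    """Duck the media stream and pass the ambient sound through
    when a signature (e.g. a siren) is detected."""
    if score >= threshold:
        return duck_gain * media + ambient   # notify the user of the sound
    return media
```

The second embodiment (a synthesized voice announcement) would simply substitute a text-to-speech prompt for the raw ambient pass-through at the same decision point.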


Conversely, the earpiece can perform the opposite operation: it can identify a signal similar to a sonic signature and then attenuate it before providing it through ECR 1114. For example, suppose the user of the earpiece is a gun enthusiast. The user downloads a sonic signature related to a gunshot. The earpiece, upon identifying the sound of a gunshot, would attenuate the portion of the acoustic information provided by ASM 1120 that is similar to the sonic signature of the gunshot, while allowing other signals to come through. Thus, the user could engage in a conversation at the gun range with the gunshot sounds attenuated and the conversation passed through the earpiece, thereby protecting the ear from the loud sounds in this environment while hearing the conversation with more clarity.
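This attenuate-on-match behavior can be sketched as frame-wise spectral matching: frames whose coarse spectral profile resembles the stored signature are attenuated, and all other frames pass unchanged. The band count, threshold, and gain are illustrative assumptions:

```python
import numpy as np

def band_energy_profile(frame, bands=8):
    """Coarse spectral profile: energy in `bands` groups of FFT bins, normalized."""
    spec = np.abs(np.fft.rfft(frame)) ** 2
    prof = np.array([chunk.sum() for chunk in np.array_split(spec, bands)])
    total = prof.sum()
    return prof / total if total else prof

def attenuate_if_match(frame, signature_profile, threshold=0.9, gain=0.05):
    """Pass the frame through, but attenuate it when its spectral profile
    matches the stored signature (e.g. a gunshot) closely enough."""
    p = band_energy_profile(frame)
    similarity = float(np.dot(p, signature_profile) /
                       (np.linalg.norm(p) * np.linalg.norm(signature_profile) + 1e-12))
    return gain * frame if similarity >= threshold else frame
```

A broadband impulsive frame matching the stored profile is scaled down, while a narrowband conversation frame passes through at full level.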


In at least one exemplary embodiment, the earpiece can manually or automatically record sound, measure sound pressure levels, attach metadata (including a time stamp and geocode), and upload the information to a sound database when a communication path is present. The earpiece is capable of running the personalized sound management software disclosed herein. Moreover, hardware components such as the transducers (ECR 1114, ASM 1120, and ECM 1106), processor 1116, and sealing unit 1108 are provided to manufacturers as disclosed in FIG. 3, allowing a variety of companies to manufacture their own earpiece designs. It should be noted that although an earpiece is used as an example of a personalized sound management device, the components can be used in other communication and sound devices to personalize control of the sound.


In at least one exemplary embodiment, sealing unit 1108 is a replaceable unit that can be pulled out and replaced as a whole. Sealing unit 1108 performance can degrade over time due to particulate build-up, and there may also be sanitary or health reasons for replacing it periodically. In at least one exemplary embodiment, sealing unit 1108 replacement parts can be provided as part of a periodic subscription to the user, purchased over the counter in a store, or purchased through a web environment.



FIG. 12 is a block diagram of a device for implementing personalized sound management in accordance with at least one exemplary embodiment. A power supply 1205 powers components of the earpiece including microprocessor/DSP 1206 (or processor 1206) and a data communication system 1216. As illustrated, the earpiece can include the processor 1206 operatively coupled through data communication system 1216 to an ASM 1210, an ECR 1212, and an ECM 1208. Data communication system 1216 can include one or more Analog to Digital Converters (ADCs) and Digital to Analog Converters (DACs). The processor 1206 can utilize computing technologies such as a microprocessor, Application Specific Integrated Chip (ASIC), and/or digital signal processor (DSP) with associated Random Access Memory (RAM) 1202 and Read Only Memory (ROM) 1204. Other memory types such as Flash, non-volatile memory, SRAM, DRAM, or other like technologies can be used for storage with processor 1206. The processor 1206 can also include a clock to record a time stamp.


In general, a data communication system 1216 is a communication pathway to components of the earpiece and components external to the earpiece. The communication link can be wired or wireless. In at least one exemplary embodiment, data communication system 1216 is configured to communicate with ECM assembly 1208, ASM assembly 1210, visual display 1218, and user control interface 1214 of the earpiece. As shown, user control interface 1214 can be wired or wirelessly connected. In at least one exemplary embodiment, data communication system 1216 is capable of communication to devices exterior to the earpiece such as the user's mobile phone 1232, a second earpiece 1222, and a portable media player 1228. Portable media player 1228 can be controlled by a manual user control 1230.


The user's mobile phone 1232 includes a mobile phone communication system 1224. A microprocessor 1226 is operatively coupled to mobile phone communication system 1224. As illustrated, multiple devices can be wirelessly connected to one another, such as an earpiece 1220 worn by another person connecting to the user's mobile phone 1232. Similarly, the user's mobile phone 1232 can be connected to the data communication system 1216 of the earpiece as well as to the second earpiece 1222. This connection would allow one or more people to listen and respond to a call on the user's mobile phone 1232 through their respective earpieces.


As illustrated, a data communication system 1216 can include a voice operated control (VOX) module to provide voice control to one or more subsystems, such as a voice recognition system, a voice dictation system, a voice recorder, or any other voice related processor. The VOX module can also serve as a switch to indicate to the subsystem a presence of spoken voice and a voice activity level of the spoken voice. The VOX can be implemented in hardware, by discrete or analog electronic components, or as a software component. In one arrangement, the processor 1206 can provide the functionality of the VOX by way of software, such as program code, assembly language, or machine language.
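The VOX's switching role can be sketched as a simple energy-based voice activity detector with a hangover timer so the switch holds briefly across pauses in speech. The threshold and hangover length are illustrative assumptions, not values from the disclosure:

```python
class Vox:
    """Minimal energy-based voice-operated switch (a sketch of the VOX
    module's gating role; thresholds are illustrative assumptions)."""

    def __init__(self, threshold=0.01, hangover_frames=5):
        self.threshold = threshold
        self.hangover = hangover_frames
        self._count = 0

    def process(self, frame):
        """Return True while spoken voice is considered present."""
        energy = sum(s * s for s in frame) / len(frame)
        if energy >= self.threshold:
            self._count = self.hangover   # re-arm the hangover timer
        elif self._count > 0:
            self._count -= 1              # hold the switch briefly after speech
        return self._count > 0
```

The boolean returned per frame is what a subsystem (recorder, recognizer, dictation engine) would consume as the "voice present" indication.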


ROM 1204 can be used to store personalized sound management applications to minimize the possibility of modification and tampering of the code. The RAM 1202 can also store program instructions for execution on the processor 1206 as well as captured audio processing data. For instance, RAM 1202 and ROM 1204 can be off-chip and external to the processor 1206 and include a data buffer to temporarily capture the ambient sound and the internal sound, and a storage memory to save the recent portion of the history from the data buffer in a compressed format responsive to a directive by the processor. The data buffer can be a circular buffer that temporarily stores audio sound from a current time point back to a previous time point. It should also be noted that the data buffer can in one configuration reside on the processor 1206 to provide high speed data access. The storage memory can be non-volatile memory, such as Flash, to store captured or compressed audio data. The non-volatile memory can also be used to store sonic signatures.
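The circular data buffer described above can be sketched in a few lines; old samples fall off the back as new ones arrive, so the buffer always holds the most recent history available for saving to storage memory. The class name is an illustrative assumption:

```python
from collections import deque

class CircularAudioBuffer:
    """Ring buffer holding the most recent audio samples, so the recent
    history can be saved to storage memory on a processor directive."""

    def __init__(self, capacity):
        self._buf = deque(maxlen=capacity)   # oldest samples drop automatically

    def write(self, samples):
        self._buf.extend(samples)

    def snapshot(self):
        """Copy of the current history, oldest first, e.g. for compression."""
        return list(self._buf)
```

A capacity of a few seconds of samples would let the device retroactively capture a sound that has just been detected.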


Data communication system 1216 can include an audio interface operatively coupled to the processor 1206 and the VOX to receive audio content, for example from portable media player 1228, cell phone 1232, or any other communication device, and deliver the audio content to the processor 1206. The processor 1206, responsive to detecting voice-operated events from the VOX, can adjust the audio content delivered to the ear canal of the user of the earpiece. For instance, the processor 1206 (or the VOX of data communication system 1216) can lower a volume of the audio content responsive to detecting an event such as a sonic signature, and transmit the acute sound to the ear canal of the user. The processor 1206, by way of the ECM 1208, can also actively monitor the sound exposure level inside the ear canal and adjust the audio to within a safe and subjectively optimized listening level range based on voice operating decisions made by the VOX of data communication system 1216.


The earpiece and data communication system 1216 can further include a transceiver that can support, singly or in combination, any number of wireless access technologies including without limitation Bluetooth™, Wireless Fidelity (WiFi), Worldwide Interoperability for Microwave Access (WiMAX), and/or other short or long range communication protocols. The transceiver can also provide support for dynamic downloading and uploading over-the-air to the earpiece. It should be noted that next generation access technologies can also be applied to the present disclosure.


Data communication system 1216 can also include a location receiver that utilizes common technology, such as a GPS (Global Positioning System) receiver, that can intercept satellite signals, determine a location fix of the earpiece, and provide a geocode as an identifier for a recording or measurement such as a sound pressure level.


The power supply 1205 can utilize common power management technologies such as replaceable batteries, supply regulation technologies, and charging system technologies for supplying energy to the components of the earpiece and to facilitate portable applications. A motor (not shown), driven by a single-supply motor driver coupled to the power supply 1205, can improve sensory input via haptic vibration. As an example, the processor 1206 can direct the motor to vibrate responsive to an action, such as a detection of a warning sound or an incoming voice call.


The earpiece can further represent a single operational device or a family of devices configured in a master-slave arrangement, for example, a mobile device and an earpiece. In the latter embodiment, the components of the earpiece can be reused in different form factors for the master and slave devices.



FIG. 13 is a diagram of a communication device 1302 or earpiece 1310 configured to provide sonic signatures to sonic signature database 1308 in accordance with at least one exemplary embodiment. Collecting a large number of sounds around the world is a daunting task; as mentioned previously, no single group or business entity would have the ability to acoustically map the world on a continuous basis. In at least one exemplary embodiment, the collection of sonic signatures is achieved by mobilizing as many people as possible by making it simple to capture and provide a sound for sonic signature database 1308. This is achieved by adapting a common device, having the potential of reaching billions of people, for manually or automatically capturing sonic signatures and providing them to database 1308.


In at least one exemplary embodiment, communication device 1302 is a mobile communication device having a microphone for receiving sound. Examples of a communication device 1302 are a phone, cell phone, PDA, computer, two way radio, smart phone, and earpiece. Earpiece 1310 includes a microphone for capturing sound and the circuitry disclosed in FIG. 12. Earpiece 1310 is operably configured for recording sound and measuring sound pressure levels. The acoustic information received from the microphone of earpiece 1310 is recorded to memory residing in earpiece 1310 and tagged with metadata including a time stamp and geocode. The sound pressure level (SPL) is measured from the microphone signal through analog or digital signal processing, typically before any audio processing occurs within earpiece 1310. The SPL measurement of the recorded event is associated with the stored recording and metadata. Similarly, communication device 1302 is adapted for capturing sound 1312, including recording, measuring sound pressure level, storing, and tagging with metadata (including a time stamp and geocode). In at least one embodiment, communication device 1302 could include additional circuitry, similar to that disclosed in FIG. 12, specifically for generating a sonic signature database. Alternately, the existing circuitry within communication device 1302 can be adapted for recording sound and measuring sound pressure level. In particular, mechanical aspects of the communication device such as microphone placement and porting, as well as electronic audio processing (automatic gain control, equalization, etc.), are taken into account to ensure an accurate sound pressure level measurement and recording.
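Measuring SPL from a raw microphone frame reduces to an RMS computation mapped through a calibration point (a known reference RMS at a known level). The sketch below uses an illustrative 94 dB calibration value; the actual offset depends on the microphone sensitivity and porting discussed above:

```python
import math

def spl_db(samples, calibration_db=94.0, ref_rms=1.0):
    """Sound pressure level from a raw microphone frame.
    `calibration_db` maps a known reference RMS (`ref_rms`) to its SPL;
    both values are illustrative and device-dependent assumptions."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    if rms == 0.0:
        return float("-inf")
    return calibration_db + 20.0 * math.log10(rms / ref_rms)

# A frame at the reference RMS reads exactly the calibration level
frame = [1.0, -1.0, 1.0, -1.0]          # RMS = 1.0, so spl_db(frame) → 94.0
```

Because this runs before any automatic gain control or equalization, the reading stays proportional to the acoustic pressure at the port.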


In at least one exemplary embodiment, a user having communication device 1302 manually or automatically captures and provides a sonic signature to sonic signature database 1308. For example, a microphone on communication device 1302 is always enabled for receiving sound, and the sound can be stored in a buffer. The sound in the buffer is analyzed and, based on a variety of criteria, can be configured to be provided to database 1308. Criteria such as sound pressure level, time, frequency content, geographic location, or recognition of a sound (sonic signature detection) are but a few of the parameters that could be used to determine that the sound in the buffer is worthy of saving. Metadata such as a time stamp and geocode is attached automatically, but the user can also add information. In at least one exemplary embodiment, a communication path 1304 is opened and a link is made to website 1306, and more particularly to database 1308, where the stored sounds can be automatically uploaded. In at least one exemplary embodiment, the sonic signature, sound pressure level, and metadata could be sent immediately, if a communication path 1304 is available, to save memory. Further communication between website 1306 and the user of communication device 1302 or earpiece 1310 can take place to edit, identify, describe, and format the provided sound 1312 at a more convenient time. It should be noted that video information that includes audio information can also be provided in similar fashion as disclosed hereinabove; the audio information from the video can be used for sonic signature database 1308.
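The "worthy of saving" decision above can be sketched as a small predicate over the measured criteria. The thresholds and quiet-hours rule below are purely illustrative assumptions, standing in for whatever policy a given application would configure:

```python
from datetime import datetime, timezone

def worth_saving(spl_db, when=None, min_spl=70.0, quiet_hours=(23, 6)):
    """Illustrative capture criteria: keep loud sounds at any time, and keep
    moderately loud sounds during quiet hours, when unusual noise is more
    notable. All thresholds are assumptions, not values from the disclosure."""
    when = when or datetime.now(timezone.utc)
    start, end = quiet_hours
    in_quiet_hours = when.hour >= start or when.hour < end
    return spl_db >= min_spl or (in_quiet_hours and spl_db >= 50.0)
```

Additional terms for geographic location or signature recognition would simply be further clauses in the same predicate.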


Earpiece 1310 and communication device 1302 can be operably coupled together. A priority could be set up such that earpiece 1310 is the primary recorder of sound 1312 when enabled by the user. Earpiece 1310 can also be used with other devices, for example a portable media player. Earpiece 1310 can automatically or manually record sound 1312, measure SPL, and tag metadata as described hereinabove. The sonic signatures stored in earpiece 1310 could be sent to website 1306 and sonic signature database 1308 if earpiece 1310 is coupled to a communication path 1304, or through another operably coupled device that has a communication path. Alternately, the sonic signatures could be uploaded to website 1306 and sonic signature database 1308 at a more convenient time, for example over a wired or wireless link to the user's personal computer at home, allowing the user to also provide additional metadata before providing the information. Thus, a common device has been provided that is adapted for capturing, storing, measuring SPL, adding metadata including a time stamp and geocode, and uploading the acoustic information to a database, thereby including the broadest number of people across the largest geographic area for sound collection on a continuous basis.



FIG. 14 is a block diagram illustrating a cell phone 1404 capturing a sonic signature and providing the sonic signature to a database of sounds 1418 in accordance with at least one exemplary embodiment. Cell phone 1404 is enabled for capturing a sonic signature via a dedicated button or through an automatic process. In at least one exemplary embodiment, the user can select whether a sound is recorded, a sound pressure level measurement is recorded, or both are acquired. A default can be that both are acquired automatically.


A sound 1402 is received by a microphone on cell phone 1404 and stored in a buffer in a record sounds step 1406. The sound in the buffer is analyzed and a determination is made whether to save it. The sound pressure level (SPL) of the sonic signature is measured or calculated in a measure SPL step 1408. The user can also manually enter metadata 1410 via a keyboard to a metadata table, or can attach a vocal description in an accompanying audio stream. Metadata 1410 includes a time stamp and a geocode corresponding to the sonic signature.


In at least one exemplary embodiment, the sonic signature, sound pressure level, and metadata can be stored in memory 1412 that resides on cell phone 1404. A queue of sonic signatures 1414 can be stored in memory 1412 for uploading at an appropriate time. The user can initiate uploading of the queue of sonic signatures 1414 to database of sounds 1418 when a communication path is completed. In at least one exemplary embodiment, cell phone 1404 can automatically connect 1416 to servers of database of sounds 1418 and upload the queue of sonic signatures 1414 when a communication path is enabled. Although stored on database of sounds 1418, there may be an iterative process to determine whether the sonic signatures are in the correct format or are unique enough to be permanently stored. Thus, a large database of sounds 1418 can be collected worldwide by an automatic process, using a common device such as a cell phone, in a way that could not be accomplished by direct means. In at least one exemplary embodiment, the database of sounds 1418 is used in conjunction with personalized sound management applications configured to provide sonic signatures for identifying and providing a response to the identified sound.
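The record-queue-upload flow of FIG. 14 can be sketched as a record type carrying the audio, SPL, and required metadata, plus a queue that flushes when a communication path is available. The class names and the `send` callback are illustrative assumptions (in practice `send` might be an HTTPS POST to the database server):

```python
import time
from dataclasses import dataclass, field

@dataclass
class SonicSignatureRecord:
    """One captured event: audio, measured SPL, and required metadata."""
    audio: list
    spl_db: float
    timestamp: float = field(default_factory=time.time)
    geocode: tuple = (0.0, 0.0)   # placeholder lat/lon from a location receiver

class UploadQueue:
    """Queue records on-device; flush them to the sound database
    whenever a communication path becomes available."""

    def __init__(self, send):
        self._pending = []
        self._send = send             # transport callback (assumed)

    def enqueue(self, record):
        self._pending.append(record)

    def flush(self, connected):
        """Upload all pending records if connected; return the count sent."""
        if not connected:
            return 0
        sent = 0
        while self._pending:
            self._send(self._pending.pop(0))
            sent += 1
        return sent
```

Server-side validation (format checks, uniqueness screening) would then decide which uploaded records are permanently stored.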


To encourage retailers to actively engage and participate in selling Hearium Labs branded products, a business model has been developed based on an annuity revenue sharing methodology. Through this patent pending approach, for the first time, retailers, operators, and other suppliers will be able to carry a single audio earpiece or other device in their stores that can be sold to a wide and deep range of consumers. Utilizing an “iPod/iTunes” approach, suppliers will earn profit from the sale of the hardware in store and from Personalized Sound Management applications at and after the point of sale. As the consumer continues to personalize their earpiece or device over time, the supplier will earn residuals of up to 50% of the application revenues through a co-operative sharing program between the supplier and Hearium Labs.


To accomplish this, Hearium Labs has enabled the hardware solution to work within a web-based environment architected with commercial flexibility for consumer-driven point of sale purchase and activation. This approach allows the Personalized Sound Management applications to be remotely downloaded and activated on the hardware through any Internet-equipped PC and web-based browser. Consumers will thus be able to “enhance” their devices by adding both purchased and subscription applications and personalize their experience with technology designed to acclimate devices to the wearer and their environment.


In general, a process has been provided that eliminates barriers for mass adoption of a new technology. The result is that the technology can be provided by a number of manufacturers thereby having ubiquitous availability at release. Universal compatibility is also provided by using the same hardware and software in each new product having personalized sound management. Certification ensures that each device is manufacturable and performs to specification. Manufacturers and consumers benefit because liability coverage may be provided. Consumers can adapt to the technology rapidly through an acclimation process.


While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.

Claims
  • 1. A device comprising: a microphone configured to measure sound and generate a microphone signal; a user interface; a memory configured to store instructions; a data buffer configured to store at least a portion of the microphone signal; a second memory configured to store data; a processor operatively coupled to the memory and to the data buffer, wherein the processor is configured to execute the instructions to perform operations comprising: receiving the microphone signal; sending a portion of the microphone signal to the data buffer; analyzing the data buffer for detecting a sonic signature; retrieving sound data from the microphone if the sonic signature is detected; sending a portion of the sound data to a remote server; determining a user request in the portion of the sound data, wherein the determination is accomplished by the remote server or a second server; retrieving a first data in response to the user request, wherein the remote server or the second server or a third server retrieves the first data; sending the first data to the device and storing the first data in the second memory; retrieving the first data from the second memory; and sending the first data to the user interface.
  • 2. The device according to claim 1, wherein the user interface is a speaker.
  • 3. The device according to claim 2, wherein the operations further comprise: converting the first data into a first acoustic signal if the first data is not already in the form of an acoustic signal.
  • 4. The device according to claim 3, wherein the operations further comprise: sending the first acoustic signal to the speaker.
  • 5. The device according to claim 1, wherein the user interface is a display.
  • 6. The device according to claim 5, wherein the operations further comprise: converting the first data into first visual data if the first data is not already in the form of visual data.
  • 7. The device according to claim 6, wherein the operations further comprise: sending the first visual data to the display.
  • 8. The device according to claim 1, wherein the sonic signature is at least one word.
  • 9. The device according to claim 8, wherein the user request is a verbal request from the user to retrieve at least one of a song, weather predictions, news, a video, or a combination thereof.
  • 10. The device according to claim 1, wherein the user interface is an earphone.
  • 11. The device according to claim 1, wherein the portion of the sound data includes meta data.
  • 12. The device according to claim 1, wherein the first data includes meta data.
  • 13. The device according to claim 1, wherein the user request is determined by the remote server by detecting additional sonic signatures in the portion of the sound data.
  • 14. The device of claim 1, wherein the user request is determined by use of a Gaussian mixture model or learned model.
  • 15. A method comprising: receiving a microphone signal; sending a portion of the microphone signal to a data buffer; analyzing the data buffer for detecting a sonic signature, wherein the sonic signature is at least one word; recording sound data if the sonic signature is detected; adding meta data to a portion of the sound data to generate a mixed data signal; sending the mixed data signal to a remote server; determining a user request from the mixed data signal, wherein the determination is accomplished by the remote server; retrieving a first data in response to the user request, wherein the remote server retrieves the first data; sending the first data from the remote server to the device and storing the first data in a memory; retrieving the first data from the memory; and sending the first data to the user interface.
  • 16. The method according to claim 15, wherein the user interface is a speaker.
  • 17. The method according to claim 16, further comprising: converting the first data into a first acoustic signal if the first data is not already in the form of an acoustic signal; andsending the first acoustic signal to the speaker.
  • 18. The method according to claim 15, wherein the user interface is a display.
  • 19. The method according to claim 18, further comprising: converting the first data into first visual data if the first data is not already in the form of visual data; andsending the first visual data to the display.
  • 20. The method according to claim 15, wherein the user request is determined by the remote server by detecting additional sonic signatures in the portion of the sound data.
CROSS-REFERENCE TO RELATED APPLICATIONS

This application is a continuation of and claims priority to U.S. application Ser. No. 17/736,180, filed 4 May 2022, which is a continuation of and claims priority benefit to U.S. application Ser. No. 17/203,731, filed 16 Mar. 2021, now U.S. patent Ser. No. 11/443,746, which is a continuation of U.S. application Ser. No. 16/671,689, filed 1 Nov. 2019, now U.S. patent Ser. No. 10/997,978, which is a continuation of U.S. application Ser. No. 14/846,994, filed 7 Sep. 2015, now U.S. patent Ser. No. 10/529,325, which is a continuation of U.S. application Ser. No. 12/560,097 filed on 15 Sep. 2009, now U.S. Pat. No. 9,129,291, and claims the benefit of U.S. provisional patent application No. 61/098,914 filed 22 Sep. 2008. The disclosures of the aforementioned applications are incorporated herein by reference in their entireties, and priority is claimed to all.

7933423 Baekgaard Jensen et al. Apr 2011 B2
7936885 Frank May 2011 B2
7953241 Jorgensen May 2011 B2
7983433 Nemirovski Jul 2011 B2
7983907 Visser Jul 2011 B2
7986791 Bostick Jul 2011 B2
7986802 Ziller Jul 2011 B2
7995773 Mao Aug 2011 B2
8014553 Radivojevic et al. Sep 2011 B2
8018337 Jones Sep 2011 B2
8019091 Burnett Sep 2011 B2
8027481 Beard Sep 2011 B2
8045840 Murata et al. Oct 2011 B2
8047207 Perez Nov 2011 B2
8050143 Nicholas Nov 2011 B2
8068627 Zhan Nov 2011 B2
8077872 Dyer Dec 2011 B2
8081780 Goldstein Dec 2011 B2
8085943 Bizjak Dec 2011 B2
8086093 Stuckman Dec 2011 B2
8111839 Goldstein Feb 2012 B2
8111840 Haulick Feb 2012 B2
8111849 Tateno Feb 2012 B2
8116472 Mizuno Feb 2012 B2
8116489 Mejia Feb 2012 B2
8121301 Suzuki Feb 2012 B2
8140325 Kanevsky Mar 2012 B2
8144881 Crockett Mar 2012 B2
8144891 Her Mar 2012 B2
8150044 Goldstein Apr 2012 B2
8150084 Jessen Apr 2012 B2
8160261 Schulein Apr 2012 B2
8160273 Visser Apr 2012 B2
8162846 Epley Apr 2012 B2
8184823 Itabashi May 2012 B2
8186478 Grason May 2012 B1
8189803 Bergeron May 2012 B2
8194864 Goldstein et al. Jun 2012 B2
8194865 Goldstein Jun 2012 B2
8199919 Goldstein et al. Jun 2012 B2
8199942 Mao Jun 2012 B2
8204435 Seshadri Jun 2012 B2
8208609 Harris Jun 2012 B2
8208644 Goldstein et al. Jun 2012 B2
8208652 Keady Jun 2012 B2
8213629 Goldstein Jul 2012 B2
8218784 Schulein Jul 2012 B2
8221861 Keady Jul 2012 B2
8229128 Keady Jul 2012 B2
8229148 Rasmussen Jul 2012 B2
8229513 Ibe Jul 2012 B2
8251925 Staab et al. Aug 2012 B2
8254586 Voix Aug 2012 B2
8254591 Goldstein Aug 2012 B2
8270629 Bothra Sep 2012 B2
8270634 Harney Sep 2012 B2
8306235 Mahowald Nov 2012 B2
8312960 Keady Nov 2012 B2
8322222 Goldberg Dec 2012 B2
8340309 Burnett Dec 2012 B2
8351634 Khenkin Jan 2013 B2
8369901 Haulick Feb 2013 B2
8374361 Moon Feb 2013 B2
8385560 Solbeck Feb 2013 B2
8391534 Ambrose et al. Mar 2013 B2
8401200 Tiscareno Mar 2013 B2
8411880 Wang Apr 2013 B2
8437492 Goldstein et al. May 2013 B2
8447370 Ueda May 2013 B2
8462969 Claussen Jun 2013 B2
8462974 Jeong Jun 2013 B2
8472616 Jiang Jun 2013 B1
8477955 Engle Jul 2013 B2
8493204 Wong et al. Jul 2013 B2
8515089 Nicholson Aug 2013 B2
8522916 Keady Sep 2013 B2
8548181 Kraemer Oct 2013 B2
8550206 Keady et al. Oct 2013 B2
8554350 Keady et al. Oct 2013 B2
8577062 Goldstein Nov 2013 B2
8594341 Rothschild Nov 2013 B2
8600067 Usher et al. Dec 2013 B2
8611548 Bizjak Dec 2013 B2
8611560 Goldstein Dec 2013 B2
8625818 Stultz Jan 2014 B2
8625819 Goldstein Jan 2014 B2
8631801 Keady Jan 2014 B2
8649540 Killion et al. Feb 2014 B2
8652040 LeBoeuf Feb 2014 B2
8657064 Staab et al. Feb 2014 B2
8678011 Goldstein et al. Mar 2014 B2
8693704 Kim Apr 2014 B2
8718288 Woods May 2014 B2
8718305 Usher May 2014 B2
8718313 Keady May 2014 B2
8744091 Chen et al. Jun 2014 B2
8750295 Liron Jun 2014 B2
8774433 Goldstein Jul 2014 B2
8774435 Ambrose et al. Jul 2014 B2
8792669 Harsch Jul 2014 B2
8798278 Isabelle Aug 2014 B2
8798279 Ranta Aug 2014 B2
8798289 Every Aug 2014 B1
8804974 Melanson Aug 2014 B1
8848939 Keady et al. Sep 2014 B2
8851372 Zhou Oct 2014 B2
8855343 Usher Oct 2014 B2
8903113 Gebert Dec 2014 B2
8917880 Goldstein et al. Dec 2014 B2
8917892 Poe Dec 2014 B2
8917894 Goldstein Dec 2014 B2
8942370 Li Jan 2015 B2
8942405 Jones et al. Jan 2015 B2
8948428 Kates Feb 2015 B2
8983081 Bayley Mar 2015 B2
8992710 Keady Mar 2015 B2
9002023 Gauger Apr 2015 B2
9013351 Park Apr 2015 B2
9037458 Park et al. May 2015 B2
9053697 Park Jun 2015 B2
9076427 Alderson Jul 2015 B2
9112701 Sano Aug 2015 B2
9113240 Ramakrishnan Aug 2015 B2
9113267 Usher et al. Aug 2015 B2
9123323 Keady Sep 2015 B2
9123343 Kurki-Suonio Sep 2015 B2
9129291 Goldstein Sep 2015 B2
9135797 Couper et al. Sep 2015 B2
9135809 Chang Sep 2015 B2
9137597 Usher Sep 2015 B2
9138353 Keady Sep 2015 B2
9142207 Hendrix Sep 2015 B2
9165567 Visser Oct 2015 B2
9185481 Goldstein et al. Nov 2015 B2
9191732 Wurtz Nov 2015 B2
9191740 McIntosh Nov 2015 B2
9196247 Harada Nov 2015 B2
9216237 Keady Dec 2015 B2
9288592 Basseas Mar 2016 B2
9338568 van Hal May 2016 B2
9357288 Goldstein May 2016 B2
9369814 Victorian Jun 2016 B2
9445183 Dahl Sep 2016 B2
9462100 Usher Oct 2016 B2
9491542 Usher Nov 2016 B2
9497423 Moberly Nov 2016 B2
9539147 Keady et al. Jan 2017 B2
9628896 Ichimura Apr 2017 B2
9653869 Hersman May 2017 B1
9684778 Tharappel Jun 2017 B2
9685921 Smith Jun 2017 B2
9757069 Keady et al. Sep 2017 B2
9763003 Usher Sep 2017 B2
9779716 Gadonniex Oct 2017 B2
9781530 Usher et al. Oct 2017 B2
9843854 Keady Dec 2017 B2
9894452 Termeulen Feb 2018 B1
9943185 Chen Apr 2018 B2
10012529 Goldstein et al. Jul 2018 B2
10045107 Kirsch et al. Aug 2018 B2
10142332 Ravindran Nov 2018 B2
10190904 Goldstein et al. Jan 2019 B2
10284939 Radin May 2019 B2
10297246 Asada May 2019 B2
10413197 LeBoeuf Sep 2019 B2
10506320 Lott Dec 2019 B1
10529325 Goldstein Jan 2020 B2
10709339 Lusted Jul 2020 B1
10760948 Goldstein Sep 2020 B2
10917711 Higgins Feb 2021 B2
10970375 Shila Apr 2021 B2
10997978 Goldstein May 2021 B2
11006198 Lott May 2021 B2
11012770 Hatfield et al. May 2021 B2
11051704 Tran Jul 2021 B1
11115750 Monsarrant-Chanon Sep 2021 B2
11122357 Burnett Sep 2021 B2
11172298 Carrigan Nov 2021 B2
11277700 Goldstein Mar 2022 B2
11294619 Usher et al. Apr 2022 B2
11383158 Bonanno Jul 2022 B2
11443746 Goldstein Sep 2022 B2
11610587 Goldstein Mar 2023 B2
20010041559 Salabaschew Nov 2001 A1
20010046304 Rast Nov 2001 A1
20020003889 Fischer Jan 2002 A1
20020009203 Erten Jan 2002 A1
20020026311 Okitsu Feb 2002 A1
20020057817 Darbut May 2002 A1
20020069056 Nofsinger Jun 2002 A1
20020076057 Voix Jun 2002 A1
20020076059 Joynes Jun 2002 A1
20020085690 Davidson et al. Jul 2002 A1
20020098878 Mooney Jul 2002 A1
20020106091 Furst et al. Aug 2002 A1
20020111798 Huang Aug 2002 A1
20020118798 Langhart et al. Aug 2002 A1
20020123893 Woodward Sep 2002 A1
20020133513 Townsend et al. Sep 2002 A1
20020141599 Trajkovic Oct 2002 A1
20020141602 Nemirovski Oct 2002 A1
20020143534 Hol Oct 2002 A1
20020165719 Wang Nov 2002 A1
20020169596 Brill et al. Nov 2002 A1
20020169615 Kruger et al. Nov 2002 A1
20020191799 Nordqvist Dec 2002 A1
20020191952 Fiore Dec 2002 A1
20020193130 Yang Dec 2002 A1
20030008633 Bartosik Jan 2003 A1
20030026438 Ray Feb 2003 A1
20030033152 Cameron Feb 2003 A1
20030035551 Light Feb 2003 A1
20030048882 Smith Mar 2003 A1
20030050777 Walker Mar 2003 A1
20030055627 Balan Mar 2003 A1
20030061032 Gonopolskiy Mar 2003 A1
20030065512 Walker Apr 2003 A1
20030065620 Gailey et al. Apr 2003 A1
20030069002 Hunter Apr 2003 A1
20030083879 Cyr et al. May 2003 A1
20030083883 Cyr et al. May 2003 A1
20030110040 Holland et al. Jun 2003 A1
20030130016 Matsuura Jul 2003 A1
20030152359 Kim Aug 2003 A1
20030156725 Boone Aug 2003 A1
20030161097 Le et al. Aug 2003 A1
20030165246 Kvaloy et al. Sep 2003 A1
20030165319 Barber Sep 2003 A1
20030198357 Schneider Oct 2003 A1
20030198359 Killion Oct 2003 A1
20030200096 Asai Oct 2003 A1
20030228019 Eichler Dec 2003 A1
20030228023 Burnett Dec 2003 A1
20040008850 Gustavsson Jan 2004 A1
20040019482 Holub Jan 2004 A1
20040042103 Mayer Mar 2004 A1
20040047474 Vries Mar 2004 A1
20040047486 Van Doorn Mar 2004 A1
20040049385 Lovance et al. Mar 2004 A1
20040086138 Kuth May 2004 A1
20040088162 He et al. May 2004 A1
20040109579 Izuchi Jun 2004 A1
20040109668 Stuckman Jun 2004 A1
20040125965 Alberth, Jr. et al. Jul 2004 A1
20040128136 Irani Jul 2004 A1
20040133421 Burnett Jul 2004 A1
20040150717 Page Aug 2004 A1
20040160573 Jannard Aug 2004 A1
20040165742 Shennib Aug 2004 A1
20040185804 Kanamori Sep 2004 A1
20040190737 Kuhnel et al. Sep 2004 A1
20040196992 Ryan Oct 2004 A1
20040202333 Csermak Oct 2004 A1
20040202339 O'Brien Oct 2004 A1
20040202340 Armstrong Oct 2004 A1
20040203351 Shearer et al. Oct 2004 A1
20040252852 Taenzer Dec 2004 A1
20040258263 Saxton et al. Dec 2004 A1
20040264938 Felder Dec 2004 A1
20050008167 Gleissner Jan 2005 A1
20050028212 Laronne Feb 2005 A1
20050033384 Sacha Feb 2005 A1
20050033571 Huang Feb 2005 A1
20050047611 Mao Mar 2005 A1
20050049854 Reding Mar 2005 A1
20050058300 Suzuki Mar 2005 A1
20050058313 Victorian Mar 2005 A1
20050060142 Visser Mar 2005 A1
20050068171 Kelliher Mar 2005 A1
20050070337 Byford Mar 2005 A1
20050071158 Byford Mar 2005 A1
20050071626 Bear Mar 2005 A1
20050078838 Simon Apr 2005 A1
20050078842 Vonlanthen Apr 2005 A1
20050090295 Ali Apr 2005 A1
20050096764 Weiser May 2005 A1
20050096899 Padhi et al. May 2005 A1
20050102142 Soufflet May 2005 A1
20050114124 Liu May 2005 A1
20050123146 Voix et al. Jun 2005 A1
20050134710 Nomura Jun 2005 A1
20050163289 Caspi et al. Jul 2005 A1
20050175194 Anderson Aug 2005 A1
20050182620 Kabi Aug 2005 A1
20050207605 Dehe Sep 2005 A1
20050215907 Toda Sep 2005 A1
20050216531 Blandford Sep 2005 A1
20050222820 Chung Oct 2005 A1
20050227674 Kopra Oct 2005 A1
20050254640 Ohki Nov 2005 A1
20050254676 Rass Nov 2005 A1
20050258942 Manasseh Nov 2005 A1
20050260978 Rader Nov 2005 A1
20050264425 Sato Dec 2005 A1
20050281422 Armstrong Dec 2005 A1
20050281423 Armstrong Dec 2005 A1
20050283369 Clauser et al. Dec 2005 A1
20050288057 Lai et al. Dec 2005 A1
20060013410 Wurtz Jan 2006 A1
20060053007 Niemisto Mar 2006 A1
20060064037 Shalon et al. Mar 2006 A1
20060067551 Cartwright et al. Mar 2006 A1
20060074895 Belknap Apr 2006 A1
20060083387 Emoto Apr 2006 A1
20060083388 Rothschild Apr 2006 A1
20060083390 Kaderavek Apr 2006 A1
20060083395 Allen et al. Apr 2006 A1
20060088176 Werner Apr 2006 A1
20060092043 Lagassey May 2006 A1
20060095199 Lagassey May 2006 A1
20060116877 Pickering Jun 2006 A1
20060126821 Sahashi Jun 2006 A1
20060126865 Blamey Jun 2006 A1
20060140425 Berg Jun 2006 A1
20060147063 Chen Jul 2006 A1
20060153394 Beasley Jul 2006 A1
20060154642 Scannell, Jr. Jul 2006 A1
20060167687 Kates Jul 2006 A1
20060173563 Borovitski Aug 2006 A1
20060182287 Schulein Aug 2006 A1
20060184983 Casey Aug 2006 A1
20060188075 Peterson Aug 2006 A1
20060188105 Baskerville Aug 2006 A1
20060195322 Broussard et al. Aug 2006 A1
20060204014 Isenberg et al. Sep 2006 A1
20060233413 Nam Oct 2006 A1
20060241948 Abrash Oct 2006 A1
20060258325 Tsutaichi Nov 2006 A1
20060262935 Goose Nov 2006 A1
20060262938 Gauger Nov 2006 A1
20060262944 Rasmussen et al. Nov 2006 A1
20060264176 Hong Nov 2006 A1
20060274166 Lee Dec 2006 A1
20060285709 Barthel Dec 2006 A1
20060287014 Matsuura Dec 2006 A1
20070003090 Anderson Jan 2007 A1
20070009122 Hamacher Jan 2007 A1
20070009127 Klemenz et al. Jan 2007 A1
20070014423 Darbut et al. Jan 2007 A1
20070019817 Siltmann Jan 2007 A1
20070021958 Visser et al. Jan 2007 A1
20070036377 Stirnemann Feb 2007 A1
20070043563 Comerford et al. Feb 2007 A1
20070076896 Hosaka Apr 2007 A1
20070086600 Boesen Apr 2007 A1
20070092087 Bothra Apr 2007 A1
20070100637 McCune May 2007 A1
20070143820 Pawlowski Jun 2007 A1
20070147635 Dijkstra Jun 2007 A1
20070160243 Dijkstra Jul 2007 A1
20070185601 Lee Aug 2007 A1
20070189544 Rosenberg Aug 2007 A1
20070194893 Deyoe Aug 2007 A1
20070206825 Thomasson Sep 2007 A1
20070223717 Boersma Sep 2007 A1
20070225035 Gauger Sep 2007 A1
20070230734 Beard Oct 2007 A1
20070233487 Cohen Oct 2007 A1
20070239294 Brueckner Oct 2007 A1
20070253569 Bose Nov 2007 A1
20070255435 Cohen Nov 2007 A1
20070260460 Hyatt Nov 2007 A1
20070274531 Camp Nov 2007 A1
20070291953 Ngia et al. Dec 2007 A1
20080037801 Alves et al. Feb 2008 A1
20080063228 Mejia Mar 2008 A1
20080069369 Dyer Mar 2008 A1
20080091421 Gustavsson Apr 2008 A1
20080101638 Ziller May 2008 A1
20080107282 Asada May 2008 A1
20080123866 Rule May 2008 A1
20080130908 Cohen Jun 2008 A1
20080137873 Goldstein Jun 2008 A1
20080145032 Lindroos Jun 2008 A1
20080152167 Taenzer Jun 2008 A1
20080152169 Asada Jun 2008 A1
20080159547 Schuler Jul 2008 A1
20080162133 Couper Jul 2008 A1
20080165988 Terlizzi et al. Jul 2008 A1
20080175411 Greve Jul 2008 A1
20080181419 Goldstein Jul 2008 A1
20080201138 Visser Aug 2008 A1
20080221880 Cerra et al. Sep 2008 A1
20080240458 Goldstein Oct 2008 A1
20080257047 Pelecanos Oct 2008 A1
20080260180 Goldstein Oct 2008 A1
20080269926 Xiang Oct 2008 A1
20090010456 Goldstein et al. Jan 2009 A1
20090016501 May Jan 2009 A1
20090016541 Goldstein Jan 2009 A1
20090024234 Archibald Jan 2009 A1
20090034748 Sibbald Feb 2009 A1
20090046867 Clemow Feb 2009 A1
20090067661 Keady Mar 2009 A1
20090071487 Keady Mar 2009 A1
20090076821 Brenner Mar 2009 A1
20090085873 Betts Apr 2009 A1
20090087003 Zurek May 2009 A1
20090122996 Klein May 2009 A1
20090286515 Othmer May 2009 A1
20090175474 Salvetti Jul 2009 A1
20090180631 Michael Jul 2009 A1
20090192688 Padmanabhan Jul 2009 A1
20090227888 Salmi Sep 2009 A1
20090238386 Usher Sep 2009 A1
20090274314 Arndt Nov 2009 A1
20100061564 Clemow et al. Mar 2010 A1
20100119077 Platz May 2010 A1
20100150367 Mizuno Jun 2010 A1
20100166203 Peissig Jul 2010 A1
20100223223 Sandler Sep 2010 A1
20100241256 Goldstein et al. Sep 2010 A1
20100296668 Lee et al. Nov 2010 A1
20100316033 Atwal Dec 2010 A1
20100328224 Kerr et al. Dec 2010 A1
20110055256 Phillips Mar 2011 A1
20110079227 Turncot et al. Apr 2011 A1
20110096939 Ichimura Apr 2011 A1
20110116643 Tiscareno May 2011 A1
20110125063 Shalon May 2011 A1
20110135120 Larsen Jun 2011 A1
20110187640 Jacobsen et al. Aug 2011 A1
20110264447 Visser et al. Oct 2011 A1
20110288860 Schevciw Nov 2011 A1
20110293103 Park et al. Dec 2011 A1
20120076317 Fratti Mar 2012 A1
20120170412 Calhoun Jul 2012 A1
20130098706 Keady Apr 2013 A1
20130136285 Naumann May 2013 A1
20130149192 Keady Jun 2013 A1
20130219345 Saukko Aug 2013 A1
20130251172 Mosseri Sep 2013 A1
20140003644 Keady et al. Jan 2014 A1
20140010378 Voix Jan 2014 A1
20140023203 Rotschild Jan 2014 A1
20140026665 Keady Jan 2014 A1
20140089672 Luna Mar 2014 A1
20140122092 Goldstein May 2014 A1
20140148101 Seshadri May 2014 A1
20140163976 Park Jun 2014 A1
20140205123 Lafort et al. Jul 2014 A1
20140373854 Keady Dec 2014 A1
20150150728 Duvall Jun 2015 A1
20150215701 Usher Jul 2015 A1
20150358730 Kirsch Dec 2015 A1
20160012714 Patenaude Jan 2016 A1
20160015568 Keady Jan 2016 A1
20160050483 Inc. Feb 2016 A1
20160058378 Wisby et al. Mar 2016 A1
20160104452 Guan et al. Apr 2016 A1
20160127818 Ambrose May 2016 A1
20160192077 Keady Jun 2016 A1
20160277854 Puria Sep 2016 A1
20160295311 Keady et al. Oct 2016 A1
20170134865 Goldstein et al. May 2017 A1
20170142511 Dennis May 2017 A1
20170223451 Kirsch Aug 2017 A1
20180054668 Keady Feb 2018 A1
20180132048 Usher et al. May 2018 A1
20180220239 Keady et al. Aug 2018 A1
20180233125 Mitchell Aug 2018 A1
20190038224 Zhang Feb 2019 A1
20190082272 Goldstein et al. Mar 2019 A9
20190387305 Keady Dec 2019 A1
20210014597 Andersen Jan 2021 A1
20210152924 Keady May 2021 A1
20220061767 Goldstein et al. Mar 2022 A1
Foreign Referenced Citations (113)
Number Date Country
2006200446 Feb 2006 AU
2215764 Nov 1996 CA
4312155 Oct 1994 DE
102012221233 Mar 2014 DE
102013203334 May 2014 DE
0643881 Dec 1998 EP
0935236 Aug 1999 EP
1415505 Dec 2002 EP
1033063 May 2003 EP
1320281 Jun 2003 EP
0692169 Jul 2003 EP
1483591 Nov 2003 EP
1385324 Jan 2004 EP
1401240 Mar 2004 EP
1570244 Jun 2004 EP
1489596 Dec 2004 EP
1519625 Mar 2005 EP
1594344 Sep 2005 EP
1638079 Mar 2006 EP
1640972 Mar 2006 EP
1674061 Jun 2006 EP
1681903 Jul 2006 EP
1800950 Jun 2007 EP
1841283 Oct 2007 EP
2749043 Jul 2014 EP
2991381 Apr 2019 EP
3068142 Sep 2019 EP
2560520 Sep 1985 FR
1518299 Jul 1978 GB
2082820 Aug 1980 GB
2441835 Aug 2008 GB
5145623 Jun 1993 JP
H05199590 Aug 1993 JP
H05336599 Dec 1993 JP
H0877468 Mar 1996 JP
H10162283 Jun 1998 JP
H10294989 Nov 1998 JP
297362 Sep 1999 JP
12878298 Nov 1999 JP
H11331990 Nov 1999 JP
3085237 Jul 2000 JP
2001045585 Feb 2001 JP
2001054184 Feb 2001 JP
3353701 Dec 2002 JP
2003304599 Oct 2003 JP
3556987 May 2004 JP
2005064744 Mar 2005 JP
2005130205 May 2005 JP
2005168888 Jun 2005 JP
2005227511 Aug 2005 JP
2005260944 Sep 2005 JP
2005295175 Oct 2005 JP
2006107044 Apr 2006 JP
2004289762 Feb 2007 JP
2009003040 Jan 2009 JP
2017147677 Aug 2017 JP
20020086433 Nov 2002 KR
100366231 Dec 2002 KR
20030013732 Feb 2003 KR
20030058432 Jul 2003 KR
20030068021 Aug 2003 KR
20030069471 Aug 2003 KR
101154948 Jul 2006 KR
100607492 Aug 2006 KR
100783099 Dec 2007 KR
101194923 Oct 2012 KR
200615862 May 2006 TW
WO1986000133 Jan 1986 WO
WO1993026085 Dec 1993 WO
WO1997025790 Jul 1997 WO
WO1998054878 Dec 1998 WO
WO1999043185 Aug 1999 WO
WO2001001731 Jan 2001 WO
WO2001057852 Aug 2001 WO
WO2002013522 Feb 2002 WO
WO2002017836 Mar 2002 WO
WO2002093891 Nov 2002 WO
WO2002101720 Dec 2002 WO
WO2003023766 Mar 2003 WO
WO2003073790 Sep 2003 WO
WO2004016037 Feb 2004 WO
WO2006026812 Mar 2004 WO
WO2007028250 Mar 2004 WO
WO2004114722 Dec 2004 WO
WO2005029468 Mar 2005 WO
WO2005073875 Aug 2005 WO
WO2005107320 Nov 2005 WO
WO2006034029 Mar 2006 WO
WO2006037156 Apr 2006 WO
WO2006054205 May 2006 WO
WO2006054698 May 2006 WO
WO2006074082 Jul 2006 WO
WO2006114101 Nov 2006 WO
WO2007007916 Jan 2007 WO
WO2007017809 Feb 2007 WO
WO2007017810 Feb 2007 WO
WO2007073818 Jul 2007 WO
WO2007082579 Jul 2007 WO
WO2007092660 Aug 2007 WO
WO2007147077 Dec 2007 WO
WO2008017326 Feb 2008 WO
WO2008050583 May 2008 WO
WO2008096125 Aug 2008 WO
WO2009023633 Feb 2009 WO
WO2009023784 Feb 2009 WO
WO2009097009 Aug 2009 WO
WO2011110901 Sep 2011 WO
WO2011161487 Dec 2011 WO
WO2012097150 Jul 2012 WO
Non-Patent Literature Citations (565)
Entry
Olwal, A. and Feiner, S. Interaction Techniques Using Prosodic Features of Speech and Audio Localization. Proceedings of IUI 2005 (International Conference on Intelligent User Interfaces), San Diego, CA, Jan. 9-12, 2005, pp. 284-286.
Bernard Widrow, John R. Glover Jr., John M. McCool, John Kaunitz, Charles S. Williams, Robert H. Hearn, James R. Zeidler, Eugene Dong Jr., and Robert C. Goodlin, Adaptive Noise Cancelling: Principles and Applications, Proceedings of the IEEE, vol. 63, No. 12, Dec. 1975.
Mauro Dentino, John M. McCool, and Bernard Widrow, Adaptive Filtering in the Frequency Domain, Proceedings of the IEEE, vol. 66, No. 12, Dec. 1978.
Samsung Electronics Co., Ltd., Samsung Electronics, America, Inc., and Harman International Industries, Inc. v. Staton Techiya, LLC, IPR2024-00559, Feb. 9, 2024.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00282, Dec. 21, 2021.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00242, Dec. 23, 2021.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00243, Dec. 23, 2021.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00234, Dec. 21, 2021.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00253, Jan. 18, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00324, Jan. 13, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00281, Jan. 18, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00302, Jan. 13, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00369, Feb. 18, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00388, Feb. 18, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00410, Feb. 18, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-01078, Jun. 9, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-01099, Jun. 9, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-01106, Jun. 9, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-01098, Jun. 9, 2022.
Samsung Electronics Co., Ltd., And Samsung Electronics, America, Inc., v. Staton Techiya, LLC, IPR2022-00559, Feb. 9, 2024.
U.S. Appl. No. 90/015,146, Samsung Electronics Co., Ltd. and Samsung Electronics, America, Inc., Request For Ex Parte Reexamination Of U.S. Pat. No. 10,979,836.
U.S. Appl. No. 90/019,169, Samsung Electronics Co., Ltd. and Samsung Electronics, America, Inc., Request For Ex Parte Reexamination Of U.S. Pat. No. 11,244,666.
Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 1A-1C for Patent No. 8,111,839 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for US Patent Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 2A-2C for Patent No. 8,254,591 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 3A-3C for Patent No. 8,315,400 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 4A-4C for Patent No. 9,124,982 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 5A-5C for Patent No. 9,270,244 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 6A-6C for Patent No. 9,491,542 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 7A-7C for Patent No. 9,609,424 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 8A-8C for Patent No. 10,405,082 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 9A-9C for Patent No. 8,111,839 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 10A-10C for Patent No. 10,979,836 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 11A-11C for Patent No. 11,039,259 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 12A-12C for Patent No. 11,057,701 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 13A-13C for Patent No. 11,217,237 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Appendix 14A-14C for Patent No. 11,244,666 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A1 (Nacre QuietPro) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A2 (Silynx QuietOps) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A3 (Motorola H5) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A4 (Jawbone Aliph) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A5 (Snooper) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A6 (NCH Swift) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A11 (NaturalRecorder) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for US Patent Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A19 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A20 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A21 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A22 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A23 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A24 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A25 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A26 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A27 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A28 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A29 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A30 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A31 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A32 (Olympus WS-320M) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A33 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A34 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. A35 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. B19 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. C15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D19 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D20 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D21 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D22 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D23 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D24 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D25 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D26 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D27 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D28 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D29 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D30 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D31 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. D32 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E9 (corrected) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. E18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F4 (corrected) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F5 (corrected) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F9 (corrected) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. F18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G19 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G20 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G21 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G22 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G23 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G24 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G25 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G26 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G27 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G28 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G29 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G30 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G31 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G32 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G33 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G34 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. G35 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H19 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. H20 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. I18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J1 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J2 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J3 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J4 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J5 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J6 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J7 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J8 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J9 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J10 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J11 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J12 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J13 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J14 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J15 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J16 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J17 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J18 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J19 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J20 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J21 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J22 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J23 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J24 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J25 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J26 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J27 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J28 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J29 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Ex. J30 to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 8,111,839, 8,254,591, 8,315,400, 9,124,982, 9,270,244, 9,491,542, 9,609,424, 10,405,082, 10,966,015 (Case No. 2:22-CV-00053-JRG-RSP), served May 18, 2022.
Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K1 (Calhoun) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K2 (Cerra) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K3 (Chen '353) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K4 (Comerford) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K5 (Couper) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K6 (Emoto) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K7 (Zaykovskiy) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K8 (Hunter) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K9 (Jones) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K10 (Kelliher) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K11 (Kopra) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K12 (Lagassey '043) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K13 (Lemelson) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K14 (Pickering) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K15 (Schuler) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K16 (Soufflet) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K17 (White) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K18 (BlueAnt V1) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K19 (LG Chocolate) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K20 (Midomi) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K21 (Promptu) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K22 (Samsung SCH-a950) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K23 (W850) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K24 (Ears) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K25 (Motorola Pebl) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K26 (Silynx QuietOps) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K27 (Nacre QuietPro) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K28 (Shazam) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K29 (Vlingo) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit K30 (Yoon) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L1 (Alves 801) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L2 (Burnett 421) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L3 (Hietanen) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L4 (Huang 798) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L5 (Jaber) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L6 (LG HBM-730) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L7 (Nokia BH-600) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L8 (Nokia BH-900) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L9 (Pedersen) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L10 (QuietOps) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L11 (QuietPro) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L12 (Visser '958) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L13 (Zhang 099) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L14 (Byford) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L15 (Mejia '156) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit L16 (Yang '130) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M1 (Armstrong) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M2 (Boersma) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M3 (Dijkstra 972) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M4 (Hamacher 031) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M5 (Hietanen) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M6 (Hotvet) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M7 (Kondo 701) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M8 (Kvaløy) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M9 (Light) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M10 (Melanson) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M11 (Nemirovski 368) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M12 (Platz 077) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M13 (Rasmussen 245) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M14 (Svean 359) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M15 (Victorian 625) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M16 (Zurek 379) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M17 (Jawbone) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M18 (QuietOps) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M19 (Nacre QuietPro) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M20 (SenSay) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M21 (Andrea) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M22 (Darbut) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit M23 (Ramakrishnan) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N1 (Platz 077) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N2 (Kvaløy) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N3 (Inanaga) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N4 (Rosenberg) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N5 (Visser 958) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N6 (Terlizzi) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N7 (Light) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N8 (Boersma) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N9 (McCune) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N10 (Bose) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N11 (Emoto) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N12 (Dijkstra 243) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N13 (Cohen 908) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N14 (Rast) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N15 (Bothra 629) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N16 (Victorian 625) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N17 (Engle) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N18 (Svean 359) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N19 (Hotvet) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N20 (Killion 056) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N21 (Bothra 087) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N22 (Melanson) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N23 (Andrea) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N24 (Hohman) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N25 (Bergeron) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N26 (Frank) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N27 (Darbut 423) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N28 (QuietPro) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N29 (QuietOps) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N30 (Jawbone) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N31 (EarSet 2) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N32 (Etymotic ER-6) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N33 (Zen) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N34 (Motorola H605) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N35 (Peltor Lite-Com II) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N36 (Discovery 655) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N37 (MX200 Series) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N38 (Sony S700) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N39 (H5 Miniblue) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N40 (3D Active Ambient IEM) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N41 (Armstrong 422) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N42 (Hohn) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N43 (Mejia 228) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N44 (Nemirovski 368) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N45 (Thomasson) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N46 (Zurek 003) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N47 (Kurcan) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N48 (Rafaely) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N49 (Vaidyanathan) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N50 (Westerlund) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
Exhibit N51 (Zhang) to Samsung's Invalidity Contentions and P.R. 3-3 And 3-4 Disclosures for U.S. Pat. Nos. 11,039,259, 11,057,701, 11,217,237, and 11,244,666 (Case No. 2:22-CV-00053-JRG-RSP), served Jul. 6, 2022.
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), Sep. 10, 2015 WayBack Machine capture of 3M's website depicts a brochure describing the E-A-RFit and “Individual Fit Testing Using F-Mire.” https://web.archive.org/web/20150910084252/http:/multimedia.3m.com/mws/media/670/earfit-dual-ear-brochure-us.pdf?fn-EARfit%20Dual- Ear%20Brochure%20US.pdf (SAM-TECH_000523336—SAM-TECH_00052336; SAM-TECH_00052339—SAM-TECH_00052339).
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), 2010 brochure from 3M's website describes the E-A-RFit and identifies model 393-1000 as an available model. https://multimedia.3m.com/mws/media/62914 9O/3m-e-a-rfit-validation-system-brochure.pdf (SAM-TECH_00052186).
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), Abstract titled “New from ISEA member 3M Company (www.3m.com) is the E-A-Rfit Validation System a quantitative hearing protector fit test”, published in Jul. 2012. New from ISEA member 3M Company (www.3m.com) is the E-A-Rfit Validation System a quantitative hearing protector fit test, EHS Today, vol. 5, Issue 7, ISSN 1945-9599, Gale Group Trade & Industry Database (Jul. 2012), available at https://dialog.proquest.com/professional/docview/1095272736?accountid=154502 (SAM-TECH_00052203).
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), Apr. 24, 2007 article published by E.H. Berger from Aearo Technologies discusses E-A-RFit and notes that “[t]he E-A-RFitTM Validation System is a quick and accurate method of estimating real-ear attenuation for a given fitting of a pair of earplugs” and “has been designed and built to be an integral part of a comprehensive workplace hearing conservation program.” See E.H. Berger, Recommended Applications for the E-A-RFitTM Validation System in a Workplace Hearing Conservation Program, Aearo Company (2007) (SAM-TECH_00056087—SAM-TECH_0005609).
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), at least by Feb. 9, 2007; a 2010 brochure for the E-A-RFit describes the validation system and lists Model 393-1000 as an available product. https://multimedia.3m.com/mws/media/67382 80/earfit-brochure.pdf (SAM-TECH_00052179—SAM-TECH_00052184).
Methods Of Developing And Validating A Field-MIRE Approach For Measuring Hearing Protector Attenuation, Berger, Elliott & Voix, Jérémie & Kieper, R., Feb. 9, 2007, in connection with 3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”); This article was originally prepared for the 32nd Annual Conference of the National Hearing Conservation Association, held on Feb. 15-17, 2007, in Savannah, Georgia, and published in Spectrum, vol. 24, Suppl. 1.
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), Mar. 16, 2016 WayBack Machine capture of 3M's website lists the E-A-RFit for purchase. https://web.archive.org/web/20160316180537/ http://www.3m.com/3M/en_US/company-us/all-3m-products/˜/All-3M- Products/Personal-Protective-Equipment/Hearing-Protection/Safety/Worker-Health-Safety/E-A-R-Fit-Validation-Tools/?N=5002385+8709322+8711017+8711405+8720539+8720546+8720770+329 4857497&rt=r3 (SAM-TECH_00052201).
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), Mar. 20, 2016 WayBack Machine capture of 3M's website describes the Validation System and protection that the system offers. https://web.archive.org/web/20160320080156/ http:/www.3m.com/3M/en_US/company-us/all-3m-products/˜/All-3M- Products/Personal-Protective-Equipment/Hearing-Protection/Safety/Worker-Health-Safety/?N=5002385+8709322+8711017+871 1405+8720539+8720546+3294857497&rt=r3 (SAM-TECH_00052278; SAM-TECH_00052292).
3M/Aearo Technologies' E-A-RFitTM Validation System (“E-A-RFit”), Sep. 4, 2015 WayBack Machine capture of 3M's website contains an image of the E-A-RFit and states “[t]he 3MTM E-A-RfitTM Dual Ear Validation System makes it easy to measure every employee's unique level of protection and takes the guesswork out of managing compliance in your hearing conservation program.” https://web.archive.org/web/20150904132810/ http:/solutions.3m.com/wps/portal/3M/en_US/3M-PPE-Safety-Solutions/Personal- Protective-Equipment/safety-management/safety-training/hearing- protection-fit- testing/?WT.mc_id=www.3m.com/EARfitDe mo/ (SAM-TECH_00052276; SAM-TECH_00052274).
A binaural processor for missing data speech recognition in the presence of noise and small-room reverberation, Kalle Palomäki, Guy Brown & Deliang Wang, Speech Communication, 43, 361-378.
A compact multi-sensor headset for hands-free communication, Liu, Zicheng & Seltzer, Michael & Acero, A. & Tashev, Ivan & Zhang, Zhengyou & Sinclair, Michael, IEEE Workshop on Applications of Signal Processing to Audio and Acoustics, 138-141. 10.1109/ASPAA.2005.1540188.
A Dual-Mode Human-Machine Interface for Robotic Control Based on Acoustic Sensitivity of the Aural Cavity, Ravi Vaidyanathan, et al., Feb. 2006.
A Local Active Noise Control System for Locomotive Drivers, InterNoise 2000, the 29th International Congress and Exhibition on Noise Control Engineering, Nielsen, Saebo, Ottesen, Reinen, Sorsdal, Aug. 2000.
A MFCC-based CELP speech coder for server-based speech recognition in network environments, Yoon, Jae Sam, Gil Ho Lee, and Hong Kook Kim, IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences 90.3, 626-632, Mar. 2007.
A Modified Coherence Based Method for Dual Microphone Speech Enhancement, M. Rahmani, et al., Signal Processing and Communications, 2007.
A New Two-Sensor Active Noise Cancellation Algorithm, K.C. Zangi, 1993 IEEE International Conference on Acoustics, Speech, and Signal Processing, Minneapolis, MN, USA, 1993, pp. 351-354 vol. 2, doi: 10.
A Pattern Recognition Approach to Voiced-Unvoiced-Silence Classification with Applications to Speech Recognition, B. Atal and L. Rabiner, IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. 24, No. 3, pp. 201-212, Jun. 1976.
Active Noise Attenuation Using LQG/LTR Control, Garcia, José & Bortoloto, Edson & Ribeiro, Jean & Garcia, Eletrônica de Potência, 9, 23-27, doi: 10.18618/REP.2005.2.023027, Nov. 2004.
Active Noise Cancellation for Headphones Used in High Noise Environments Using Conventional Analog Circuitry, Mark C. Flohr, May 1, 1987.
Active Noise Control System for Headphone Applications, Sen M. Kuo, et al., 2006.
Active Noise Control: Low-Frequency Techniques for Suppressing Acoustic Noise Leap Forward with Signal Processing, S.J. Elliott and P.A. Nelson, Oct. 1993.
Active Noise Reduction Headphone Measurement: Comparison Of Physical And Psychophysical Protocols And Effects Of Microphone Placement, PERALA, Apr. 10, 2006.
Active noise Reduction in an ear terminal, OTTESEN, The Journal of the Acoustical Society of America, vol. 105, Issue 2, Feb. 1999.
Adaptive Feedback Active Noise Control Headset: Implementation, Evaluation, and its Extensions, Woon S. Gan, et al., 2005.
Adaptive Filtering in the Frequency Domain, M. Dentino, J. McCool & B. Widrow, Proceedings of the IEEE, vol. 66, No. 12, pp. 1658-1659, Dec. 1978.
Adaptive Noise Cancellation in a Multimicrophone System for Distortion Product Otoacoustic Emission Acquisition, Rafael E. Delgado, et al., 2000.
Adaptive Noise Cancelling In Headsets, Per Rubak, Henrik D. Green & Lars G. Johansen, Proceedings of IEEE Nordic Signal Processing Symposium, NORSIG'96, Sep. 24-27, 1996, Espoo, Finland.
Adaptive noise cancelling: Principles and applications, B. Widrow, et al., Proceedings of the IEEE, vol. 63, No. 12, pp. 1692-1716, Dec. 1975.
Air- and Bone-Conductive Integrated Microphones for Robust Speech Detection and Enhancement, Yanli Zheng, et al., 2003 IEEE Workshop on Automatic Speech Recognition and Understanding.
An Integrated Audio And Active Noise Control Headsets, W. S. Gan & S. M. Kuo, IEEE Transactions on Consumer Electronics, vol. 48, No. 2, pp. 242-247, May 2002.
Apple's AirPods Pro (“AirPods Pro”), Article on Apple's website published Apr. 15, 2020 mentions the AirPods Pro has an Ear Tip Fit test available. https://support.apple.com/en-us/HT210633 (SAM-TECH_00072120-SAM-TECH_00072123).
Apple's AirPods Pro (“AirPods Pro”), article published by Dan Seifert on Mar. 29, 2019 reviews the AirPods Pro and states Apple is selling the second-gen AirPods in two ways: with the new wireless charging case for $199 or with the standard case for the same $159 as before. https://www.theverge.com/2019/3/29/182860 12/apple- airpods-2-new-2nd-gen-review- price-specs-features; (SAM-TECH_00057262-SAM-TECH_00057275).
Apple's AirPods Pro (“AirPods Pro”), Dec. 21, 2019 WayBack Machine capture of Apple's website displays an image of the AirPods Pro and states “[u]se the Ear Tip Fit Test to create the optimal listening experience - you'll get the right tip size for your ears, and the best seal for noise cancellation.” https://web.archive.org/web/20191221170719/ https://www.apple.com/airpods-pro/ (SAM-TECH_00054447; SAM-TECH_00054134).
Apple's AirPods Pro (“AirPods Pro”), Nov. 2, 2019 article by Karisa Bell published on mashable.com discusses whether Apple's AirPods Pro are compatible with Androids. https://mashable.com/article/do-airpods-pro-work-with-android. (SAM-TECH_00052378-SAM-TECH_00052390).
Apple's AirPods Pro (“AirPods Pro”), Nov. 3, 2019 article published by Imran Hussain discusses how to use the ear tip fit test with the AirPods Pro and an iOS device such as the iPhone 11 for the best fit. https://www.esquire.com/lifestyle/a29612084/apple-airpods-pro-active-noise- cancellation-review/ (SAM-TECH_00052413-SAM-TECH_00052424).
Apple's AirPods Pro (“AirPods Pro”), Oct. 20, 2019 article by Tim Hardwick discussing how to perform an ear tip fit test using Apple's AirPods Pro with Apple's iPhone 11. https://www.macrumors.com/how-to/perform-ear-tip-fit-test-airpods-pro/ (SAM-TECH_00052357—SAM-TECH_00052370).
Apple's AirPods Pro (“AirPods Pro”), Oct. 29, 2019 article by Sarah Rense also discusses testing out the AirPods Pro with active-noise cancellation. https://www.esquire.com/lifestyle/a29612084/apple-airpods-pro-active-noise- cancellation-review/ (SAM-TECH_00058067-SAM-TECH_00058080).
Apple's AirPods Pro (“AirPods Pro”), The specs of the AirPods Pro can be found here https://web.archive.org/web/20191224065355/ https://www.apple.com/airpods-pro/specs/ (SAM-TECH_00052343—SAM-TECH_00052352; SAM-TECH_00053159).
Apple's iPhone 11 (iPhone 11), Oct. 11, 2019 article published by Jake Peterson discusses the ear tip fit test using AirPods Pro and an iPhone running iOS 13.2. https://ios.gadgethacks.com/how-to/make-your-airpods-pro-fit-better-by-testing-rubber-tips-0210500/ (SAM-TECH_00056564—SAM-TECH_00056569); Oct. 20, 2019 article by Tim Hardwick discussing how to perform an ear tip fit test using Apple's AirPods Pro with Apple's iPhone 11. https://www.macrumors.com/how- to/perform-ear-tip-fit-test-airpods-pro/ (SAM-TECH_00052357—SAM-TECH_00052370).
Apple's iPhone 11 (iPhone 11), Press release from Apple's website dated Sep. 10, 2019 states “Apple introduces dual camera iPhone 11” and that “Customers in the US, Puerto Rico, the US Virgin Islands and more than 30 other countries and regions will be able to pre-order iPhone 11 beginning at 5 a.m. PDT on Friday, Sep. 13 with availability beginning Friday, Sep. 20.” https://www.apple.com/newsroom/2019/09/a pple-introduces-dual-camera-iphone-11/ (SAM-TECH_00056571—SAM-TECH_00056588).
Apple's iPhone 11 (iPhone 11), Sep. 15, 2019 WayBack Machine capture of Apple's website has an image of the iPhone 11 and lists it for sale on the website. https://web.archive.org/web/20190915061032 /https://www.apple.com/shop/buy-iphone/iphone-11 (SAM-TECH_00055106—SAM-TECH_00055123).
Apple's iPhone 11 (iPhone 11), WayBack Machine capture from Sep. 16, 2019 of Apple's website, displays the iPhone and states “Available 9.20.” https://web.archive.org/web/20190916102733/ https://www.apple.com/iphone-11/specs/. (SAM-TECH_00056907).
Audiometric Ear Canal Probe with Active Ambient Noise Control, B. Rafaely & M. Furst, IEEE Transactions on Speech and Audio Processing, vol. 4, No. 3, pp. 224-230, May 1996.
Bang and Olufsen EarSet 2 Bluetooth Headset, At least by 2006, https://www.beoworld.org/prod_details.asp?pid=733 (SAM-TECH_00094798).
Bang and Olufsen EarSet 2 Bluetooth Headset, At least by 2006, https://www.dexigner.com/news/9935 (SAM-TECH_00094865).
Brian Hobbs et al., Wideband Hearing, Intelligibility, and Sound Protection, Jan. 10, 2008 Final Report AFRL-RH-WP-TR-2009-0031 at 2 (SAM-TECH_00053002-116).
Build These Noise-Cancelling Headphones, Jules Ryckebusch, 1997.
Combined feedback-feedforward active noise-reducing headset—The effect of the acoustics on broadband performance, Boaz Rafaely & Matthew Jones, J. Acoust. Soc. Am. Sep. 1, 2002; 112 (3): 981-989.
Dec. 25, 2005 WayBack Machine capture of Maico's website has an image of the Maico MI26 and discusses the product's features. https://web.archive.org/web/20051225200404/http:/www.maico-diagnostics.com/eprise/main/Maico/Products/ Files/MI26/SpecSheet.MI24-26.NEW.pdf (SAM-TECH_00051161—SAM-TECH_00051162).
Direct filtering for air- and bone-conductive microphones, Zicheng Liu, Zhengyou Zhang, A. Acero, J. Droppo and Xuedong Huang, IEEE 6th Workshop on Multimedia Signal Processing, 2004, Siena, Italy, 2004, pp. 363-366.
DSP Software Development Techniques for Embedded and Real-Time Systems, Robert Oshana, 2006.
E-3 In-Flight Acoustic Exposure Studies and Mitigation Via Active Noise Reduction Headset, Frank Mobley, John Allen Hall, & Donald Yeager, Dec. 2002.
Efficient Tracking of the Cross-Correlation Coefficient, AARTS, IEEE Transactions on Speech and Audio Processing, vol. 10, No. 6, Sep. 2002.
Etymotic ER-6 Earphones, At least by Feb. 7, 2005, https://www.cnet.com/reviews/etymoti c-er-6-review/ (SAM-TECH_00095121).
Etymotic ER-6 Earphones, At least by Feb. 7, 2005, https://www.etymotic.com/ephp/er6i-ts.aspx (SAM-TECH_00095178).
Etymotic's ER-33 Occlusion Effect Meter (“ER-33”), Apr. 9, 2001 WayBack Machine capture of Etymotic's website contains an image of the ER-33 and states that “[t]he ER-33 Occlusion Effect Meter quickly quantifies the occlusion effect and earmold leakage” and was on sale for $350.00. https://web.archive.org/web/20010404224259/ https://www.etymotic.com/ (SAM-TECH_00054976).
Etymotic's ER-33 Occlusion Effect Meter (“ER-33”), Aug. 2003 article by H. Gustav Mueller in the Hearing Journal, Mueller describes the ER-33 as a product manufactured by Etymotic that “costs no more than a few bottles of good wine.” See H. Gustav Mueller, There's less talking in barrels, but the occlusion effect is still with us, 56 Hearing J. 10, 14 (2003) (SAM-TECH_00054761—SAM-TECH_00054764).
Etymotic's ER-33 Occlusion Effect Meter (“ER-33”), Dec. 5, 2004 article submitted by Wayne J. Staab to The Hearing Review, discusses the ER-33 and notes “[t]he occlusion effect was measured with the ER-33 Occlusion Effect meter (Figure 5) using a probe tube extending 2 mm beyond the receiver tip. The ER-33 is a hand-held device that measures both the magnitude of the occlusion effect and the leakage around an earmold.” https://hearingreview.com/practice-building/practice-management/measuring- the-occlusion-effect-in-a-deep-fitting-hearing-device (SAM-TECH_00060339—SAM-TECH_00060350).
Etymotic's ER-33 Occlusion Effect Meter (“ER-33”), Mar. 3, 2005 capture of Etymotic's website contains a description of the ER-33 which includes a sale price for $350.00. https://web.archive.org/web/20050303170952/ http://www.etymotic.com/pro/er33.asp (SAM-TECH_00054986);.
Etymotic's ER-33 Occlusion Effect Meter (“ER-33”), Mar. 4, 2005 WayBack Machine capture of Etymotic's website contains a user manual for the ER-33 which was on sale at that time. https://web.archive.org/web/20050304030715/ http://www.etymotic.com/pdf/er33-oem-usermanual.pdf (SAM-TECH_00055001; SAM-TECH_00060165).
Excerpts from Discrete-Time Signal Processing, Third Edition, Alan V. Oppenheim & Ronald W. Schafer, Aug. 18, 2009.
Experimentation To Address Appropriate Test Techniques For Measuring The Attenuation Provided By Double ANR Hearing Protectors, Susan E. Mercy, Christopher Tubb and Soo H. James, New Directions for Improving Audio Effectiveness (pp. 18-1-18-14). Meeting Proceedings RTO-MP-HFM-123, Paper 18. Neuilly-sur-seine, France: RTO.
Fit-Testing of Hearing Protection, Witt, The Hearing Review.
Gennum Zen Digital Wireless Headset (“Zen”), At least by 2004, CNET Article—Gennum Zen Bluetooth Headset Review (SAM-TECH_00098419).
Gennum Zen Digital Wireless Headset (“Zen”), At least by 2004, Gennum Zen User Manual (SAM-TECH_00098432).
Gennum Zen Digital Wireless Headset (“Zen”) At least by 2004, Globe and Mail Article—Gennum Z-E-N Headset for Bluetooth (SAM-TECH_00098485).
Huseyin Dogan, Trym Holter, & Ingrid Svagard, Trial of a special end user terminal that aids field operators during emergency rescue operations, Proceedings of the 3rd International Iscram China Workshop, Harbin, China, at 4 (Aug. 2008) discusses the PARAT as well (SAM-TECH_00051920-SAM-TECH_00051931).
In-Ear Microphone Speech Data Recognition using HMMs, R. S. Kurcan, M. P. Fargues and R. Vaidyanathan, 2006 IEEE 12th Digital Signal Processing Workshop & 4th IEEE Signal Processing Education Workshop, Teton National Park, WY, USA, 2006.
In-Ear Microphone Speech Data Segmentation and Recognition using Neural Networks, G. Bulbuller, Monique Fargues & Ravi Vaidyanathan, IEEE 12th Digital Signal Processing Workshop and 4th IEEE Signal Processing Education Workshop, 2006.
In-Ear Microphone Techniques For Severe Noise Situations, N. Westerlund, M. Dahl, I. Claesson, Nov. 2005.
Interaction Techniques Using Prosodic Features of Speech and Audio Localization, Alex Olwal & Steven Feiner, Jan. 5, 2011.
Isolated Word Recognition from In-Ear Microphone Data Using Hidden Markov Models (HMM), Remzi Serdar Kurcan, Mar. 2006.
Jawbone Aliph, At least by Sep. 9, 2004, https://www.capecodtimes.com/story/news/2006/12/24/new-earphones-let-you-go/50845129007 (SAM-TECH_00062054).
Jawbone Aliph, At least by Sep. 9, 2004, https://www.cnet.com/reviews/aliph-jawbone-bluetooth-headset-review/ (SAM-TECH_00060121).
Jawbone Aliph, At least by Sep. 9, 2004, https://www.wired.com/2004/09/military-headset-reaches-masses (SAM-TECH_00062036).
Jawbone Aliph, At least by Sep. 9, 2004, Jawbone User Manual (SAM-TECH_00061992).
Learning-Based Three Dimensional Sound Localization Using a Compact Non- Coplanar Array of Microphones, Kamen Y. Guentchev and John J. Weng, AAAI Technical Report SS-98-02, 1998.
Maico MI26 Tymp/audiometer combo (“Maico MI26”), Aug. 12, 2004 WayBack Machine Capture of Maico's website has an image of the Maico MI26 and lists the Maico MI26 as a product available for purchase. https://web.archive.org/web/20040422090329/http://www.maico-diagnostics.com:80/eprise/main/Maico/US_en/ProductCategories/LST01_Tympanometers (SAM-TECH_00060329-SAM-TECH_00060331).
Mar. 17, 2006 WayBack Machine capture of Maico's website has a user manual available for the Maico MI26. https://web.archive.org/web/20060317092410/http://www.maico-diagnostics.com/eprise/main/Maico/Products/Files/MI26/1162-0322REVD.pdf (SAM-TECH_00051168-SAM-TECH_00051215).
Mar. 17, 2006 WayBack Machine Capture of Maico's website discusses frequently asked questions about the Maico MI26 and its features. https://web.archive.org/web/20060317092109/http://www.maico-diagnostics.com/eprise/main/Maico/Products/Files/MI24/FAQ.MI24-26.pdf (SAM-TECH_00051250-SAM-TECH_00051251).
Methods of measuring the attenuation of hearing protection devices, E H Berger, The Journal of the Acoustical Society of America vol. 79,6 (1986).
Microphone Array for Headset with Spatial Noise Suppressor, Ivan Tashev, Michael Seltzer & Alex Acero, 2005.
Microphone Array Processing for Robust Speech Recognition, Michael L. Seltzer, Jul. 2003.
Motorola H5 Miniblue Bluetooth Headset, Jan. 14, 2005, https://newatlas.com/ces-2006-bluetooth-innovations-abound-inner-ear-headset-bluetooth-keyboard-and-wireless-ipod-companion/4977/ (SAM-TECH_00060368) (Motorola H5 Miniblue Bluetooth Headset).
Motorola H5 Miniblue Bluetooth Headset, Jan. 14, 2005, https://www.cnet.com/tech/mobile/motorola-h5-miniblue-bluetooth-headset/ (SAM-TECH_00060424) (Motorola H5 Miniblue Bluetooth Headset).
Motorola H5 Miniblue Bluetooth Headset, Jan. 14, 2005, https://www.engadget.com/2006-01-04-motorolas-h5-miniblue-bluetooth-headset.html (SAM-TECH_00060628).
Motorola H5 Miniblue Headset (“Miniblue”), Jan. 2006, Motorola H9 Bluetooth Headset User Manual (SAM-TECH_00060509-14).
Motorola H605, At least by 2006, CNET Article—Motorola H605 Bluetooth Headset Review (SAM-TECH_00098639).
Motorola H605, At least by 2006, Motorola H605 User Manual (SAM-TECH_00098719).
Motorola H605, At least by 2006, PhoneArena Article - Motorola H605 Review (SAM-TECH_00098743).
Motorola Miniblue Press Release (https://web.archive.org/web/20060212115000/http://www.motorola.com/motoinfo/product/details/0,133,00.html) (SAM-TECH_00056060);.
Motorola's Astro XTS 5000 Digital Portable Radio (“Motorola XTS 5000”), At least by Jun. 2002, Motorola's Detailed Service Manual has a release date in 2003. See Detailed Service Manual for Astro XTS 5000 VHF/UHF Range 1/Range 2/700-800 MHZ, Digital Portable Radios (2003) (SAM-TECH_00051382-SAM-TECH_00051711);.
Motorola's Astro XTS 5000 Digital Portable Radio (“Motorola XTS 5000”), Jun. 14, 2002, WayBack Machine capture of Motorola Inc.'s website contains an image of the Motorola XTS 5000 and states that “[t]he top of the line XTS 5000 portable radio is ready and equipped to meet the needs of demanding environments” and that it is “Motorola's newest maximum performance two-way radio.” https://web.archive.org/web/20020614082842/http://www.motorola.com:80/cgiss/portables/xts5000.shtml (SAM-TECH_00051718).
Motorola's XTS 2500 Digital Portable Radio (“Motorola XTS 2500”), Motorola XTS 2500's Basic Service Manual dated 2002-2003, see XTS 2500 XTS 1500 MT 1500 700-800 MHz Digital Portable Radios, Basic Service Manual at 70 (SAM_00051287-SAM-TECH_00051374).
Motorola's XTS 2500 Digital Portable Radio (“Motorola XTS 2500”), Nov. 9, 2001, WayBack Machine capture of Motorola's website contains an image of the XTS 2500 and states that “[t]he XTS 2500 portable radio is Motorola's high-performance, small-sized, digital two-way radio.” https://web.archive.org/web/20020804062125/http://www.motorola.com:80/cgiss/portables/xts2500.shtml (SAM-TECH_00051258).
Multi-Microphone Correlation-Based Processing for Robust Automatic Speech Recognition, Thomas M. Sullivan, Department of Electrical and Computer Engineering Carnegie Mellon University.
Multi-Microphone Signal Acquisition for Speech Recognition Systems, Kevin Fink, EE 586—Speech Recognition Systems, Dec. 16, 1993.
Multi-sensory microphones for robust speech detection, enhancement and recognition, Zhengyou Zhang, Zicheng Liu, M. Sinclair, A. Acero, L. Deng, J. Droppo, Xuedong Huang, Yanli Zheng, 2004 IEEE International Conference on Acoustics, Speech, and Signal Processing 3 (2004).
NACRE QuietPro, In a Mar. 7, 2013 presentation by Blake Martin of Honeywell Safety Products to the Alberta Industrial Fire Protection Association, Mr. Martin identifies “2005” as the “First commercial success for Quietpro.” (SAM-TECH_00054652).
NACRE QuietPro, In Aug. 2006, Nacre won U.S. Government Contract No. W912DQ-06-D-0037 to supply the NACRE QuietPro to the U.S. military. U.S. Government Contract No. W912DQ-06-D-0037 (SAM-TECH_00055735).
NACRE QuietPro, In proceedings before the U.S. Trademark Trial and Appeal Board, Nacre stated that it “has used in commerce with the United States, long since prior to Apr. 28, 2006, the registered trademark QUIETPRO on one or more of headphones, earphones . ... ” Nacre AS v. Silynx Communications, Inc., Sep. 4, 2007 Notice of Opposition. (SAM-TECH_00054696).
NACRE QuietPro, Mar. 9, 2005, Honeywell Quietpro QP100ex Mar. 2013 presentation (SAM-TECH_00063985);.
NACRE QuietPro, Mar. 9, 2005, IEEE Explore Article (SAMTECH_00063687).
NACRE QuietPro, Mar. 9, 2005, NACRE QuietPro User Manual v2.0 (SAMTECH_00055181).
NACRE QuietPro, Mar. 9, 2005, New Scientist Article (SAMTECH_00064068).
NACRE QuietPro, Mar. 9, 2005, SoldierMod Article (SAM-TECH_00065729).
NACRE QuietPro, Mar. 9, 2005, Article posted at: https://www.tu.no/artikler/quietproverner-og-forsterker-horselen/261960 (SAM-TECH_00097600).
NACRE QuietPro, Mar. 9, 2005, WayBack Machine capture of Nacre's website contains an image of the NACRE QuietPro and states that “Nacre has secured MNOK 27,5 from a consortium led by Ferd Venture” and that “[m]ost of the money will be spent to boost efforts within sales and marketing of QUIETPRO in the global military market.”.
Nacre's PARAT earplug (“Parat”), 1999 article published by one of the PARAT's designers Georg E. Ottensen, discusses the PARAT system and states, “[a]n active ear terminal is beeing designed at SEVTEF Telecom and informatics. The acronym of the consept is PARAT - Personal Active Radio/Audio Terminal.” Georg E. Ottesen, Active noise reduction in an ear terminal, The Journal of the Acoustical Society of America 105, 1300 (1999); https://doi.org/10.1121/1.424828, SINTEF Telecom and Informatics, N-7465 (SAM-TECH_00051952—SAM-TECH_00051955);.
Nacre's PARAT earplug (“Parat”), Jan. 2004 publication by Fredrik Vraalsen et al., describes how “[p]articular attention has been given to voice interaction in noisy industrial scenarios, utilising the PARAT earplug.” Fredrik Vraalsen, Trym Holter, Ingrid Storruste Svagard, and Oyvind Kvennas, A Multimodal Context Aware Mobile Maintenance Terminal For Noisy Environments, Sintef Ict, N-7465 Trondheim, Norway, 79, 79 (Jan. 2004) (SAM-TECH_00051938—SAM-TECH_00051951);.
Noise attenuation and proper insertion of earplugs into ear canals, Markku Toivonen, Rauno Pääkkönen, Seppo Savolainen, Kyösti Lehtomäki, The Annals of occupational hygiene, vol. 46,6 (2002): 527-530.
Oct. 29, 2019 article on BusinessToday.in states that the AirPods Pro require Apple devices running iOS 13.2 or later, iPadOS 13.2 or later, watchOS 6.1 or later, tvOS 13.2 or later, or macOS Catalina 10.15.1 or later. https://www.businesstoday.in/technology/launch/story/apple-airpods-pro-with-noise-cancellation-launched-check-out-price-in-india-features-235269-2019-10-29 (SAM-TECH_00061346-SAM-TECH_00061349);.
Oct. 31, 2019 article published by Charlie Sorrel discusses the Ear tip fit test for the AirPods Pro in the iPhone settings. https://www.cultofmac.com/662548/airpods-pro-ear-tip-fit-test/; (SAM-TECH_00056870—SAM-TECH_00056881);
Olympus WS-320M, At least by Nov. 25, 2005 (Olympus WS-320M) https://web.archive.org/web/20051125000137mp_/http://www.olympusamerica.com/cpg_section/cpg_vr_digitalmusic.asp (SAM-TECH_00051760).
Olympus WS-320M, At least by Nov. 25, 2005 (Olympus WS-320M) https://web.archive.org/web/20060314095402/http://www.olympusamerica.com/cpg_section/product.asp?product=1195&fl=2 (SAM-TECH_00051767; SAM-TECH_00051753).
Olympus WS-320M, At least by Nov. 25, 2005 Olympus WS-320M Instruction Manual (SAM-TECH_00051833).
Optimal Feedback Control Formulation of the Active Noise Cancellation Problem: Pointwise and Distributed, Kambiz C. Zangi, Rle Technical Report No. 583, Research Laboratory of Electronics Massachusetts Institute of Technology, May 1994.
Peltor Lite-Com II, At least by 1999, Peltor Lite-Com II Manual (SAM-TECH_00099254).
Peltor Lite-Com II, At least by 1999, Peltor Lite-Com II Brochure (SAM-TECH_00099203).
Performance of dual microphone in-the-ear hearing aids, Michael Valente, Gerald Schuchmant, Lisa G. Potts & Lucille B. Beck, Journal of the American Academy of Audiology, 2000.
Plantronics Discovery 655, At least by 2006, CNET Article—Plantronics Discovery 655 Bluetooth Headset Review (SAM-TECH_00099287).
Plantronics Discovery 655, At least by 2006, Plantronics Discovery 655 Brochure (SAM-TECH_00099296).
Plantronics Discovery 655, At least by 2006, Plantronics Discovery 655 User Guide (SAM-TECH_00099344).
Plantronics Discovery 655, At least by 2006, Silicon Pop Culture Article—Plantronics Discovery 655 (SAM-TECH_00099387).
Plantronics MX200, At least by 2006, Plantronics MX200 Brochure (SAM-TECH_00099419).
Plantronics MX200, At least by 2006, Plantronics MX200 User Guide (SAM-TECH_00099435).
Plantronics MX200, At least by 2006, Plantronics MX250 User Guide (SAM-TECH_00099461).
PocketLint Article—Zen Gennum Bluetooth Headset (SAM-TECH_00098490).
Preferred methods for measuring hearing protector attenuation, Elliott Berger, International Congress on Noise Control Engineering 2005, INTERNOISE 2005.
Products of Interest, Project Muse, Computer Music Journal, vol. 30, No. 3, Fall 2006.
Reducing the Negative Effects of Ear-Canal Occlusion, Samuel S. Job, Department of Electrical and Computer Engineering Brigham Young University, 2002.
Research in Motion's BlackBerry 7520 (“BlackBerry”), At least by 2004, Blackberry 7520 Wireless Handheld Model No. RAL11IN, Version 4.1 User Guide, last modified Mar. 6, 2006 (SAM-TECH_00054461—SAM-TECH_00054618);.
Research in Motion's BlackBerry 7520 (“BlackBerry”), Jun. 28, 2006 WayBack Machine capture of the BlackBerry lists it for sale and describes the Blackberry as a “strong addition to the product line-up.” https://web.archive.org/web/20060628035351/http://www.blackberry-7520.com (SAM-TECH_00054619; SAM-TECH_00054624; SAM-TECH_00054622);.
Research in Motion's BlackBerry 7520 (“BlackBerry”), At least by 2004, BlackBerry Wireless Handheld Getting Started Guide (SAM-TECH_00228841).
Research in Motion's BlackBerry 7520 (“BlackBerry”), Nextel Services Guide for the Blackberry is dated the year 2004; (SAM-TECH_00226708).
SeboTek Hearing Systems' PAC (Post Auricular Canal) Instrument (“Sebotek”), Mar. 19, 2003 WayBack Machine capture of SeboTek's website contains a description of the PAC, which notes that “[t]he PAC is an exciting new hearing system by SeboTek that is significantly different from traditional hearing aids. It offers deep canal fitting, superior acoustics, incredible discreetness, and unmatched comfort.” /http://www.sebotek.com:80/ (SAM-TECH_00052377);.
SeboTek Hearing Systems' PAC (Post Auricular Canal) Instrument (“Sebotek”), May 26, 2007 WayBack Machine capture of SeboTek's website contains a description of the PAC, and notes that “Prior to 2003, depending on the level of hearing loss, consumers could choose between four primary styles, none of which offered superior sound quality, comfort or cosmetic appeal. All that changed in 2003, when SeboTek introduced the PAC Voice-QTM hearing instrument, the first-ever speaker-in-the-canal device.” https://web.archive.org/web/20070526135524/http://www.sebotek.com:80/OurProducts/ourProducts.html (SAM-TECH_00052392);.
SeboTek Hearing Systems' PAC (Post Auricular Canal) Instrument (“Sebotek”), Oct. 6, 2003 post by Bruce Gefvert, Director of Sales and Marketing at SeboTek Hearing Systems, on audiologyonline.com discusses the PAC, and states “PAC refers to Post Auricular Canal, an entirely new style of hearing aid that is intended to provide hearing professionals with one more option for treating hearing loss in the mild to severe ranges.” https://www.audiologyonline.com/ask-the-experts/sebotek-pac-post-auricular-canal-601 (SAM_00052353—SAM-TECH_00052356);.
SeboTek Hearing Systems' PAC (Post Auricular Canal) Instrument (“Sebotek”), Publication by King Chung in 2004 mentions that “SeboTek VoiceQ and Vivatone have recently launched newly designed behind-the-ear or postauricular canal (PAC, as SeboTek preferred) hearing aids that have receivers situated in the ear canal.” See King Chung, Challenges and recent developments in hearing aids. Part II. Feedback and occlusion effect reduction strategies, laser shell manufacturing processes, and other signal processing technologies, 8 Trends Amplif. 125, 150 (2004), available at https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4111464/pdf/10.1177_108471380400800402.pdf (SAM-TECH_00062067—SAM-TECH_00062106).
Sensaphonics 3D Active Ambient In-Ear Monitor System, At least by 2006, Products of Interest Article (SAM-TECH_00096723).
Sensaphonics 3D Active Ambient In-Ear Monitor System, At least by 2006, Sensaphonics 3D Active Ambient In- Ear Monitor System User Guide (SAM-TECH_00100046).
Sensaphonics 3D Active Ambient In-Ear Monitor System, At least by 2006, Sensaphonics 3D Active Ambient IEM System Article (SAM-TECH_00100065).
Silynx QuietOps, Oct. 4, 2007 Applicant's Answer to Opposer's Notice of Opposition (SAM-TECH_00052371).
Silynx QuietOps, https://defense-update.com/20080513_c4ops.html (SAM-TECH_00057150).
Silynx QuietOps, QuietOps Pocket Guide (Rev. 2.00) (Silynx QuietOps).
Small-footprint keyword spotting using deep neural networks, G. Chen, C. Parada and G. Heigold, 2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Florence, Italy, 2014, pp. 4087-4091.
Sonar-operator active noise reduction insert-earphone: Prototype preliminary test and evaluation, Joseph S. Russotti, Naval Submarine Medical Research Laboratory, Report No. 1225.
Sonomax's Sonomax: SonoCustom and SonoPass (“Sonomax”), Apr. 8, 2006 WayBack Machine capture of Sonomax's website contains an image of the Sonomax and states that “[t]ens of Thousands of people around the world give the SonoCustom a big thumbs up for comfort.” https://web.archive.org/web/20060408170243/http://sonomax.com.au/index.cfm/aboutus/sonomax_solution/ (SAM-TECH_00052472; SAM-TECH_00052998);.
Sonomax's Sonomax: SonoCustom and SonoPass (“Sonomax”), Jun. 15, 2006 WayBack Machine capture of Sonomax's website contains frequently asked questions about the Sonomax and states that “SonoPass, our proprietary Windows-based software, drives the fitting process and provides immediate proof of functionality.” https://web.archive.org/web/20060615054356/http://www.sonomax.com.au:80/index.cfm/faq/ (SAM-TECH_00052643).
Sonomax's Sonomax: SonoCustom and SonoPass (“Sonomax”), Apr. 8, 2006 WayBack Machine capture of Sonomax's website contains an image of the SonoCustom and describes it as a “cost effective, comfortable and resusable earpiece.” https://web.archive.org/web/20060408165744/http://sonomax.com.au:80/index.cfm/fittingprocess/ (SAM-TECH_00052436);.
Sonomax's Sonomax: SonoCustom and SonoPass (“Sonomax”), Apr. 8, 2006 WayBack Machine capture of Sonomax's website contains an image of the Sonomax and states that “[t]he Sonomax is a hearing protection system that combines a uniquely designed earpiece, the SonoCustom, with an optimised hardware and software application, called SonoPass.” https://web.archive.org/web/20060408170221/http://sonomax.com.au:80/index.cfm/testingprocess/ (SAM-TECH_00052425);.
Sonomax's Sonomax: SonoCustom and SonoPass (“Sonomax”), Jun. 15, 2006 WayBack Machine capture of Sonomax's website contains an image of the Sonomax and states that “application provides employers the unique ability to quantify and track hearing protection performance and produce detailed reports.” https://web.archive.org/web/20060615054658/http://www.sonomax.com.au/index.cfm/testingprocess/ (SAM-TECH_00052589);.
Sony S700 Walkman, At least by Oct. 13, 2006, EAFIT Article—The Sony Walkman (SAM-TECH_00099514).
Sony S700 Walkman, At least by Oct. 13, 2006, IDG Article—Sony's New Walkman Players Pack Noise Canceling (SAM-TECH_00099533).
Sony S700 Walkman, At least by Oct. 13, 2006, Sony Walkman User Manual (SAM-TECH_00099557).
Sony S700 Walkman, At least by Oct. 13, 2006, Stuff Article—Sony NW-S700 Review (SAM-TECH_00099579).
Sound Source Localization and Separation, Biniyam Tesfaye Taddese, Mathematics, Statistics, and Computer Science Honors Projects (2006).
Speaker Turn Segmentation Based on Between-Channel Differences, Daniel P.W. Ellis & Jerry C. Liu, LabROSA, Dept. of Electrical Engineering, Columbia University.
Spectral analysis of speech by linear prediction, J. Makhoul, IEEE Transactions on Audio and Electroacoustics, vol. 21, No. 3, pp. 140-148, Jun. 1973.
Speech Input Hardware Investigation for Future Dismounted Soldier Computer Systems, Jeffrey C. Bos & David W. Tack, DRDC Toronto CR 2005-064, May 1, 2005.
Speech Modeling with Magnitude-Normalized Complex Spectra and Its Application to Multisensory Speech Enhancement, A. Subramanya, Z. Zhang, Z. Liu and A. Acero, 2006 IEEE International Conference on Multimedia and Expo, Toronto, ON, Canada, 2006, pp. 1157-1160.
Speech Recognition in Severely Disturbed Environments Combining Ear-Mic and Active Noise Control, N. Westerlund, M. Dahl, I. Claesson, Published 2002, Engineering, Computer Science.
Survey of the Speech Recognition Techniques for Mobile Devices, Dmitry Zaykovskiy, Department of Information Technology, SPECOM'2006, St. Petersburg, Jun. 2006.
Techniques and applications for wearable augmented reality audio, Härmä, Aki & Turku, Julia & Tikander, Miikka & Karjalainen, M & Lokki, Tapio & Nironen, H & Vesa, Sampo (2003).
The Effect of Hearing Aid Microphone Location on the Intelligibility of Hearing Aid—Transduced Speech, John Robert Franks, Dec. 1975.
Using Audio-Based Signal Processing to Passively Monitor Road Traffic, Orla Duffner, Centre for Digital Video Processing and School of Electronic Engineering Dublin City University, Jul. 2006.
Verifying the attenuation of earplugs in situ: Method validation using artificial head and numerical simulations, Annelies Bockstael, Bram De Greve, Timothy Van Renterghem, Dick Botteldooren, Wendy D'haenens, Hannah Keppler, Leen Maes, Birgit Philips, Freya Swinnen, Bart Vinck, The Journal of the Acoustical Society of America; 124 (2): 973-981, Aug. 1, 2008.
Related Publications (1)
Number Date Country
20230197080 A1 Jun 2023 US
Provisional Applications (1)
Number Date Country
61098914 Sep 2008 US
Continuations (5)
Number Date Country
Parent 17736180 May 2022 US
Child 18106606 US
Parent 17203731 Mar 2021 US
Child 17736180 US
Parent 16671689 Nov 2019 US
Child 17203731 US
Parent 14846994 Sep 2015 US
Child 16671689 US
Parent 12560097 Sep 2009 US
Child 14846994 US