This disclosure pertains to systems, methods, and media for determination of movement direction.
Determining user movement direction may be useful. However, it may be difficult, for example, due to artifacts in measured sensor data caused by the user's movement, confounding sensor data caused by the user looking around while moving, and the like.
Throughout this disclosure, including in the claims, the terms “speaker,” “loudspeaker” and “audio reproduction transducer” are used synonymously to denote any sound-emitting transducer (or set of transducers). A typical set of headphones includes two speakers. A speaker may be implemented to include multiple transducers (e.g., a woofer and a tweeter), which may be driven by a single, common speaker feed or multiple speaker feeds. In some examples, the speaker feed(s) may undergo different processing in different circuitry branches coupled to the different transducers.
Throughout this disclosure, including in the claims, the expression performing an operation “on” a signal or data (e.g., filtering, scaling, transforming, or applying gain to, the signal or data) is used in a broad sense to denote performing the operation directly on the signal or data, or on a processed version of the signal or data (e.g., on a version of the signal that has undergone preliminary filtering or pre-processing prior to performance of the operation thereon).
Throughout this disclosure including in the claims, the expression “system” is used in a broad sense to denote a device, system, or subsystem. For example, a subsystem that implements a decoder may be referred to as a decoder system, and a system including such a subsystem (e.g., a system that generates X output signals in response to multiple inputs, in which the subsystem generates M of the inputs and the other X−M inputs are received from an external source) may also be referred to as a decoder system.
Throughout this disclosure including in the claims, the term “processor” is used in a broad sense to denote a system or device programmable or otherwise configurable (e.g., with software or firmware) to perform operations on data (e.g., audio, or video or other image data). Examples of processors include a field-programmable gate array (or other configurable integrated circuit or chip set), a digital signal processor programmed and/or otherwise configured to perform pipelined processing on audio or other sound data, a programmable general purpose processor or computer, and a programmable microprocessor chip or chip set.
Methods, systems, and media for determining user movement direction are provided. In some embodiments, the method may involve obtaining, using a control system, user acceleration data associated with a user. The method may involve determining, using the control system, a movement period associated with a movement activity of the user using the user acceleration data, wherein the movement period indicates a duration between two sequential movements by the user. The method may involve determining, using the control system, a movement direction corresponding to the movement activity using the user acceleration data based on a direction of acceleration orthogonal to the movement direction in which at least a portion of the user acceleration data is anti-periodic over a period of time corresponding to the movement period.
In some examples, the at least the portion of the user acceleration data is anti-periodic over the period of time corresponding to the movement period such that a cross-correlation of the at least the portion of the user acceleration data with a version of the at least the portion of the user acceleration data delayed by the movement period is negative.
In some examples, the at least the portion of the user acceleration data that is anti-periodic includes a local maximum and a local minimum occurring within a time interval corresponding to the movement period.
In some examples, the movement activity comprises walking or running, and the movement direction comprises a direction in which the user is walking or running.
In some examples, the method further involves transforming the user acceleration data from a user-centered coordinate frame to a fixed coordinate frame, to produce transformed user acceleration data, wherein the movement period is determined using the transformed user acceleration data. In some examples, transforming the user acceleration data comprises using user head orientation data obtained from one or more gyroscopes. In some examples, the head orientation data is used to identify the movement direction.
In some examples, determining the movement period comprises using data associated with a vertical component of the user acceleration data. In some examples, determining the movement period further comprises: providing the data associated with the vertical component of the user acceleration data to a plurality of narrowband filters, each narrowband filter of the plurality of narrowband filters associated with a different frequency; and generating a prediction of a movement frequency based on outputs of the plurality of narrowband filters, wherein the prediction of the movement frequency is used to determine the movement period. In some examples, using the prediction of the movement frequency to determine the movement period comprises identifying a period of an output of a selected narrowband filter of the plurality of narrowband filters, wherein the selected narrowband filter of the plurality of narrowband filters generates a largest output signal.
In some examples, the at least a portion of the user acceleration data that is anti-periodic over the period of time corresponding to the movement period comprises one or more horizontal components of the user acceleration data. In some examples, identifying the direction of acceleration orthogonal to the movement direction comprises determining a cross-covariance vector corresponding to the data associated with the one or more horizontal components of the user acceleration data with a version of the data associated with the one or more horizontal components of the user acceleration data delayed by the movement period. In some examples, the cross-covariance vector is determined by: determining, for each of a plurality of sample times, an outer-product vector from two or more horizontal components of the user acceleration data and two or more horizontal components of the user acceleration data delayed by the movement period; and combining two or more of the outer-product vectors from two or more of the plurality of sample times to determine the cross-covariance vector. In some examples, the two or more of the outer-product vectors are combined using a weighting, wherein outer-product vectors associated with more recent sample times are weighted more heavily than outer-product vectors associated with less recent sample times. In some examples, the weighting comprises an exponential decay function. In some examples, the method further involves applying a bandpass filter to the one or more horizontal components of the user acceleration data prior to identifying the direction of acceleration orthogonal to the movement direction.
In some examples, the movement direction is determined by: determining an intermediate angle corresponding to twice the angle associated with the movement direction; and determining the angle associated with the movement direction based on the intermediate angle. In some examples, determining the angle associated with the movement direction based on the intermediate angle comprises: halving the intermediate angle to determine a candidate movement direction; and selecting the angle associated with the movement direction as the candidate movement direction or 180 degrees from the candidate movement direction based on head orientation data. In some examples, determining the angle associated with the movement direction based on the intermediate angle comprises: determining a difference between the intermediate angle and an angle associated with an orientation of a head of the user; and determining the angle associated with the movement direction based on the difference.
In some examples, the user acceleration data is obtained from one or more accelerometers disposed in or on a user device configured to be carried by or worn by the user. In some examples, the user device comprises headphones. In some examples, the method further involves rendering audio content to be presented by the headphones based on the identified movement direction. In some examples, the audio content is rendered to have a spatial perception of being centered with the identified movement direction.
Some or all of the operations, functions and/or methods described herein may be performed by one or more devices according to instructions (e.g., software) stored on one or more non-transitory media. Such non-transitory media may include memory devices such as those described herein, including but not limited to random access memory (RAM) devices, read-only memory (ROM) devices, etc. Accordingly, some innovative aspects of the subject matter described in this disclosure can be implemented via one or more non-transitory media having software stored thereon.
At least some aspects of the present disclosure may be implemented via an apparatus. For example, one or more devices may be capable of performing, at least in part, the methods disclosed herein. In some implementations, an apparatus is, or includes, an audio processing system having an interface system and a control system. The control system may include one or more general purpose single- or multi-chip processors, digital signal processors (DSPs), application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs) or other programmable logic devices, discrete gates or transistor logic, discrete hardware components, or combinations thereof.
Details of one or more implementations of the subject matter described in this specification are set forth in the accompanying drawings and the description below. Other features, aspects, and advantages will become apparent from the description, the drawings, and the claims. Note that the relative dimensions of the following figures may not be drawn to scale.
Like reference numbers and designations in the various drawings indicate like elements.
It may be useful to determine a movement direction of a user engaged in a movement activity, such as walking or running. For example, the movement direction may be useful for presenting content in a manner that is dependent on the movement direction. This content may include visual content (e.g., in an instance in which the user is viewing content presented by a virtual reality (VR) or augmented reality (AR) headset, or the like), audio content, etc. As a more particular example, certain audio content (e.g., music, a podcast, a radio show, a movie soundtrack, etc.) may involve a spatial component such that the audio content is to be rendered (e.g., by headphones, speakers, etc.) in a manner in which components of the audio content are perceived to be at different spatial locations with respect to the listener. In an instance in which a user is in motion, it may be difficult to render the audio content in a manner that is faithful to the creator's intent without knowledge of the movement direction of the user. By way of example, in an instance in which a component of the audio content (e.g., a speaker, the main component of music, etc.) is to be positioned in a manner that is centered with respect to the user, it may be difficult to correctly center the audio component while the user is in motion (e.g., walking, running, etc.) without knowledge of the movement direction. Accordingly, determining movement direction of a user in motion may be useful.
However, it can be difficult to accurately determine movement direction. For example, some conventional techniques may make use of motion sensors, which may include accelerometers, gyroscopes, and/or magnetometers, on a device that is worn by the user, e.g., on a head of the user. By way of example, such motion sensors may be disposed in or on headphones, earbuds, glasses, AR/VR headsets, etc. that are worn on a head of the user. In such instances, if a user is in motion (e.g., walking, running, or the like) in a particular movement direction, while looking around in one or more directions other than the movement direction, estimates of the movement direction may be rendered inaccurate due to motion sensor information that is collected with respect to the user's head, which is moving in directions other than the movement direction. In the case in which audio content is to be rendered with respect to the center of the user (e.g., the user's torso), this may cause artifacts in the spatial rendering, for example, as the user looks around while walking.
Disclosed herein are systems, methods, and media for determining movement direction. The techniques described herein allow the movement direction to be determined irrespective of head orientation, such that the movement direction of a user in motion may be accurately determined, even in instances in which the head of the user is directed in a direction other than the movement direction. The techniques may be implemented regardless of where the motion sensors (e.g., accelerometers, gyroscopes, and/or magnetometers) are disposed, e.g., in or on a head-worn device, in or on a device carried by the user in their pocket, etc.
In some implementations, the movement direction may be determined by obtaining user acceleration data of a user (e.g., a user engaging in a movement activity). A movement period may be determined based on the user acceleration data, where the movement period indicates a time duration between two sequential movements of the user. By way of example, in an instance in which the movement activity being performed by the user is walking or running, the movement period may correspond to the time duration between two sequential steps the user takes while walking or running. The movement direction may then be determined based on a direction of acceleration that is orthogonal to the movement direction, where the orthogonal direction corresponds to a direction of acceleration in which at least a portion of the user acceleration data is anti-periodic over the movement period. It should be noted that, as used herein, a “direction” generally refers to a vector associated with angles corresponding to axes of a given coordinate frame.
By way of example, in an instance in which the movement activity is walking or running, a vertical component of acceleration (which may correspond to, e.g., a head or other body part of the user moving up and down) may be substantially periodic with respect to the movement period. However, there may be a horizontal component of acceleration, which may substantially correspond to lateral motion of the user's head to the left and right during the movement activity, that is substantially anti-periodic with respect to the movement period. In other words, while the vertical component of acceleration may undergo a full cycle over the movement period, the horizontal component of acceleration may undergo only a half cycle (e.g., corresponding to head movement to the left or right) over the movement period. It should be noted that, as used herein, the term “anti-periodic” may be understood to mean that a cross-correlation of a signal with a signal delayed by a determined movement period is negative. In other words, the movement direction may be considered to be either the direction with the maximum amount of periodic acceleration, or, put another way, the direction orthogonal to the direction with the maximum amount of anti-periodic acceleration. Additionally or alternatively, “anti-periodic” may be understood to mean that both a local maximum and a local minimum (e.g., corresponding to head movement to the left and right, respectively) occur within a time interval corresponding to the movement period.
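To make the cross-correlation definition concrete: for an acceleration component x(k) and a movement period of P samples, the anti-periodicity condition may be sketched (an illustrative formulation; the summation window is left unspecified) as

$$\sum_{k} x(k)\,x(k-P) < 0,$$

that is, the cross-correlation of the component with a copy of itself delayed by the movement period is negative, whereas a periodic component would make the corresponding sum positive.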
It should be noted that, although the movement activity is generally described herein with examples relating to walking or running, any suitable movement activity that includes periodic body motion may be considered. Other examples of movement activity include skipping, jogging, cycling, climbing, skating, cross-country skiing, or the like. Additionally, it should be noted that although the techniques described herein generally describe vertical components of acceleration as being periodic with respect to a given movement period, and horizontal components of acceleration as being anti-periodic with respect to the movement period, in some cases, this may be reversed. For example, in some instances, the horizontal components of acceleration may be periodic with respect to a movement period depending on the movement activity being considered.
Additionally, it should be noted that, in some implementations, subsequent to determining a movement direction, content may be rendered based on the movement direction. For example, audio content may be rendered such that at least a portion of the audio content is perceived as being positioned substantially in alignment with an estimated orientation of the user's torso while the user is moving in the movement direction. In some embodiments, the rendered audio content may be played back via, e.g., headphones, speakers, etc.
In the example shown in the accompanying drawings, a subject walks in walking direction 102 while looking in a direction that differs from the walking direction.
The angle α is the angle 130 between axis 121 and axis 122 and represents the difference between the direction (αlook) in which the subject is looking and the direction (αmove) in which the subject is walking. Direction 123 represents the direction orthogonal to walking direction 102.
Angle α may vary over time, even as the subject's movement direction remains constant, for example, due to the subject looking around while walking in a near-constant direction. In some implementations, the techniques described herein may effectively determine α, and, accordingly, a movement direction that is unaffected by the looking direction of the subject. Because acceleration data may be collected using one or more accelerometers disposed in or on a head-mounted device (e.g., a heads-up display, headphones, etc.), the acceleration data may typically include artifacts associated with the subject's head movement. Accordingly, the techniques described herein may allow for a movement direction to be identified irrespective of looking direction.
In some implementations, a movement direction may be identified based on user acceleration data. For example, the user acceleration data may include multiple components, such as a vertical component (e.g., corresponding to movement in the up and down direction), a forward component (e.g., corresponding to movement in the front-back direction), and/or a lateral component (e.g., corresponding to movement in the left-right direction). It should be noted that, in some cases, acceleration data with respect to a first component may be periodic with respect to a movement period that defines a time duration between two sequential movements (e.g., a step period that defines a time duration between two sequential steps). For example, the acceleration data associated with the vertical component (e.g., the up and down direction) may be periodic with respect to the movement period as a user's head moves up and down while walking or running. In some cases, acceleration data with respect to a second component may be anti-periodic with respect to the movement period. For example, the acceleration data associated with the lateral component (e.g., corresponding to movement in the left-right direction) may be anti-periodic with respect to the movement period.
In some implementations, a movement direction associated with a user's movement activity may be determined using user acceleration data. The acceleration data may be obtained using one or more accelerometers, e.g., disposed in or on a device carried by or worn by the user. For example, the one or more accelerometers may be associated with a mobile phone in a pocket of the user, disposed in or on a wearable device worn by the user (e.g., headphones, ear buds, smart glasses, a heads-up display, a smart watch, a mobile phone worn in an arm band or waist band, or the like), etc. In some implementations, the user acceleration data may be utilized in connection with user head orientation information, as will be described below.
Process 300 can begin at 302 by obtaining user acceleration data, where the user acceleration data is associated with a user movement activity. The movement activity may include any suitable type of activity that involves acceleration along at least one axis that is periodic as a function of time. Examples of movement activities include walking, running, or the like. The user acceleration data may be obtained from one or more accelerometers disposed in or on a device worn or carried by the user, such as a mobile phone, a wearable device (e.g., headphones, smart glasses, a heads-up display, AR/VR glasses, etc.), or the like. Note that, in some implementations, the user acceleration data may be processed prior to use. For example, in some embodiments, the user acceleration data may be transformed from a coordinate frame centered around the user's head or body to a fixed coordinate frame, as described below.
At 304, process 300 may determine a movement period associated with a movement activity of the user. The movement period may indicate a duration of time between sequential movements of the movement activity. For example, in an instance in which the movement activity is a walking or running activity, the movement period may indicate a period between sequential steps taken by the user during the walking or running activity. Example techniques for determining the movement period are described below.
At 306, process 300 may determine a movement direction corresponding to the movement activity based on a direction of acceleration orthogonal to the movement direction in which at least a portion of the user acceleration data is anti-periodic over a period of time corresponding to the movement period. For example, the movement direction may be the direction that is orthogonal to the identified direction of anti-periodic acceleration and that is closest to a looking direction of the user (e.g., where the looking direction is identified based at least in part on user head orientation data). Example techniques for determining the movement direction are described below.
In some implementations, user acceleration data may be transformed from a user-centered coordinate frame to a fixed coordinate frame (e.g., fixed with respect to the external environment, ground, etc.). For example, the user acceleration data may be transformed from a coordinate frame centered on a user's head based on acceleration data and/or orientation data obtained from sensors disposed in or on a head-worn device (e.g., as described above).
Process 400 can begin at 402 by obtaining user acceleration data with respect to a user-centered coordinate frame. The user-centered coordinate frame may be represented by the coordinate system (XU, YU, ZU). In some implementations, the user-centered coordinate frame may be aligned to the user's head such that, e.g., the ZU axis is directed upward out of the user's head, the XU axis is directed in a front-back direction with respect to the user's eyes, and the YU axis is directed in a left-right direction with respect to the user's ears. In an instance in which the user acceleration data is measured in Gs (e.g., equivalent to a measurement in m/sec² scaled by 1/9.8), the user acceleration data for a given sample k may be represented as a vector having one component per axis.
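By way of illustration, one plausible notation for this vector (a sketch consistent with the axis definitions above; the exact symbols are assumptions) is

$$Acc_U(k) = \big[\,Acc_{X_U}(k),\ Acc_{Y_U}(k),\ Acc_{Z_U}(k)\,\big]^{T}.$$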
At 404, process 400 can obtain user orientation data. The user orientation data may be obtained from an orientation tracking device, which may be disposed in or on a wearable device worn on, e.g., the user's head. The user orientation data may be obtained using one or more gyroscopes and/or magnetometers (e.g., such as those disposed in or on orientation tracking device 103).
At 406, process 400 can transform the user acceleration data from the user-centered coordinate frame to a fixed coordinate frame (e.g., as defined by the XF, YF, and ZF axes described above).
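By way of illustration, assuming the user orientation data at sample k can be expressed as a 3×3 rotation matrix R(k) from the user-centered frame to the fixed frame (an assumption; the disclosure does not mandate a particular parameterization), the transformation may be sketched as

$$Acc_F(k) = R(k)\,Acc_U(k).$$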
As described above, in some implementations, the movement direction may be determined based at least in part on a movement period associated with a movement activity. For example, the movement direction may be determined based on a direction of acceleration orthogonal to the movement direction, where the orthogonal direction is a direction of acceleration for which at least a portion of acceleration data is anti-periodic over the movement period. In some implementations, the movement period may be determined by providing components of the user acceleration data (which may be the user acceleration data with respect to a fixed coordinate frame, as described above) to a plurality of narrowband filters, each associated with a different frequency, as described below.
Process 500 can begin at 502 by obtaining user acceleration data, where the user acceleration data includes vertical and horizontal components of acceleration. For example, the vertical components of acceleration may correspond to acceleration data with respect to the Z axis, and the horizontal components of acceleration may correspond to acceleration data with respect to the X and Y axes.
At 504, process 500 can apply one or more bandpass filters to the acceleration data associated with the vertical component of the user acceleration data. Any suitable number of bandpass filters may be used, such as one, two, five, ten, twenty, or the like. Each bandpass filter of the one or more bandpass filters may have a different center frequency. In some implementations, the center frequency may be within a range of about 1 Hz to 3 Hz. Examples of center frequencies include 1 Hz, 1.05 Hz, 1.2 Hz, 1.23 Hz, 1.5 Hz, 1.8 Hz, 1.89 Hz, 2 Hz, 2.34 Hz, 2.7 Hz, 2.89 Hz, 3 Hz, or the like. In instances in which two or more bandpass filters are used, the center frequencies of the two or more bandpass filters may be linearly distributed (e.g., separated by a fixed frequency difference) or not linearly distributed. In some implementations, gains may be different for bandpass filters having different center frequencies. For example, in some embodiments, gains may be relatively higher for a bandpass filter with a relatively lower center frequency than for a bandpass filter with a relatively higher center frequency.
At 506, process 500 can generate an initial prediction of a movement frequency associated with the movement activity based on the outputs of the one or more bandpass filters. For example, the initial prediction of the movement frequency may be the center frequency of the bandpass filter of the one or more bandpass filters that generates the highest amplitude output signal.
It should be noted that, in some implementations, process 500 may be configured to determine whether a movement activity that includes periodic movements is occurring based on the output of the one or more bandpass filters. For example, process 500 may determine that a movement activity that includes periodic movements is occurring responsive to the output amplitude of at least one bandpass filter exceeding a predetermined threshold. Note that, in some embodiments, responsive to determining that no movement activity with periodic movements is occurring, process 500 may terminate, and a movement direction may not be determined.
At 508, process 500 can identify the movement period based on the initial prediction of the movement frequency. In some implementations, process 500 may identify the movement period as the time duration between cycles of the output of the bandpass filter associated with the movement frequency (e.g., the bandpass filter that generates the highest amplitude output). In other words, the determination of the movement period may be considered a refinement of the initial prediction of the movement frequency.
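As an illustration of 504 through 508, the following is a minimal Python sketch using NumPy and SciPy. The filter order, the relative bandwidth, the particular center frequencies, and the zero-crossing-based period refinement are assumptions chosen for illustration, not the exact implementation of this disclosure:

```python
import numpy as np
from scipy.signal import butter, sosfilt

def estimate_movement_period(acc_z, fs, centers_hz=(1.0, 1.5, 2.0, 2.5, 3.0)):
    """Sketch: predict a movement frequency with a bank of narrow bandpass
    filters on vertical acceleration, then refine the movement period from
    the zero crossings of the strongest filter's output."""
    best_out = None
    for fc in centers_hz:
        # Narrow bandpass around each candidate movement frequency.
        sos = butter(2, [0.8 * fc, 1.2 * fc], btype="bandpass", fs=fs, output="sos")
        out = sosfilt(sos, acc_z)
        if best_out is None or np.sqrt(np.mean(out ** 2)) > np.sqrt(np.mean(best_out ** 2)):
            best_out = out  # filter with the highest-amplitude (RMS) output
    # Refine: the average spacing of rising zero crossings approximates
    # one movement period of the selected narrowband output.
    neg = np.signbit(best_out)
    rising = np.flatnonzero(neg[:-1] & ~neg[1:])
    if len(rising) < 2:
        return None  # no periodic movement activity detected
    return float(np.mean(np.diff(rising))) / fs  # movement period, in seconds
```

In this sketch, returning None corresponds to the case noted above in which no movement activity with periodic movements is detected.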
As described above, in some implementations, the movement direction may be determined based on a direction of acceleration orthogonal to the movement direction, where the direction of acceleration orthogonal to the movement direction has components of the acceleration data that are anti-periodic with respect to the movement period. In some instances, the components of the acceleration data that are anti-periodic may be the horizontal components of acceleration (which may include acceleration data with respect to the X and Y axes). In some implementations, the movement direction may be determined by determining a cross-covariance of the components of acceleration that are anti-periodic with respect to the movement period and a version of those components delayed by the movement period. The movement direction may then be determined based on the cross-covariance. In some embodiments, the cross-covariance may be used to determine an angle (generally referred to herein as “β”) that is twice the angle associated with the movement direction (generally referred to herein as “αmove”). In some implementations, the movement direction may then be determined by halving the value of β and selecting αmove as whichever of β/2 and the angle π radians away from β/2 is closest to the head orientation direction, thereby disambiguating between the two candidates.
Process 1000 can begin at 1002 by obtaining the horizontal components of the user acceleration data and the movement period associated with a user's movement activity. The movement period may be obtained based on acceleration data associated with the vertical component, as described above.
At 1004, process 1000 can optionally apply a filter to the horizontal components of the user acceleration data. For example, the filter may be a bandpass filter that effectively removes low-frequency acceleration components and attenuates high-frequency components. As a more particular example, the bandpass filter may be configured to attenuate low-frequency components below about 0.5 Hz, below about 1 Hz, below about 5 Hz, or the like. Additionally or alternatively, the bandpass filter may be configured to attenuate high-frequency components above about 8 Hz, above about 10 Hz, above about 20 Hz, or the like. An example 1100 of such a filter is shown in the accompanying drawings.
At 1006, process 1000 can determine, for each of a plurality of sample times, an outer-product vector, generally represented herein as C(k) for a given sample k, from the horizontal components of the user acceleration data and the horizontal components of the user acceleration data delayed by the movement period. In the notation used herein, subscripts indicate extraction of elements of the AccH vector: a subscript of “1” indicates extraction of the element of the AccH vector associated with the X axis, and a subscript of “2” indicates extraction of the element of the AccH vector associated with the Y axis.
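One plausible form of this product vector (an illustrative assumption: the elements are combinations of outer-product terms arranged as a complex-style product, so that the angle of C(k) is the sum of the angles of the current and delayed horizontal samples) is

$$C(k) = \begin{bmatrix} AccH_{1}(k)\,AccH_{1}(k-P) - AccH_{2}(k)\,AccH_{2}(k-P) \\ AccH_{1}(k)\,AccH_{2}(k-P) + AccH_{2}(k)\,AccH_{1}(k-P) \end{bmatrix},$$

where P denotes the movement period expressed in samples.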
After determination of the product vector C(k), process 1000 may determine the delay-offset cross-covariance vector, generally represented herein as X(k), using the current and recent past product vectors. In some implementations, more recent outer-product vectors (e.g., more recent values of the vector C(k)) may be weighted more heavily using a weight w(j), such that the cross-covariance vector is formed as a weighted combination of the product vectors.
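A plausible form of this weighted combination (a sketch; the window length J and the indexing convention are assumptions) is

$$X(k) = \sum_{j=0}^{J-1} w(j)\,C(k-j),$$

where j = 0 corresponds to the most recent sample.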
In the equation above, the weights w(j) may be selected to be higher for more-recent outer-product vectors of vector C. For example, the weights w(j) may be relatively higher for lower values of j. In one example, the weights w(j) may be determined using an exponential function such that the value of the weight decays as a function of time or sample number.
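By way of example, one common exponential choice (an assumption for illustration, with a forgetting factor 0 < λ < 1) is

$$w(j) = (1-\lambda)\,\lambda^{j}.$$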
In the instance in which an exponential function is used to determine the weights w(j), the cross-covariance vector X(k) may be determined in an iterative manner.
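Utilizing the example exponential function given above, the weighted sum reduces to a standard recursive (leaky-integrator) update:

$$X(k) = \lambda\,X(k-1) + (1-\lambda)\,C(k).$$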
At 1008, process 1000 may utilize the cross-covariance to determine the movement direction associated with the movement activity. For example, in some implementations, process 1000 may determine, based on the cross-covariance vector X(k), an angle that is twice the angle of the movement direction, generally represented herein as β.
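One plausible realization (assuming the two elements of X(k) behave as the cosine- and sine-like components of the double angle) is

$$\beta(k) = \operatorname{atan2}\big(X_{2}(k),\,X_{1}(k)\big).$$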
Similar to the notation described above, the subscript index in the equation above represents the element extracted from the cross-covariance vector X(k).
In some implementations, given the angle β, process 1000 may determine the movement direction, generally represented herein as αmove, by dividing the angle β by 2. Given that this may lead to an ambiguity of the direction of αmove as either pointing in front of the user or pointing behind the user, αmove may be selected as whichever angle corresponding to half the value of β is closest to the vector associated with the head orientation direction of the user.
Alternatively, in some implementations, the value of β may be modified based on the head orientation direction (generally referred to, for a given sample k, as αlook(k)), yielding a modified value of β generally referred to herein as βdiff.
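A plausible form of this modification (a sketch, with angles expressed in radians) is

$$\beta_{\mathrm{diff}}(k) = \beta(k) - 2\,\alpha_{\mathrm{look}}(k).$$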
The modified value of β may then be used to determine αmove.
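For example, one plausible realization (an illustrative assumption in which βdiff is wrapped to (−π, π] so that the resulting direction is the candidate closest to the looking direction) is

$$\alpha_{\mathrm{move}}(k) = \alpha_{\mathrm{look}}(k) + \tfrac{1}{2}\Big(\beta_{\mathrm{diff}}(k) - 2\pi\,\operatorname{round}\!\big(\beta_{\mathrm{diff}}(k)/2\pi\big)\Big).$$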
In the equation given above, “round” represents a rounding-to-the-nearest-integer operation.
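Pulling 1006 and 1008 together, the following is a minimal, self-contained Python sketch of this direction estimate. The specific combination of outer-product terms, the forgetting factor lam, and the wrapping convention follow the equations sketched above and are illustrative assumptions, not the exact formulas of this disclosure:

```python
import numpy as np

def estimate_movement_direction(acc_h, period_samples, alpha_look, lam=0.99):
    """Sketch: estimate the movement direction from horizontal acceleration.

    acc_h:          (N, 2) bandpass-filtered horizontal acceleration (X, Y).
    period_samples: movement period, in samples.
    alpha_look:     (N,) head-orientation (looking-direction) angles, radians.
    lam:            exponential forgetting factor, 0 < lam < 1.
    """
    P = int(round(period_samples))
    X = np.zeros(2)                        # cross-covariance vector X(k)
    alpha_move = np.full(acc_h.shape[0], np.nan)
    for k in range(P, acc_h.shape[0]):
        a, b = acc_h[k], acc_h[k - P]
        # Product vector C(k): complex-style product of the current and
        # period-delayed samples; its angle approximates the double angle.
        C = np.array([a[0] * b[0] - a[1] * b[1],
                      a[0] * b[1] + a[1] * b[0]])
        X = lam * X + (1.0 - lam) * C      # iterative exponential weighting
        beta = np.arctan2(X[1], X[0])      # approximately 2 * alpha_move
        # Resolve the front/back ambiguity toward the looking direction.
        beta_diff = beta - 2.0 * alpha_look[k]
        beta_diff -= 2.0 * np.pi * np.round(beta_diff / (2.0 * np.pi))
        alpha_move[k] = alpha_look[k] + 0.5 * beta_diff
    return alpha_move
```

Because both the periodic (movement-direction) and anti-periodic (orthogonal) components of the horizontal acceleration reinforce the same double-angle term in C(k), this estimate is insensitive to where the user is looking, consistent with the behavior described above.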
According to some alternative implementations the apparatus 1200 may be, or may include, a server. In some such examples, the apparatus 1200 may be, or may include, an encoder. Accordingly, in some instances the apparatus 1200 may be a device that is configured for use within an audio environment, such as a home audio environment, whereas in other instances the apparatus 1200 may be a device that is configured for use in “the cloud,” e.g., a server.
In this example, the apparatus 1200 includes an interface system 1205 and a control system 1210. The interface system 1205 may, in some implementations, be configured for communication with one or more other devices of an audio environment. The audio environment may, in some examples, be a home audio environment. In other examples, the audio environment may be another type of environment, such as an office environment, an automobile environment, a train environment, a street or sidewalk environment, a park environment, etc. The interface system 1205 may, in some implementations, be configured for exchanging control information and associated data with audio devices of the audio environment. The control information and associated data may, in some examples, pertain to one or more software applications that the apparatus 1200 is executing.
The interface system 1205 may, in some implementations, be configured for receiving, or for providing, a content stream. The content stream may include audio data. The audio data may include, but may not be limited to, audio signals. In some instances, the audio data may include spatial data, such as channel data and/or spatial metadata. In some examples, the content stream may include video data and audio data corresponding to the video data.
The interface system 1205 may include one or more network interfaces and/or one or more external device interfaces (such as one or more universal serial bus (USB) interfaces). According to some implementations, the interface system 1205 may include one or more wireless interfaces. The interface system 1205 may include one or more devices for implementing a user interface, such as one or more microphones, one or more speakers, a display system, a touch sensor system and/or a gesture sensor system. In some examples, the interface system 1205 may include one or more interfaces between the control system 1210 and a memory system, such as the optional memory system 1215.
The control system 1210 may, for example, include a general purpose single- or multi-chip processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, and/or discrete hardware components.
In some implementations, the control system 1210 may reside in more than one device. For example, in some implementations a portion of the control system 1210 may reside in a device within one of the environments depicted herein and another portion of the control system 1210 may reside in a device that is outside the environment, such as a server, a mobile device (e.g., a smartphone or a tablet computer), etc. In other examples, a portion of the control system 1210 may reside in a device within one environment and another portion of the control system 1210 may reside in one or more other devices of the environment. For example, a portion of the control system 1210 may reside in a device that is implementing a cloud-based service, such as a server, and another portion of the control system 1210 may reside in another device that is implementing the cloud-based service, such as another server, a memory device, etc. The interface system 1205 also may, in some examples, reside in more than one device.
In some implementations, the control system 1210 may be configured for performing, at least in part, the methods disclosed herein. According to some examples, the control system 1210 may be configured for implementing methods of determining a movement direction, determining a movement direction based on a direction orthogonal to the movement direction, or the like.
Some or all of the methods described herein may be performed by one or more devices according to instructions (e.g., software) stored on one or more non-transitory media. Such non-transitory media may include memory devices such as those described herein, including but not limited to random access memory (RAM) devices, read-only memory (ROM) devices, etc. The one or more non-transitory media may, for example, reside in the optional memory system 1215.
In some examples, the apparatus 1200 may include the optional microphone system 1220.
According to some implementations, the apparatus 1200 may include the optional loudspeaker system 1225.
Some aspects of the present disclosure include a system or device configured (e.g., programmed) to perform one or more examples of the disclosed methods, and a tangible computer readable medium (e.g., a disc) which stores code for implementing one or more examples of the disclosed methods or steps thereof. For example, some disclosed systems can be or include a programmable general purpose processor, digital signal processor, or microprocessor, programmed with software or firmware and/or otherwise configured to perform any of a variety of operations on data, including an embodiment of disclosed methods or steps thereof. Such a general purpose processor may be or include a computer system including an input device, a memory, and a processing subsystem that is programmed (and/or otherwise configured) to perform one or more examples of the disclosed methods (or steps thereof) in response to data asserted thereto.
Some embodiments may be implemented as a configurable (e.g., programmable) digital signal processor (DSP) that is configured (e.g., programmed and otherwise configured) to perform required processing on audio signal(s), including performance of one or more examples of the disclosed methods. Alternatively, embodiments of the disclosed systems (or elements thereof) may be implemented as a general purpose processor (e.g., a personal computer (PC) or other computer system or microprocessor, which may include an input device and a memory) which is programmed with software or firmware and/or otherwise configured to perform any of a variety of operations including one or more examples of the disclosed methods. Alternatively, elements of some embodiments of the inventive system are implemented as a general purpose processor or DSP configured (e.g., programmed) to perform one or more examples of the disclosed methods, and the system also includes other elements (e.g., one or more loudspeakers and/or one or more microphones). A general purpose processor configured to perform one or more examples of the disclosed methods may be coupled to an input device (e.g., a mouse and/or a keyboard), a memory, and a display device.
Another aspect of the present disclosure is a computer readable medium (for example, a disc or other tangible storage medium) which stores code for performing (e.g., code executable to perform) one or more examples of the disclosed methods or steps thereof.
While specific embodiments of the present disclosure and applications of the disclosure have been described herein, it will be apparent to those of ordinary skill in the art that many variations on the embodiments and applications described herein are possible without departing from the scope of the disclosure described and claimed herein. It should be understood that while certain forms of the disclosure have been shown and described, the disclosure is not to be limited to the specific embodiments described and shown or the specific methods described.
This application claims priority to U.S. Provisional Application No. 63/293,444 filed Dec. 23, 2021, and U.S. Provisional Application No. 63/376,347 filed on Sep. 20, 2022, each of which is incorporated by reference in its entirety.