Computing systems in communication with multiple display devices allow users to view application programs and digital content across a broader display area. While such setups provide a convenient platform for viewing visual data in a larger format, coordinating the display devices to cooperatively display the visual data can be challenging in several ways. Upon initial setup of a computing system that includes more than one display device, the display devices may be randomly oriented, and the computing system may not know the positions and/or orientations of the display devices. When one or more of the display devices is moved, the display of the visual data may become discontinuous or out of sequence. When a new display device is added to the computing system, the computing system may lack information about the position of the new display device, making it unable to include the new display device in the display of visual data. When a user desires to share visual data from one display device to another, multiple nearby display devices may be identified, increasing the risk of inadvertently sharing sensitive data. Because the computing system cannot recognize the position of each display device and logically arrange the visual data across the multiple display devices, the user may be required to frequently update the positions of each display device, resulting in interrupted tasks and frustration for the user.
To address the above issues, a computing system is described herein that includes a processor, a primary display device, and a secondary display device. The primary display device may be operatively coupled to the processor and configured to transmit a first signal. The secondary display device may be operatively coupled to the processor and configured to transmit a second signal. The processor may be configured to execute an ultrasonic discovery protocol included in a memory. The ultrasonic discovery protocol may be programmatically executed upon detection of a positional trigger event. Execution of the ultrasonic discovery protocol by the processor may cause the primary display device to transmit the first signal. The first signal may be an acoustic signal that is received by the secondary display device via a microphone array. In response to receiving the first signal, the secondary display device may transmit the second signal to the primary display device. The second signal may encode data that indicates a positional relationship between the primary display device and the secondary display device. Based on the indicated positional relationship, the primary and secondary display devices may be configured to cooperatively display the visual data.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter. Furthermore, the claimed subject matter is not limited to implementations that solve any or all disadvantages noted in any part of this disclosure.
The inventors of the subject application have discovered that coordinating multiple display devices to cooperatively display visual data is constrained by the inability of conventional systems to programmatically determine the position of each display device in an array. In a typical configuration of a computing system in communication with multiple display devices, a user manually assigns a position to each display device. For example, in a computing system with three display devices, the user may designate a central display device as a first display device, a display device to the right of the first display device as the second display device, and a display device to the left of the first display device as the third display device. When the orientation of these display devices is changed, the display of visual data may be disrupted or presented in an unintuitive arrangement, requiring the user to intervene to update the positions of the display devices. In some scenarios, the user may desire to share visual data from a first display device to a second display device by “flicking” the visual data to the second display device. The user input of “flicking” may trigger the first display device to ping nearby computing devices, often resulting in a list of several computing devices that requires a selection by the user.
As schematically illustrated in
The processor 12 may programmatically designate the primary and secondary display devices 16, 18 based on proximity to the processor 12, for example. However, it will be appreciated that the designation of the display devices as the primary display device 16 and the secondary display device 18 may alternatively be determined by a user in a settings preference module 20 executed by the processor 12. In addition to being operatively coupled to the processor 12, the primary and secondary display devices 16, 18 may be on a network N with one another as indicated in
As shown in
To determine the number and orientations of display devices associated with the computing system 10, the processor 12 may be configured to execute an ultrasonic discovery protocol 30 via a program stored in non-volatile memory and executed by a processor of the computing system 10. The ultrasonic discovery protocol 30 may be programmatically executed upon detection of a positional trigger event TE. As discussed in detail below, the positional trigger event TE may be any one of several events, such as powering on of a device, user input, recognition of a new display device in the plurality of display devices, and movement of a display device having an established positional relationship with another display device. The positional trigger event TE may be detected by a positional trigger detector 32 included in the ultrasonic discovery protocol 30.
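By way of non-limiting illustration, the positional trigger detector 32 might dispatch detected trigger events to the discovery protocol along the lines of the following Python sketch. The event names and the run_discovery callable are hypothetical assumptions introduced for illustration only, not part of the described system.

```python
from enum import Enum, auto

class TriggerEvent(Enum):
    """Hypothetical enumeration of the positional trigger events TE."""
    POWER_ON = auto()
    USER_INPUT = auto()
    NEW_DISPLAY_DETECTED = auto()
    DISPLAY_MOVED = auto()

def on_trigger(event: TriggerEvent, run_discovery) -> None:
    """Dispatch a detected positional trigger event to the discovery protocol.

    Any trigger re-runs discovery; movement additionally requests a higher
    re-run rate until the moving display comes to rest (discussed below).
    """
    run_discovery(repeat_while_moving=(event is TriggerEvent.DISPLAY_MOVED))
```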
Execution of the ultrasonic discovery protocol 30 by the processor 12 may activate a signal transmission module 34 included in the ultrasonic discovery protocol 30, and cause the primary display device 16 to transmit a first signal S1. The first signal S1 may be an acoustic signal emitted at an ultrasonic frequency by the first speaker 24A of the primary display device 16. A key property of ultrasonic frequencies, or ultrasound, is that the sound waves are absorbed by soft surfaces and reflected by hard surfaces, such as walls. Thus, it will be appreciated that the first signal S1 may be set to a frequency and emitted at an amplitude that is ineffective for transmitting the first signal S1 through building walls. Specifically, the first signal S1 may be emitted at a frequency greater than 20 kHz, and preferably in a range of 20 kHz to 80 kHz. This feature has the beneficial effect of limiting the designation of the secondary display device 18 to display devices within a predetermined range of the first signal S1, thereby avoiding confusion among selectable display devices and decreasing the possibility of unintentionally disclosing sensitive or confidential data.
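As a hedged illustration of how a first signal S1 confined to this ultrasonic band might be synthesized, consider the following sketch. The sample rate, sweep range, chirp waveform, and amplitude are illustrative assumptions only, not a definitive implementation of the signal transmission module 34.

```python
import numpy as np

SAMPLE_RATE = 192_000  # Hz; high enough to represent ultrasonic content

def make_ultrasonic_chirp(f_start=20_000.0, f_end=40_000.0, duration=0.05):
    """Synthesize a linear chirp confined to the ultrasonic band.

    Frequencies above ~20 kHz are largely reflected or absorbed by walls,
    which helps keep the emitted signal within a single room.
    """
    t = np.linspace(0.0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    # Phase is the integral of the linearly swept instantaneous frequency.
    phase = 2 * np.pi * (f_start * t + (f_end - f_start) * t**2 / (2 * duration))
    return 0.5 * np.sin(phase)  # moderate amplitude further limits range
```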
The first signal S1 may be received via the second microphone array 26B of the secondary display device 18. In response to receiving the first signal S1, the secondary display device 18 may transmit a second signal S2 to the primary display device 16. The second signal S2 may encode data that indicates a positional relationship between the primary display device 16 and the secondary display device 18. As discussed above, the secondary display device 18 may be equipped with the second speaker 24B and thus configured to transmit the second signal S2 acoustically. Additionally or alternatively, the secondary display device 18 may be connected to the primary display device 16 in a hardwired configuration, thereby permitting the second signal S2 to be transmitted electrically or acoustically.
An orientation calculation module 36 included in the ultrasonic discovery protocol 30 may process the data encoded in the second signal S2 that indicates a positional relationship between the primary and secondary display devices 16, 18 to determine the orientation of the secondary display device 18 relative to the position of the primary display device 16. The orientation calculation module 36 may be in communication with the processor 12 and a visual data display module 38 included in the ultrasonic discovery protocol 30. Upon receiving information about the positional relationship between the primary and secondary display devices 16, 18, the visual data display module 38 may provide instructions to the processor 12 to command the primary and secondary display devices 16, 18 to cooperatively display the visual data VD based on the indicated positional relationship of the primary and secondary display devices 16, 18.
An example use-case scenario of the computing system 10 of
When the processor 12 executes the ultrasonic discovery protocol 30, the signal transmission module 34 of the ultrasonic discovery protocol 30 may instruct the primary display device 16 to emit the first signal S1 from the first speaker 24A, as shown in
Additionally, while the first and second microphone arrays 26A, 26B may be conventionally enabled to measure sound pressure, each microphone included in the first and second microphone arrays 26A, 26B may additionally be configured with a directional polar pattern to further distinguish a direction of a received acoustic signal. The resulting data may indicate a direction of the secondary display device 18 in relation to the primary display device 16. With this data and the TDOA between the microphones in the first stereoscopic microphone array 26A, the orientation calculation module 36 of the ultrasonic discovery protocol 30 can determine the position and orientation of the secondary display device 18, thereby enabling the visual data display module 38 to coordinate the display of visual data VD across the primary and secondary display devices 16, 18.
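One conventional way to estimate the TDOA between two microphones of a stereoscopic array is to cross-correlate their recordings, as in the following sketch. The function and its parameters are hypothetical, and a common sample clock for both channels is assumed.

```python
import numpy as np

def estimate_tdoa(near_mic, far_mic, sample_rate):
    """Estimate the time difference of arrival between two microphone
    recordings of the same acoustic signal via cross-correlation."""
    corr = np.correlate(far_mic, near_mic, mode="full")
    # Index len(near_mic) - 1 corresponds to zero lag; the offset of the
    # correlation peak from it is the delay of far_mic in samples.
    lag_samples = int(np.argmax(corr)) - (len(near_mic) - 1)
    return lag_samples / sample_rate  # seconds; positive if far mic lags
```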
In some scenarios, ambient noise or other ultrasonic signals may result in the inability of the computing system 10 to distinguish the first and/or second signal S1, S2. In such cases, the signal transmission module 34 of the ultrasonic discovery protocol 30 may be configured to instruct the primary and/or secondary display device 16, 18 to emit the first and/or second signal S1, S2, respectively, at an alternative ultrasonic frequency or rate of occurrence to overcome any ambiguities in the identification of the orientation of either the primary or secondary display devices 16, 18.
An example use-case scenario of the computing system 10 of
In response to receiving the first signal S1, the secondary display device 18 may transmit the second signal S2. When the primary and secondary display devices 16, 18 are in hardwired communication on the network N, the second signal S2 may be an electric signal transmitted by the secondary display device 18, as shown in
In addition to the trigger events TE described above, the processor 12 may be configured to execute the ultrasonic discovery protocol 30 when movement is detected in at least one of the display devices in the array.
As discussed above in reference to
When detected, the movement of the primary or secondary display devices 16, 18 may cause an increase in the frequency of execution of the ultrasonic discovery protocol 30. As discussed above, the processor 12 may be configured to programmatically execute the ultrasonic discovery protocol 30 in response to the detection of one of the described positional trigger events TE. Typically, the ultrasonic discovery protocol 30 may be executed periodically to determine the positional relationship between the primary and secondary display devices 16, 18 and detect any changes. However, when the positional trigger event TE is movement of one of the primary or secondary display devices 16, 18, the processor 12 may be configured to execute the ultrasonic discovery protocol 30 repeatedly until it is determined that the display device in motion has come to rest.
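A minimal sketch of this elevated re-execution behavior, assuming hypothetical execute_protocol and is_moving callables and an illustrative polling interval, might look as follows.

```python
import time

def run_discovery_while_moving(execute_protocol, is_moving,
                               fast_interval=0.2):
    """Repeatedly re-run the discovery protocol while a display is in
    motion, then take one final pass once it has come to rest."""
    while is_moving():
        execute_protocol()         # refresh the positional relationship
        time.sleep(fast_interval)  # elevated rate while movement persists
    execute_protocol()             # settle on the final resting position
```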
For example, as shown in
While the example illustrated in
In any of the embodiments described herein, a display mode for displaying the visual data VD may be defined on the basis of the positional relationship of the primary and secondary display devices 16, 18. In some implementations, the primary and secondary display devices 16, 18 may be configured to display the visual data VD as a single image across the first and second displays 22A, 22B, as shown in
In some implementations, it may be desirable to transfer visual data VD from the primary display device 16 to the secondary display device 18. For example, as shown in
Upon recognition of the positional trigger event TE, the processor 12 may execute the ultrasonic discovery protocol 30. As the primary display device 16 emits the first signal S1, the ultrasonic discovery protocol 30 may be configured to identify a display device in closest proximity to the primary display device 16 as the secondary display device 18. As described above with reference to
While the implementation described with reference to
While the computing system 10 described above includes the primary display device 16 and the secondary display device 18, it will be appreciated that the plurality of display devices included in the computing system 10 is not limited to any particular quantity. In any of the implementations described herein, the computing system 10 may be configured to include one or more displays in addition to the primary display device 16 and the secondary display device 18. For example, as shown in
Additionally or alternatively, when the secondary display device 18 is configured as a slave device, the primary display device 16 may utilize components of the slaved secondary device, such as transducers and/or microphone arrays, to determine the relative positions of other display devices associated with the computing system 10. This configuration may supplement information generated by the primary display device 16 to increase accuracy (i.e., a supplemental point-of-view), or provide positional information for display devices that are not directly detectable by the primary display device 16 during execution of the ultrasonic discovery protocol 30.
In the example use-case scenario shown in
In the example illustrated in
The positional relationship of the primary and secondary display devices 16, 18, as well as any other display devices included in the computing system 10, may be defined by a grid template 44, as shown in
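By way of illustration only, a grid template might be represented as a mapping from grid cells to display devices, as in the following hypothetical sketch; the cell coordinates and device identifiers are assumptions made for the example.

```python
# Hypothetical grid template: (row, column) cells -> device identifiers.
grid_template = {
    (0, 0): "primary",    # primary display device
    (0, 1): "secondary",  # immediately to the right of the primary
}

def cell_of(device_id, template):
    """Return the grid cell a device occupies, or None if it is unplaced."""
    return next((cell for cell, dev in template.items() if dev == device_id), None)
```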
Further, in any of the implementations described herein, a display device may be required to be within a predetermined threshold distance T of other display devices in the array. When a display device comes within the threshold distance T, it may be joined to the array of display devices. As described above, both the recognition of a new display device in the plurality of display devices and the movement of a display device having an established positional relationship with another display device are positional trigger events TE that cause execution of the ultrasonic discovery protocol 30 to determine the position of the display device. When the display device moves outside of the predetermined threshold distance T of the array, the display device may be disconnected from the array.
The threshold distance T may be configured according to direction. For example, as shown in
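A direction-dependent threshold might be modeled as in the following sketch, where the horizontal and vertical distances are illustrative assumptions rather than values prescribed by the described system.

```python
def within_threshold(dx, dy, horizontal_t=1.0, vertical_t=0.5):
    """Check whether a display offset (dx, dy) in meters from the array
    falls inside a direction-dependent threshold distance T."""
    return abs(dx) <= horizontal_t and abs(dy) <= vertical_t
```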
In any of the above embodiments, it will be appreciated that the relative orientation of the displays may be taken into account in addition to the relative position, such that displays positioned proximate to each other, but facing in opposite or nearly opposite directions (thus not being visible from a same vantage point), are not paired together in a display array for cooperative display according to a pairing logic of the computing system 10. The orientation of each display may be detected by mounting ultrasonic emitters on each side (i.e., front and back) of a display to create a three-dimensional microphone array, and detecting the relative difference in sound received from a front-mounted emitter and a rear-mounted emitter.
Additionally, as shown in
A direction angle DA of the sound source SS may be measured with a stereoscopic microphone array by computing a time delay T at which the sound is received at the far microphone FM after the sound is received at the near microphone NM, in combination with the known speed of sound V and the distance D between the near and far microphones NM, FM by applying the equation:
DA=arcsin(TV/D)
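In code, this relationship might be applied as follows; the clamping of the ratio to the domain of the arcsine guards against measurement noise and is an implementation assumption rather than part of the equation itself.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def direction_angle(time_delay, mic_spacing, v=SPEED_OF_SOUND):
    """Compute the direction angle DA = arcsin(T*V/D) in radians from the
    measured time delay T and the microphone spacing D."""
    ratio = time_delay * v / mic_spacing
    # Clamp against measurement noise pushing the ratio out of [-1, 1].
    return math.asin(max(-1.0, min(1.0, ratio)))
```

For example, a measured delay of 0.5 ms across microphones spaced 0.2 m apart gives direction_angle(0.0005, 0.2) ≈ 1.03 radians, or roughly 59 degrees.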
Additionally or alternatively, more than two microphones may be included in the array, such as the three-dimensional microphone array described above, and the location L of the sound source SS may be determined by triangulation to calculate vectors in three dimensions. The evaluation of multiple angles may improve the accuracy of determining the location L of the sound source SS. Algorithms including criteria such as strength of the sound signal, spatial probability, and known locations of included components may be applied to estimate a confidence level of the location L of the sound source. If the confidence level is below a predetermined threshold, execution of the ultrasonic discovery protocol 30 may be repeated.
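One simple way to triangulate from two such bearings is to intersect the corresponding rays, shown here in two dimensions for clarity as a hypothetical sketch; extending to three dimensions and weighting by signal strength, as described above, would follow the same principle.

```python
import math

def triangulate(p1, a1, p2, a2):
    """Estimate a two-dimensional source location L by intersecting two
    bearings, each given as an origin point (x, y) and an angle in radians."""
    d1 = (math.cos(a1), math.sin(a1))
    d2 = (math.cos(a2), math.sin(a2))
    denom = d1[0] * d2[1] - d1[1] * d2[0]  # 2-D cross product of the rays
    if abs(denom) < 1e-9:
        return None  # near-parallel bearings: no reliable intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t = (dx * d2[1] - dy * d2[0]) / denom
    return (p1[0] + t * d1[0], p1[1] + t * d1[1])
```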
At step 902, the method 900 may include configuring the processor to execute an ultrasonic discovery protocol included in the associated memory. As described above, the ultrasonic discovery protocol may determine a positional relationship between display devices included in the computing system 10 such that visual data may be cooperatively displayed across the display devices.
Advancing to step 904, the method 900 may include operatively coupling a primary display device to the processor, the primary display device being configured to transmit a first signal. Continuing from step 904 to step 906, the method 900 may include operatively coupling a secondary display device to the processor, the secondary display device being configured to transmit a second signal. In addition to being operatively coupled to the processor, the primary and secondary display devices may be in communication with one another. In some implementations, this communication may occur wirelessly, via BLUETOOTH technology or the like. Additionally or alternatively, the primary display device may be connected to the secondary display device via a wired connection.
Proceeding from step 906 to step 908, the method 900 may further include detecting a positional trigger event. As described above, the positional trigger event may be any one of several events, such as powering on of a device, user input, recognition of a new display device in the plurality of display devices, and movement of a display device having an established positional relationship with another display device. The positional trigger event TE may be detected by a positional trigger detector included in the ultrasonic discovery protocol.
Advancing from step 908 to step 910, the method 900 may include executing the ultrasonic discovery protocol. As described above, execution of the ultrasonic discovery protocol by the processor may activate a signal transmission module included in the ultrasonic discovery protocol, and cause the primary display device to transmit a first signal. Accordingly, continuing from step 910 to step 912, the method 900 may include transmitting, by the primary display device, the first signal. The first signal may be an acoustic signal emitted by the first speaker of the primary display device 16.
Proceeding from step 912 to step 914, the method 900 may further include receiving, by a microphone array of the secondary display device, the first signal. In response to receiving the first signal, at step 916 the method 900 may include transmitting, by the secondary display device to the primary display device, the second signal. As described above, the second signal may encode data that indicates a positional relationship between the primary display device and the secondary display device. As discussed above, the secondary display device may be equipped with the second speaker and thus configured to transmit the second signal acoustically. Additionally or alternatively, the method may further include connecting the primary display device to the secondary display device via a wired connection such that the second signal can be transmitted electrically or acoustically.
Advancing from step 916 to step 918, the method 900 may include cooperatively displaying the visual data on the primary and secondary display devices based on the indicated positional relationship. As described above, an orientation calculation module included in the ultrasonic discovery protocol may process the data encoded in the second signal that indicates a positional relationship between the primary and secondary display devices to determine the orientation of the secondary display device relative to the position of the primary display device. The orientation calculation module may be in communication with the processor and a visual data display module included in the ultrasonic discovery protocol. Upon receiving information about the positional relationship between the primary and secondary display devices, the visual data display module may provide instructions to the processor to command the primary and secondary display devices to cooperatively display the visual data VD based on the indicated positional relationship of the primary and secondary display devices. As described above, the method may further include defining a display mode for displaying the visual data based on the positional relationship of the primary and secondary display devices, and the positional relationship may be defined by a grid template.
In some embodiments, the methods and processes described herein may be tied to a computing system of one or more computing devices. In particular, such methods and processes may be implemented as a computer-application program or service, an application-programming interface (API), a library, and/or other computer-program product.
Computing system 1000 includes a logic processor 1002, volatile memory 1003, and a non-volatile storage device 1004. Computing system 1000 may optionally include a display subsystem 1006, input subsystem 1008, communication subsystem 1010, and/or other components not shown in
Logic processor 1002 includes one or more physical devices configured to execute instructions. For example, the logic processor may be configured to execute instructions that are part of one or more applications, programs, routines, libraries, objects, components, data structures, or other logical constructs. Such instructions may be implemented to perform a task, implement a data type, transform the state of one or more components, achieve a technical effect, or otherwise arrive at a desired result.
The logic processor may include one or more physical processors (hardware) configured to execute software instructions. Additionally or alternatively, the logic processor may include one or more hardware logic circuits or firmware devices configured to execute hardware-implemented logic or firmware instructions. Processors of the logic processor 1002 may be single-core or multi-core, and the instructions executed thereon may be configured for sequential, parallel, and/or distributed processing. Individual components of the logic processor optionally may be distributed among two or more separate devices, which may be remotely located and/or configured for coordinated processing. Aspects of the logic processor may be virtualized and executed by remotely accessible, networked computing devices configured in a cloud-computing configuration. In such a case, it will be understood that these virtualized aspects may be run on different physical logic processors of various different machines.
Non-volatile storage device 1004 includes one or more physical devices configured to hold instructions executable by the logic processor to implement the methods and processes described herein. When such methods and processes are implemented, the state of non-volatile storage device 1004 may be transformed—e.g., to hold different data.
Non-volatile storage device 1004 may include physical devices that are removable and/or built-in. Non-volatile storage device 1004 may include optical memory (e.g., CD, DVD, HD-DVD, Blu-Ray Disc, etc.), semiconductor memory (e.g., ROM, EPROM, EEPROM, FLASH memory, etc.), and/or magnetic memory (e.g., hard-disk drive, floppy-disk drive, tape drive, MRAM, etc.), or other mass storage device technology. Non-volatile storage device 1004 may include nonvolatile, dynamic, static, read/write, read-only, sequential-access, location-addressable, file-addressable, and/or content-addressable devices. It will be appreciated that non-volatile storage device 1004 is configured to hold instructions even when power is cut to the non-volatile storage device 1004.
Volatile memory 1003 may include physical devices that include random access memory. Volatile memory 1003 is typically utilized by logic processor 1002 to temporarily store information during processing of software instructions. It will be appreciated that volatile memory 1003 typically does not continue to store instructions when power is cut to the volatile memory 1003.
Aspects of logic processor 1002, volatile memory 1003, and non-volatile storage device 1004 may be integrated together into one or more hardware-logic components. Such hardware-logic components may include field-programmable gate arrays (FPGAs), program- and application-specific integrated circuits (PASIC/ASICs), program- and application-specific standard products (PSSP/ASSPs), system-on-a-chip (SOC), and complex programmable logic devices (CPLDs), for example.
The terms “module,” “program,” and “engine” may be used to describe an aspect of computing system 1000 typically implemented in software by a processor to perform a particular function using portions of volatile memory, which function involves transformative processing that specially configures the processor to perform the function. Thus, a module, program, or engine may be instantiated via logic processor 1002 executing instructions held by non-volatile storage device 1004, using portions of volatile memory 1003. It will be understood that different modules, programs, and/or engines may be instantiated from the same application, service, code block, object, library, routine, API, function, etc. Likewise, the same module, program, and/or engine may be instantiated by different applications, services, code blocks, objects, routines, APIs, functions, etc. The terms “module,” “program,” and “engine” may encompass individual or groups of executable files, data files, libraries, drivers, scripts, database records, etc.
When included, display subsystem 1006 may be used to present a visual representation of data held by non-volatile storage device 1004. The visual representation may take the form of a graphical user interface (GUI). As the herein described methods and processes change the data held by the non-volatile storage device, and thus transform the state of the non-volatile storage device, the state of display subsystem 1006 may likewise be transformed to visually represent changes in the underlying data. Display subsystem 1006 may include one or more display devices utilizing virtually any type of technology. Such display devices may be combined with logic processor 1002, volatile memory 1003, and/or non-volatile storage device 1004 in a shared enclosure, or such display devices may be peripheral display devices.
When included, input subsystem 1008 may comprise or interface with one or more user-input devices such as a keyboard, mouse, touch screen, or game controller. In some embodiments, the input subsystem may comprise or interface with selected natural user input (NUI) componentry. Such componentry may be integrated or peripheral, and the transduction and/or processing of input actions may be handled on- or off-board. Example NUI componentry may include a microphone for speech and/or voice recognition; an infrared, color, stereoscopic, and/or depth camera for machine vision and/or gesture recognition; a head tracker, eye tracker, accelerometer, and/or gyroscope for motion detection and/or intent recognition; as well as electric-field sensing componentry for assessing brain activity; and/or any other suitable sensor.
When included, communication subsystem 1010 may be configured to communicatively couple various computing devices described herein with each other, and with other devices. Communication subsystem 1010 may include wired and/or wireless communication devices compatible with one or more different communication protocols. As non-limiting examples, the communication subsystem may be configured for communication via a wireless telephone network, or a wired or wireless local- or wide-area network, such as an HDMI over Wi-Fi connection. In some embodiments, the communication subsystem may allow computing system 1000 to send and/or receive messages to and/or from other devices via a network such as the Internet.
The following paragraphs provide additional support for the claims of the subject application. One aspect provides a computing system capable of displaying visual data over a plurality of display devices. The computing system may comprise a processor, a primary display device, and a secondary display device. The processor may be configured to execute an ultrasonic discovery protocol. The primary display device may be operatively coupled to the processor and configured to transmit a first signal. The secondary display device may be operatively coupled to the processor and configured to transmit a second signal. The ultrasonic discovery protocol may be programmatically executed upon detection of a positional trigger event. Execution of the ultrasonic discovery protocol by the processor may cause the primary display device to transmit the first signal. The first signal may be an acoustic signal received via a microphone array of the secondary display device. In response to receiving the first signal, the secondary display device may transmit the second signal to the primary display device. The second signal may encode data that indicates a positional relationship between the primary display device and the secondary display device. Based on the indicated positional relationship, the primary and secondary display devices may be configured to cooperatively display the visual data.
In this aspect, additionally or alternatively, the positional trigger event may be one of powering on of a device, user input, recognition of a new display device in the plurality of display devices, and movement of a display device having an established positional relationship with another display device. In this aspect, additionally or alternatively, the movement of a display device may be detected by one of an inertial motion unit, a microphone array, and a change in time of arrival of a transmitted signal. In this aspect, additionally or alternatively, the movement of a display device causes an increase in the frequency of execution of the ultrasonic discovery protocol.
In this aspect, additionally or alternatively, the primary display device may include a speaker and a microphone array. In this aspect, additionally or alternatively, the microphone array of the secondary display device may be a stereoscopic microphone array. In this aspect, additionally or alternatively, the second signal may be transmitted electrically or acoustically.
In this aspect, additionally or alternatively, the primary display device may be a master display device including the processor, and the secondary display device may be a slave device. In this aspect, additionally or alternatively, the computing system may further comprise a third display device configured to transmit a third signal, and a fourth display device configured to transmit a fourth signal. In this aspect, additionally or alternatively, the positional relationship of the primary and secondary display devices may be defined by a grid template. In this aspect, additionally or alternatively, the ultrasonic discovery protocol may be configured to identify a display device in closest proximity to the primary display device as the secondary display device. In this aspect, additionally or alternatively, the primary display device may be connected to the secondary display device via a wired connection.
In this aspect, additionally or alternatively, a display mode for displaying the visual data may be defined on a basis of the positional relationship of the primary and secondary display devices. In this aspect, additionally or alternatively, the processor may be configured to transmit the positional relationship of display devices in the plurality of display devices to each display device. In this aspect, additionally or alternatively, the first signal may be set to a frequency and emitted at an amplitude that is ineffective for transmitting the first signal through building walls.
Another aspect provides a method for displaying visual data over a plurality of display devices. The method may comprise configuring a processor to execute an ultrasonic discovery protocol and operatively coupling a primary display device and a secondary display device to the processor, the primary display device being configured to transmit a first signal and the secondary display device being configured to transmit a second signal. The method may further include detecting a positional trigger event, executing the ultrasonic discovery protocol, and transmitting, by the primary display device, the first signal, the first signal being an acoustic signal. The method may further include receiving, by a microphone array of the secondary display device, the first signal, and in response to receiving the first signal, transmitting, by the secondary display device to the primary display device, the second signal, the second signal encoding data that indicates a positional relationship between the primary display device and the secondary display device. The method may further include cooperatively displaying the visual data on the primary and secondary display devices based on the indicated positional relationship.
In this aspect, additionally or alternatively, the method may further comprise defining a display mode for displaying the visual data based on the positional relationship of the primary and secondary display devices. In this aspect, additionally or alternatively, the method may further comprise defining the positional relationship of the primary and secondary display devices by a grid template. In this aspect, additionally or alternatively, the method may further comprise connecting the primary display device to the secondary display device via a wired connection such that the second signal can be transmitted electrically or acoustically.
It will be understood that the configurations and/or approaches described herein are exemplary in nature, and that these specific embodiments or examples are not to be considered in a limiting sense, because numerous variations are possible. The specific routines or methods described herein may represent one or more of any number of processing strategies. As such, various acts illustrated and/or described may be performed in the sequence illustrated and/or described, in other sequences, in parallel, or omitted. Likewise, the order of the above-described processes may be changed.
The subject matter of the present disclosure includes all novel and non-obvious combinations and sub-combinations of the various processes, systems and configurations, and other features, functions, acts, and/or properties disclosed herein, as well as any and all equivalents thereof.