Technical Field
The present systems, methods, and computer program products generally relate to manipulating electronically displayed content and particularly relate to controllably magnifying or “zooming-in on” electronically displayed presentation materials.
Description of the Related Art
Electronic displays come in many different forms, including without limitation: flat panel displays such as liquid crystal displays (“LCDs”) and plasma displays, cathode ray tube displays, projection displays, and so on. The content displayed on an electronic display may include still images, text, animations, and/or video. In a specific application, the content displayed on an electronic display may include materials to supplement a presentation given by one or more orators (i.e., “electronically displayed presentation materials”).
A person of skill in the art will be familiar with many different software applications (hereafter “presentation software”) that enable a user (e.g., a presenter) to electronically display and navigate through presentation materials. Popular examples of presentation software include Microsoft PowerPoint®, Google Slides®, and Keynote® by Apple Inc. (“PowerPoint® et al.”), all of which enable the presenter to enhance their presentation through the electronic display of “presentation slides.” As the user presents, he/she is able to navigate from one slide to the next while displaying the slides to an audience on one or more electronic display(s). This presentation format is so well established that it has become somewhat stale: most presentation software available today only permits the user to interact with slides by navigating forwards and backwards between them. Recently, new presentation software called Prezi™ has been introduced that abandons the slide concept and instead displays presentation materials in a parallax three-dimensional virtual space. The presenter can navigate the virtual space by, for example, turning, rotating, and/or zooming in/out, all on display for the audience. The enhanced interactivity and navigability introduced by Prezi™ facilitates more dynamic and unconventional presentation styles, allowing presenters to break from the norm established by PowerPoint® et al. and, arguably, giving rise to more interesting presentations. However, PowerPoint® et al. are thoroughly ingrained in personal and business computing environments today, and the majority of electronically displayed presentation materials continue to use the “slide” format afforded by these applications. There is a need in the art for enhancements and adaptations to existing presentation software (e.g., PowerPoint® et al.) that introduce new, more dynamic ways of interacting with typical presentation slides.
A method of operation in a display system which comprises at least one processor, an electronic display communicatively coupled to the at least one processor, and at least one non-transitory processor-readable storage medium communicatively coupled to the at least one processor, wherein the at least one non-transitory processor-readable storage medium stores at least one of processor-executable instructions and data, may be summarized as including: causing, by the at least one processor, a display of content at a first magnification level on the electronic display; receiving, by the at least one processor, a user input indicative of a magnification setting command; in response to the user input indicative of the magnification setting command: capturing, by the at least one processor, a digital copy image of the content; digitally magnifying, by the at least one processor, at least a portion of the digital copy image of the content; and causing, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display, the second magnification level greater than the first magnification level. The method may further include causing, by the at least one processor, an overlay of a borderless window that is transparent to both content and a majority of events on at least a portion of the display of content on the electronic display, wherein causing, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display includes causing, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content at the second magnification level in the borderless window on the electronic display.
Capturing, by the at least one processor, a digital copy image of the content may include capturing, by the at least one processor, a screenshot of the content. Digitally magnifying, by the at least one processor, at least a portion of the digital copy image of the content may include producing, by the at least one processor, a series of digital copy images of respective portions of the content at successive (e.g., successively greater or successively lesser) magnification levels; and causing, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display may include causing, by the at least one processor, a sequential display of the series of digital copy images of respective portions of the content at successive (e.g., successively greater or successively lesser) magnification levels. Causing, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content on the electronic display at a second magnification level may include causing, by the at least one processor, the digitally magnified at least a portion of the digital copy image of the content to overlay at least a portion of the content on the electronic display.
The method may further include: receiving, by the at least one processor, a user input indicative of a display restoration command; and in response to the user input indicative of the display restoration command: stopping, by the at least one processor, the display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display. Causing, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display may include causing, by the at least one processor, the digitally magnified at least a portion of the digital copy image of the content to completely overlay the content and stopping by the at least one processor, the display of the content at the first magnification level on the electronic display, and the method may further include: in response to the user input indicative of the display restoration command: causing, by the at least one processor, a resumption of the display of the content at the first magnification level on the electronic display. Digitally magnifying, by the at least one processor, at least a portion of the digital copy image of the content may include producing, by the at least one processor, a series of digital copy images of respective portions of the content at successively greater magnification levels; and stopping, by the at least one processor, a display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display may include causing, by the at least one processor, sequential display of the series of digital copy images of respective portions of the content at successively lesser magnification levels.
The display system may include a portable control device, and the method may further include: detecting, by the portable control device, the user input indicative of the magnification setting command; in response to detecting, by the portable control device, the user input indicative of the magnification command, transmitting a first signal by the portable control device; and receiving the first signal by the at least one processor. The portable control device may include a gesture-based control device and wherein detecting, by the portable control device, the user input indicative of the magnification setting command may include detecting, by the gesture-based control device, a first physical gesture performed by a user of the display system.
The method may further include: receiving, by the at least one processor, a user input indicative of a pointer command; in response to the user input indicative of the pointer command: causing, by the at least one processor, a display of a dynamic cursor over a portion of the content on the electronic display, and wherein: digitally magnifying, by the at least one processor, at least a portion of the digital copy image of the content includes digitally magnifying, by the at least one processor, at least the portion of the content over which the dynamic cursor is displayed. The method may include causing, by the at least one processor, an overlay of a borderless window that is transparent to both content and a majority of events on at least a portion of the display of the content on the electronic display, and causing, by the at least one processor, a display of a dynamic cursor over a portion of the content on the electronic display may include causing, by the at least one processor, a display of the dynamic cursor in the borderless window over top of a portion of the content on the electronic display.
A display system may be summarized as including: an electronic display; at least one processor communicatively coupled to the electronic display; and at least one non-transitory processor-readable storage medium communicatively coupled to the at least one processor, wherein the at least one non-transitory processor-readable storage medium stores at least one of processor-executable instructions and data that, when executed by the at least one processor, cause the display system to: display a content at a first magnification level on the electronic display; and in response to a user input indicative of a magnification setting command: capture a digital copy image of the content; digitally magnify at least a portion of the digital copy image of the content; and display the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display, the second magnification level greater than the first magnification level. When executed by the at least one processor, the at least one of processor-executable instructions and data may cause the display system to: overlay a borderless window that is transparent to both content and a majority of events on at least a portion of the display of the content on the electronic display, and display the digitally magnified at least a portion of the digital copy image of the content at the second magnification level in the borderless window on the electronic display. In response to a user input indicative of a magnification setting command, when executed by the at least one processor, the at least one of processor-executable instructions and data that cause the display system to digitally magnify at least a portion of the digital copy image of the content may cause the display system to produce a series of digital copy images of respective portions of the content at successive (e.g., successively greater or successively lesser) magnification levels, and sequentially display the series of digital copy images of respective portions of the content at successive (e.g., successively greater or successively lesser) magnification levels.
In response to a user input indicative of a magnification setting command, when executed by the at least one processor, the at least one of processor-executable instructions and data may cause the display system to display the digitally magnified at least a portion of the digital copy image of the content to overlay at least a portion of the content on the electronic display. When executed by the at least one processor, the at least one of processor-executable instructions and data may cause the display system to: in response to a user input indicative of a display restoration command: stop displaying the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display.
The display system may further include: a portable control device responsive to at least the user input indicative of the magnification setting command, wherein in response to the user input indicative of the magnification setting command the portable control device transmits at least one signal to the at least one processor. The portable control device may include a gesture-based control device responsive to at least a first physical gesture performed by a user of the display system.
When executed by the at least one processor, the at least one of processor-executable instructions and data may cause the display system to: in response to a user input indicative of a pointer command: display a dynamic cursor over a portion of the content on the electronic display, and in response to a user input indicative of a magnification setting command: digitally magnify at least the portion of the content over which the dynamic cursor is displayed. When executed by the at least one processor, the at least one of processor-executable instructions and data may cause the display system to overlay a borderless window that is transparent to both content and a majority of events on at least a portion of the display of the content on the electronic display, and the at least one of processor-executable instructions and data that cause the display system to display a dynamic cursor over a portion of the content on the electronic display may cause the display system to display the dynamic cursor in the borderless window over top of a portion of the content on the electronic display.
A non-transitory processor-readable storage medium of a digital computer system may be summarized as including: at least one of processor-executable instructions and data that, when executed by at least one processor of the digital computer system, cause the digital computer system to: display a content at a first magnification level on an electronic display; and in response to a user input indicative of a magnification setting command: capture a digital copy image of the content; digitally magnify at least a portion of the digital copy image of the content; and display the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display, the second magnification level greater than the first magnification level.
The non-transitory processor-readable storage medium may further include at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to: overlay a borderless window that is transparent to both content and a majority of events on at least a portion of the display of the content on the electronic display, and wherein the at least one of processor-executable instructions and data that, when executed by the processor of the digital computer system, cause the digital computer system to display the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display cause the digital computer system to display the digitally magnified at least a portion of the digital copy image of the content at the second magnification level in the borderless window on the electronic display.
The at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to, in response to a user input indicative of a magnification setting command, digitally magnify at least a portion of the digital copy image of the content may cause the digital computer system to produce a series of digital copy images of respective portions of the content at successive (e.g., successively greater or successively lesser) magnification levels, and the at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to, in response to a user input indicative of a magnification setting command, display the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display may cause the digital computer system to sequentially display the series of digital copy images of respective portions of the content at successive (e.g., successively greater or successively lesser) magnification levels on the electronic display.
The at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to, in response to a user input indicative of a magnification setting command, display the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display may cause the digitally magnified at least a portion of the digital copy image of the content to overlay at least a portion of the content on the electronic display.
The non-transitory processor-readable storage medium may further include at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to, in response to a user input indicative of a display restoration command: stop displaying the digitally magnified at least a portion of the digital copy image of the content at the second magnification level on the electronic display.
The non-transitory processor-readable storage medium may further include at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to, in response to a user input indicative of a pointer command: display a dynamic cursor over a portion of the content on the electronic display, and wherein the at least one of processor-executable instructions and data that, when executed by the processor of the digital computer system, cause the digital computer system to, in response to a user input indicative of a magnification setting command, digitally magnify at least a portion of the digital copy image of the content cause the digital computer system to digitally magnify at least the portion of the content over which the dynamic cursor is displayed.
The non-transitory processor-readable storage medium may further include at least one of processor-executable instructions and data that, when executed by the at least one processor of the digital computer system, cause the digital computer system to: overlay a borderless window that is transparent to both content and a majority of events on at least a portion of the display of the content on the electronic display, and wherein the at least one of processor-executable instructions and data that, when executed by the processor of the digital computer system, cause the digital computer system to display a dynamic cursor over a portion of the content on the electronic display cause the digital computer system to display the dynamic cursor in the borderless window over top of a portion of the content on the electronic display.
In the drawings, identical reference numbers identify similar elements or acts. The sizes and relative positions of elements in the drawings are not necessarily drawn to scale. For example, the shapes of various elements and angles are not necessarily drawn to scale, and some of these elements are arbitrarily enlarged and positioned to improve drawing legibility. Further, the particular shapes of the elements as drawn are not necessarily intended to convey any information regarding the actual shape of the particular elements, and have been solely selected for ease of recognition in the drawings.
In the following description, certain specific details are set forth in order to provide a thorough understanding of various disclosed embodiments. However, one skilled in the relevant art will recognize that embodiments may be practiced without one or more of these specific details, or with other methods, components, materials, etc. In other instances, well-known structures associated with electronic devices, and in particular portable electronic devices such as wearable electronic devices, have not been shown or described in detail to avoid unnecessarily obscuring descriptions of the embodiments.
Unless the context requires otherwise, throughout the specification and claims which follow, the word “comprise” and variations thereof, such as, “comprises” and “comprising” are to be construed in an open, inclusive sense, that is as “including, but not limited to.”
Reference throughout this specification to “one embodiment” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
As used in this specification and the appended claims, the singular forms “a,” “an,” and “the” include plural referents unless the content clearly dictates otherwise. It should also be noted that the term “or” is generally employed in its broadest sense, that is as meaning “and/or” unless the content clearly dictates otherwise.
The headings and Abstract of the Disclosure provided herein are for convenience only and do not interpret the scope or meaning of the embodiments.
The various embodiments described herein provide systems, methods, and computer program products to enable new user interactions with electronically displayed presentation materials. In particular, the present systems, methods, and computer program products introduce a magnification, or “zoom,” capability to presentation software that does not otherwise include such functionality.
A person of skill in the art will appreciate that portable control device 130 may be any of a wide variety of control devices, including without limitation: a remote control, a presentation clicker, a presentation remote, a wireless remote presenter, or the like. In alternative implementations, portable control device 130 may not be a wireless device and may instead be communicatively coupled through a wired connection to a tethered connector port (e.g., a Universal Serial Bus port) component of digital control system 120. In the illustrated embodiment, device 130 is a wearable gesture-based control device responsive to physical gestures performed by a user of display system 100. An example of a suitable gesture-based control device is the Myo™ armband available from Thalmic Labs Inc., and accordingly, the present systems, methods, and computer program products may incorporate or be adapted to work with the teachings in any or all of: US Patent Publication US 2014-0240103 A1, US Patent Publication US 2015-0057770 A1, US Patent Publication US 2015-0070270 A1, U.S. Non-Provisional patent application Ser. No. 14/658,552 (now US Patent Publication US 2015-0261306 A1), and/or U.S. Non-Provisional patent application Ser. No. 14/679,850 (now US Patent Publication US 2015-0296553 A1), each of which is incorporated herein by reference in its entirety.
The combination of electronic display 110 and digital control system 120 is hereinafter referred to as a “digital computer system.” For clarity, a digital computer system may include far more components than those illustrated.
The various embodiments described herein include, make use of, or generally relate to one or more computer program product(s) directly loadable into a non-transitory processor-readable storage medium of a digital computer system. Such a computer program product stores at least one of processor-executable instructions and data that, when executed by the processor of the digital computer system, cause the digital computer system to perform one or more action(s).
Method 201 includes five acts 211, 212, 213, 214, and 215 (depicted by rectangular boxes).
At 211, the processor of the digital computer system causes a display of content at a first magnification level on an electronic display of the digital computer system. If the digital computer system includes (or is communicatively coupled to) multiple electronic displays of which at least one is a presentation monitor, then at 211 the processor advantageously causes a display of content at a first magnification level on at least one presentation monitor.
Throughout this specification and the appended claims, reference is often made to “content,” as in, for example, “display of content” at act 211. Unless the specific context requires otherwise, the term “content” is used in such instances to encompass any type of visual content, on its own or in any combination with other types of visual content. Examples of visual content include, without limitation: a computer's desktop, one or more software application window(s), content displayed in one or more software application window(s), one or more still image(s), one or more video(s), one or more animation(s), and any combination thereof. Thus, at 211 the processor causes a display of content on the electronic display by causing any of the above-listed examples (including any combination thereof) to display on the electronic display, either at full screen resolution or in windowed form.
At 212, the processor of the digital computer system receives a user input indicative of a magnification setting command. Throughout this specification and the appended claims, receipt, by a processor, of a user input indicative of a magnification setting command encompasses receipt, by the processor, of one or more signal(s) representative of the user input indicative of the magnification setting command (such as, for example, receipt by the processor of one or more signal(s) transmitted by a portable control device in response to a user input indicative of a magnification setting command detected by the portable control device). In implementations in which the digital computer system is itself a part of a larger display system that further comprises a portable control device (e.g., device 130 of display system 100), the portable control device may detect the user input indicative of the magnification setting command and, in response to detecting the user input indicative of the magnification setting command, transmit a first signal.
At 220, a criterion is specified and this criterion must be met before method 201 proceeds to act 213. The criterion is that the processor must receive the user input indicative of a magnification setting command per act 212. Method 201 only proceeds to acts 213, 214, and 215 in response to the processor receiving a user input indicative of a magnification setting command. In some implementations, the processor-executable instructions and/or data may cause the processor to operate as a state machine where, in response to a user input (such as the user input indicative of a magnification setting command), the processor transitions from a first operational mode/state into a second operational mode/state. In this case, the state of the processor prior to criterion 220 being satisfied is a first operational state in which the processor performs act 211 of method 201 and the state of the processor after criterion 220 is satisfied (as triggered by act 212) is a second operational state in which the processor performs acts 213, 214, and 215 of method 201.
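By way of illustration only, the following minimal sketch (in Python, using hypothetical names such as Mode and MagnifierStateMachine that do not appear in the disclosure) shows one way the two operational states and the transition triggered by criterion 220 could be modeled; it is an assumption-laden illustration, not the claimed implementation.

    from enum import Enum, auto

    class Mode(Enum):
        DISPLAYING_CONTENT = auto()    # first operational state (act 211)
        DISPLAYING_MAGNIFIED = auto()  # second operational state (acts 213, 214, and 215)

    class MagnifierStateMachine:
        """Tracks which operational state the processor is currently in."""

        def __init__(self):
            self.mode = Mode.DISPLAYING_CONTENT

        def on_user_input(self, command: str) -> Mode:
            # A magnification setting command satisfies the criterion and moves
            # the processor into the second operational state.
            if self.mode is Mode.DISPLAYING_CONTENT and command == "magnification_setting":
                self.mode = Mode.DISPLAYING_MAGNIFIED
            # A display restoration command (described below) returns the
            # processor to the first operational state.
            elif self.mode is Mode.DISPLAYING_MAGNIFIED and command == "display_restoration":
                self.mode = Mode.DISPLAYING_CONTENT
            return self.mode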
At 213, the processor of the digital computer system captures a digital copy image of the content that the processor caused to be displayed on the electronic display at 211. For example, the processor may capture or cause to be captured a screenshot or frame of the content being displayed on the electronic display. A digital copy image may generally take the form of a still image. If the content at 211 involves dynamic elements such as video or animation, the digital copy image may represent a single instant in time of such dynamic elements. Throughout this specification and the appended claims, to “capture a digital copy image of content” generally means to produce, or reproduce, a second version of content based on a first version of the content. This may be implemented in a variety of different ways, including without limitation: producing a copy of some or all of the data corresponding to the content that is stored in the memory of the system and writing that copy to the memory of the system; and/or performing, by the processor of the system, a second execution of all or a portion of the same processor-executable instructions that caused the processor to create the content in the first place and writing a result of the second execution of the processor-executable instructions to the memory of the system. In the former example, data that corresponds to the content may be stored as one or more file(s) in the memory of the system and capturing a digital copy image of the content may include copying all or portion(s) of the one or more file(s). In the latter example, processor-executable instructions that, when executed by the processor of the system a first time, cause the processor to create the content may be executed a second time to cause the processor to “re-create” all or a portion of the content as a digital copy image of the content. In either case, the content and the digital copy image of the content may respectively correspond to different data stored in one or more memory(ies) of the system.
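As a hedged, concrete illustration of one way act 213 could be performed, the short Python sketch below captures a full-screen screenshot using the Pillow library; Pillow is an assumption of this sketch only, as the disclosure does not name any particular imaging library.

    from PIL import ImageGrab  # Pillow; an assumed dependency for this sketch

    def capture_digital_copy_image():
        """Capture a screenshot of the content currently being displayed (act 213).

        The returned object is a still image; if the displayed content includes
        video or animation, the copy represents a single instant in time.
        """
        return ImageGrab.grab()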
At 214, the processor of the digital computer system digitally magnifies at least a portion of the digital copy image of the content that was captured at 213. Digitally magnifying at least a portion of the digital copy image of the content may include generating, by the processor, a second copy image of the content at a second magnification level, the second magnification level greater than the first magnification level.
At 215, the processor of the digital computer system causes a display of the digitally magnified at least a portion of the digital copy image of the content at a second magnification level on the electronic display of the digital computer system. The second magnification level may be greater than the first magnification level. The digitally magnified at least a portion of the digital copy image of the content may replace, substitute for, or overlay at least a portion of the content displayed on the electronic display at 211. Display of the digitally magnified at least a portion of the digital copy image of the content at the second magnification level may be presented as a single large discrete jump in magnification from the first magnification level to the second magnification level. Alternatively, display of the digitally magnified at least a portion of the digital copy image may be presented as a dynamic zoom (i.e., “zoom-in”) from the first magnification level to the second magnification level and thereby depict sequentially increasing levels of magnification in between the first magnification level and the second magnification level. In this dynamic zoom implementation, the processor may digitally magnify the at least a portion of the digital copy image of the content at 214 by, for example, producing or generating a series of digital copy images of respective portions of the content at successive (i.e., successively greater) magnification levels, and the processor may cause a display of the digitally magnified at least a portion of the digital copy image of the content at 215 by, for example, causing a sequential display of the series of digital copy images of respective portions of the content at successive (i.e., successively greater) magnification levels.
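One possible way to produce the series of digital copy images used for such a dynamic zoom-in is sketched below; the function name, the linear zoom schedule, and the use of Pillow's crop/resize operations are assumptions made for illustration and are not requirements of the method.

    def zoom_in_frames(copy_image, center, final_zoom=2.0, steps=12):
        """Produce a series of digital copy images of respective portions of the
        content at successively greater magnification levels (acts 214/215).

        copy_image is expected to be a PIL.Image.Image, e.g., the screenshot
        returned by the capture sketch above; center is an (x, y) pixel position.
        """
        w, h = copy_image.size
        cx, cy = center
        frames = []
        for i in range(1, steps + 1):
            zoom = 1.0 + (final_zoom - 1.0) * i / steps  # 1.0 ... final_zoom
            crop_w, crop_h = w / zoom, h / zoom
            # Keep the crop box centered on (cx, cy) but inside the image bounds.
            left = min(max(cx - crop_w / 2, 0), w - crop_w)
            top = min(max(cy - crop_h / 2, 0), h - crop_h)
            box = (int(left), int(top), int(left + crop_w), int(top + crop_h))
            # Cropping and scaling back to full size yields one magnified frame.
            frames.append(copy_image.crop(box).resize((w, h)))
        return frames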
Because the display of the digitally magnified at least a portion of the digital copy image is a still image that replaces, substitutes for, or overlays at least a portion of the original content displayed on the electronic display at 211, it may be desirable to remove the digitally magnified at least a portion of the digital copy image from the electronic display after a period of time. For example, if the content that the processor causes to be displayed at 211 is one of many presentation slides, and the user/presenter magnifies or “zooms in on” at least a portion of that slide through acts 213, 214, and 215, then the user/presenter may wish to “unmagnify” or “zoom out of” the at least a portion of that slide as the presentation progresses (e.g., to move on to the next slide). Accordingly, method 201 may go on to include further acts not illustrated, such as: receiving, by the processor, a user input indicative of a display restoration command and, in response to the user input indicative of the display restoration command, stopping, by the processor, the display of the digitally magnified at least a portion of the digital copy image of the content at the second magnification level on the electronic display.
Depending on whether or not the display of the digitally magnified at least a portion of the digital copy image of the content fully overlays (e.g., at full screen) the content, some or all of the content may continue to be displayed on the electronic display while the digitally magnified at least a portion of the digital copy image of the content is being displayed. In applications in which the digitally magnified at least a portion of the digital copy image of the content is displayed at full screen resolution on the electronic display (i.e., the digitally magnified at least a portion of the content completely overlays the content), when the processor causes the display of the digitally magnified at least a portion of the digital copy image of the content at 215 the processor may stop (i.e., cause to stop) the display of the content from 211. In this situation, when the processor responds to a user input indicative of a display restoration command and stops (i.e., causes to stop) the display of the digitally magnified at least a portion of the digital copy image of the content, the processor may also, in response to the user input indicative of the display restoration command, cause a resumption of the display of the content at the first magnification level on the electronic display. In other words, act 211 of method 201 may be repeated.
In a similar way to how some implementations may present the display of the digitally magnified at least a portion of the digital copy image at 215 as a dynamic “zoom-in,” some implementations may display the stopping (i.e., the causing to stop) of the display of the digitally magnified at least a portion of the digital copy image as a dynamic “zoom out.” As previously described, the processor may digitally magnify the at least a portion of the digital copy image of the content at 214 by, for example, producing or generating a series of digital copy images of respective portions of the content at successively greater magnification levels. With access to this series of digital copy images of the content, the processor may stop (i.e., cause to stop) the display of the digitally magnified at least a portion of the digital copy image of the content by, for example, causing a sequential display of the series of digital copy images of respective portions of the content at successively lesser magnification levels.
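Continuing the sketch above, and only as an assumed illustration, the same series of frames can be played forward for the dynamic zoom-in of act 215 and in reverse for the dynamic zoom-out that accompanies the display restoration command; the show callable stands in for whatever drawing routine the host window system provides.

    import time

    def play(frames, show, frame_delay=0.03, reverse=False):
        """Sequentially display a series of digital copy images.

        Forward playback depicts successively greater magnification levels
        (zoom-in); reverse playback depicts successively lesser magnification
        levels (zoom-out).
        """
        sequence = reversed(frames) if reverse else frames
        for frame in sequence:
            show(frame)              # hand the frame to the display routine
            time.sleep(frame_delay)  # pace the animation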
Method 201, being a representation of the effect of a processor executing processor-executable instructions and/or data stored in computer program product 200, may be implemented in a variety of different ways. An exemplary implementation of method 201 that makes use of a transparent application window and a user-controlled pointer is provided below.
Method 300 includes eight acts 301, 302, 303, 304, 305, 306, 307, and 308 (depicted by rectangular boxes).
At 301, the processor 121 causes a display of content at a first magnification level on the electronic display 110. Act 301 of method 300 is substantially similar to act 211 of method 201.
At 302, the processor 121 causes an overlay of a borderless application window on at least a portion of the content on the electronic display 110. In this context, the term “borderless” means that the window does not have a visible edge or border, though the borderless window may still have a perimeter. The borderless window is in the foreground of electronic display 110, but the borderless window is transparent to content. “Transparent to content” means the borderless window does not obstruct or occlude the content displayed on the electronic display 110 at 301. In other words, the content displayed on the electronic display 110 at 301 projects through the borderless window without being affected. The borderless window is also transparent to a majority of events, meaning that a majority of user-effected or invoked events like keystrokes and mouse clicks “pass through” the borderless window to select or interact with the display content underneath, despite the borderless window being in the foreground. In a conventional digital computing environment, when multiple applications are open simultaneously only the window that is in the foreground is responsive to events (i.e., keystrokes, mouse clicks, etc.); conversely, the borderless window overlaid at 302 is transparent to a majority of events and it is the first window behind the borderless window that contains content (e.g., user selectable icons, fillable fields, user selectable images or text) that will respond to most events. A person of skill in the art will be familiar with the various parameters and flags that may be set when an application window is defined, including those which control background properties (colorless and transparent in this case) and responsiveness to events.
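For concreteness, the sketch below shows one way such a borderless, content-transparent, mostly event-transparent window could be created on a Microsoft Windows host using Python's standard tkinter module and the Win32 extended window styles WS_EX_LAYERED and WS_EX_TRANSPARENT; the choice of toolkit, platform, and color key is an assumption of this example and is not prescribed by the present systems, methods, and computer program products.

    import ctypes
    import tkinter as tk

    GWL_EXSTYLE = -20
    WS_EX_LAYERED = 0x00080000
    WS_EX_TRANSPARENT = 0x00000020  # most input events pass through to the window beneath

    def make_overlay_window():
        root = tk.Tk()
        root.overrideredirect(True)        # borderless: no title bar or visible edge
        root.attributes("-topmost", True)  # keep the overlay in the foreground
        root.configure(bg="magenta")
        root.attributes("-transparentcolor", "magenta")  # color-keyed pixels show the content beneath
        root.geometry(f"{root.winfo_screenwidth()}x{root.winfo_screenheight()}+0+0")
        root.update_idletasks()

        # Add the layered/transparent extended styles so that keystrokes and mouse
        # clicks reach the first window behind the overlay rather than the overlay.
        hwnd = ctypes.windll.user32.GetParent(root.winfo_id()) or root.winfo_id()
        style = ctypes.windll.user32.GetWindowLongW(hwnd, GWL_EXSTYLE)
        ctypes.windll.user32.SetWindowLongW(hwnd, GWL_EXSTYLE,
                                            style | WS_EX_LAYERED | WS_EX_TRANSPARENT)
        return root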
At 303, the processor 121 receives a user input indicative of a pointer command. The user input that is indicative of the pointer command may be effected or invoked by the user using, for example, a portable control device 130 communicatively coupled (e.g., via a wireless connection) to the processor 121. If the portable control device 130 is a gesture-based control device, then the user input indicative of the pointer command may correspond to a particular physical gesture performed by the user. The pointer command is one of the few select events to which the borderless window is responsive. Throughout this specification and the appended claims, receipt, by a processor, of a user input indicative of a pointer command encompasses receipt, by the processor, of one or more signal(s) representative of the user input indicative of the pointer command (such as, for example, receipt by the processor of one or more signal(s) transmitted by a portable control device in response to a user input indicative of a pointer command detected by the portable control device).
At 311, a criterion is specified and this criterion must be met before method 300 proceeds to act 304. The criterion is that the processor 121 must receive the user input indicative of the pointer command per act 303. Method 300 only proceeds to acts 304, 305, 306, 307, and 308 in response to, at least, the processor 121 receiving a user input indicative of a pointer command.
At 304, the processor 121 causes a display of a dynamic cursor in the borderless window over top of a portion of the content on the electronic display 110 (i.e., over top of a portion of the content from act 301). The cursor may be opaque or partially transparent. The cursor is “dynamic” because, once its display has been triggered per act 304, the position of the cursor on the electronic display 110 (i.e., the position of the cursor in the borderless window and the portion of the content which the cursor overlies) is controllable and dynamically variable by the user. Once the dynamic cursor is displayed at 304, the user may controllably move the cursor around on the electronic display 110 to effectively point to specific regions of the content displayed thereon.
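Continuing the Windows/tkinter sketch introduced above (again, an illustrative assumption rather than the disclosed implementation), a simple dynamic cursor can be drawn on a canvas inside the borderless window and re-positioned as pointer inputs arrive from the control device.

    import tkinter as tk

    def add_dynamic_cursor(root, radius=12):
        """Draw a circular cursor in the borderless window and return a callable
        that moves it; the content beneath shows through the color-keyed background."""
        canvas = tk.Canvas(root, bg="magenta", highlightthickness=0)
        canvas.pack(fill="both", expand=True)
        cursor = canvas.create_oval(0, 0, 2 * radius, 2 * radius,
                                    fill="red", outline="white", width=2)

        def move_cursor(x, y):
            # Re-position the cursor over the portion of the content the user is pointing at.
            canvas.coords(cursor, x - radius, y - radius, x + radius, y + radius)

        return move_cursor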
At 305, the processor 121 receives a user input indicative of a magnification setting command. Act 305 of method 300 is substantially similar to act 212 of method 201, with the added detail that, at 305 of method 300, the processor 121 receives the user input indicative of the magnification setting command while the cursor from act 304 is overlying a particular portion of the content on the electronic display 110. The magnification setting command is another one of the few select events to which the borderless window is responsive.
At 312, a criterion is specified and this criterion must be met before method 300 proceeds to act 306. The criterion is that the processor 121 must receive the user input indicative of the magnification setting command per act 305. Method 300 only proceeds to acts 306, 307, and 308 in response to both the processor 121 receiving a user input indicative of a pointer command at 303 and the processor 121 receiving a user input indicative of a magnification setting command at 305.
At 306, the processor 121 captures a digital copy image of the content displayed on the electronic display 110 at 301. Act 306 of method 300 is substantially similar to act 213 of method 201. The digital copy image may or may not include the cursor from act 304.
At 307, the processor 121 digitally magnifies at least a region of the digital copy image of the content that includes the portion of the content over which the cursor was displayed when the processor 121 received the user input indicative of the magnification setting command at 305. Act 307 of method 300 is substantially similar to act 214 of method 201, with the added detail that, at 307 of method 300, the processor 121 digitally magnifies specifically a region of the digital copy image that includes the portion of the content displayed on the electronic display 110 over which the cursor is displayed in the borderless window. This feature enables the user to specifically select which region of the digital copy image to digitally magnify. For example, in acts 303 and 304, an input from the user triggers the processor 121 to cause a cursor to display over top of the content on the electronic display. The cursor is dynamic, and further input(s) from the user may trigger the processor 121 to cause the position of the cursor to change. When the position of the cursor aligns with (e.g., overlies, or is proximate to) a region of the content displayed on the electronic display 110 that the user wishes to magnify (i.e., to display at a greater magnification), the user triggers the processor 121 to execute acts 306, 307, and 308.
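A minimal sketch of act 307 follows, assuming the Pillow-based digital copy image from the earlier capture example and a hypothetical fixed region size around the cursor position; neither assumption is mandated by method 300.

    def magnify_region_at_cursor(copy_image, cursor_xy, zoom=2.0, region_size=(400, 300)):
        """Digitally magnify the region of the digital copy image that includes the
        portion of the content over which the dynamic cursor is displayed.

        copy_image is expected to be a PIL.Image.Image whose pixel coordinates
        correspond to the screen coordinates of the cursor.
        """
        x, y = cursor_xy
        rw, rh = region_size
        w, h = copy_image.size
        # Clamp the region so that it stays within the captured copy image.
        left = min(max(x - rw // 2, 0), max(w - rw, 0))
        top = min(max(y - rh // 2, 0), max(h - rh, 0))
        region = copy_image.crop((left, top, left + rw, top + rh))
        # Scale the selected region up to the second magnification level.
        return region.resize((int(rw * zoom), int(rh * zoom)))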
At 308, the processor 121 causes a display of the digitally magnified region of the digital copy image of the content at a second magnification level in the borderless window on the electronic display 110. Act 308 of method 300 is substantially similar to act 215 of method 201, with the added detail that, at 308 of method 300, the digitally magnified region of the digital copy image specifically includes a portion of the content from act 301 that is proximate, overlaid by, or generally in the region/vicinity of the cursor displayed (and, optionally, controllably displaced on the display by the user) at 304.
As previously described, the present systems, methods, and computer program products may be used in conjunction with a wide variety of electronic display technologies and with virtually any form of content displayed thereon. However, the present systems, methods, and computer program products are particularly well-suited for use in applications when the content being displayed includes presentation materials (e.g., one or more presentation slides) displayed to an audience on one or more presentation monitor(s). In particular, the various embodiments described herein may be used in conjunction with conventional presentation software (e.g., PowerPoint® et al.) to add either or both of the pointer functionality and/or the magnification/zoom functionality to a presentation involving such software.
With dynamic cursor 450 displayed in the borderless window over top of the content on electronic display 410, user 401 is able to control the position of dynamic cursor 450 (much like controlling the position of a typical mouse-controlled cursor) via gesture-based control device 430.
At this stage in exemplary presentation 400, user 401 performs a physical gesture (e.g., a fist or a finger spread gesture) that gesture-based control device 430 interprets as a user input indicative of a magnification setting command. In this example, the user input indicative of a magnification setting command involves a rotation of the user's arm. In response to the arm rotation (e.g., during a fist gesture), gesture-based control device 430 wirelessly transmits a signal that is received (per act 305 of method 300) by the processor of the display system and, in response to the signal (per criterion 312 of method 300), the display system executes acts 306, 307, and 308 of method 300. Exemplary presentation 400 illustrates an application of the present systems, methods, and computer program products in which the transition from the first magnification level to the second magnification level is displayed as a multi-stage dynamic zoom. In other words, in presentation 400 user 401 triggers, via gesture-based control device 430, a dynamic zoom-in on the portion/region of the content displayed on electronic display 410 over which cursor 450 is positioned.
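The mapping below is purely illustrative: the gesture names and the dispatch table are hypothetical (they are not the Myo™ API) and merely indicate how signals received from a gesture-based control device could be translated into the commands of method 300, re-using the state-machine sketch given earlier.

    # Hypothetical gesture-to-command mapping (illustration only).
    GESTURE_TO_COMMAND = {
        "double_tap": "pointer_command",                     # acts 303/304: show the dynamic cursor
        "fist_with_arm_rotation": "magnification_setting",   # act 305: trigger the zoom-in
        "wave_out": "display_restoration",                   # restore the un-magnified display / zoom out
    }

    def handle_gesture(gesture_name, state_machine, show_cursor):
        """Translate a detected gesture into one of the commands of method 300."""
        command = GESTURE_TO_COMMAND.get(gesture_name)
        if command == "pointer_command":
            show_cursor()                    # cursor display is independent of the zoom state
            return state_machine.mode
        if command is not None:
            return state_machine.on_user_input(command)
        return state_machine.mode            # unrecognized gestures are ignored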
Throughout this specification, reference is often made to “conventional presentation software” and “enhancing” or otherwise adapting conventional presentation software. The use of a borderless application window that is transparent to both content and a majority of events is an aspect of the present systems, methods, and computer program products that is particularly well-suited to enable the present systems, methods, and computer program products to be used in conjunction with, and enhance, conventional presentation software. The borderless application window is further compatible for use in conjunction with virtually any displayed content; thus, a user may trigger a pointer and zoom in on the pointer position even when no conventional presentation software is running (e.g., the user may display a pointer on and then zoom in on their desktop, or virtually any application window displayed on an electronic display, such as for example a map application, a database or spreadsheet application, or a web browser). However, one or more variation(s) of the present systems, methods, and computer program products may also be implemented in dedicated presentation software that may or may not incorporate the use of a borderless application window as described herein.
Throughout this specification and the appended claims the term “communicative” as in “communicative pathway,” “communicative coupling,” and in variants such as “communicatively coupled,” is generally used to refer to any engineered arrangement for transferring and/or exchanging information. Exemplary communicative pathways include, but are not limited to, electrically conductive pathways (e.g., electrically conductive wires, electrically conductive traces), magnetic pathways (e.g., magnetic media), one or more communicative link(s) through one or more wireless communication protocol(s), and/or optical pathways (e.g., optical fiber), and exemplary communicative couplings include, but are not limited to, electrical couplings, magnetic couplings, wireless couplings, and/or optical couplings.
Throughout this specification and the appended claims, infinitive verb forms are often used. Examples include, without limitation: “to detect,” “to provide,” “to transmit,” “to communicate,” “to process,” “to route,” and the like. Unless the specific context requires otherwise, such infinitive verb forms are used in an open, inclusive sense, that is as “to, at least, detect,” “to, at least, provide,” “to, at least, transmit,” and so on.
The above description of illustrated embodiments, including what is described in the Abstract, is not intended to be exhaustive or to limit the embodiments to the precise forms disclosed. Although specific embodiments of and examples are described herein for illustrative purposes, various equivalent modifications can be made without departing from the spirit and scope of the disclosure, as will be recognized by those skilled in the relevant art. The teachings provided herein of the various embodiments can be applied to other portable and/or wearable electronic devices, not necessarily the exemplary wearable electronic devices generally described above.
For instance, the foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, schematics, and examples. Insofar as such block diagrams, schematics, and examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, the present subject matter may be implemented via Application Specific Integrated Circuits (ASICs). However, those skilled in the art will recognize that the embodiments disclosed herein, in whole or in part, can be equivalently implemented in standard integrated circuits, as one or more computer programs executed by one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs executed by one or more controllers (e.g., microcontrollers), as one or more programs executed by one or more processors (e.g., microprocessors, central processing units, graphical processing units), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and/or firmware would be well within the skill of one of ordinary skill in the art in light of the teachings of this disclosure.
When logic is implemented as software and stored in memory, logic or information can be stored on any processor-readable medium for use by or in connection with any processor-related system or method. In the context of this disclosure, a memory is a processor-readable medium that is an electronic, magnetic, optical, or other physical device or means that contains or stores a computer and/or processor program. Logic and/or the information can be embodied in any processor-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions associated with logic and/or information.
In the context of this specification, a “non-transitory processor-readable medium” can be any element that can store the program associated with logic and/or information for use by or in connection with the instruction execution system, apparatus, and/or device. The processor-readable medium can be, for example, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus or device. More specific examples (a non-exhaustive list) of the computer readable medium would include the following: a portable computer diskette (magnetic, compact flash card, secure digital, or the like), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM, EEPROM, or Flash memory), a portable compact disc read-only memory (CDROM), digital tape, and other non-transitory media.
The various embodiments described above can be combined to provide further embodiments. To the extent that they are not inconsistent with the specific teachings and definitions herein, all of the U.S. patents, U.S. patent application publications, U.S. patent applications, foreign patents, foreign patent applications and non-patent publications referred to in this specification and/or listed in the Application Data Sheet, including but not limited to: U.S. Provisional Patent Application Ser. No. 62/152,151, US Patent Publication US 2014-0240103 A1, US Patent Publication US 2015-0057770 A1, US Patent Publication US 2015-0070270 A1, U.S. Non-Provisional patent application Ser. No. 14/658,552 (now US Patent Publication US 2015-0261306 A1), and/or U.S. Non-Provisional patent application Ser. No. 14/679,850 (now US Patent Publication US 2015-0296553 A1), are incorporated herein by reference, in their entirety. Aspects of the embodiments can be modified, if necessary, to employ systems, circuits and concepts of the various patents, applications and publications to provide yet further embodiments.
These and other changes can be made to the embodiments in light of the above-detailed description. In general, in the following claims, the terms used should not be construed to limit the claims to the specific embodiments disclosed in the specification and the claims, but should be construed to include all possible embodiments along with the full scope of equivalents to which such claims are entitled. Accordingly, the claims are not limited by the disclosure.
20120101357 | Hoskuldsson et al. | Apr 2012 | A1 |
20120157789 | Kangas et al. | Jun 2012 | A1 |
20120165695 | Kidmose et al. | Jun 2012 | A1 |
20120182309 | Griffin et al. | Jul 2012 | A1 |
20120188158 | Tan et al. | Jul 2012 | A1 |
20120203076 | Fatta et al. | Aug 2012 | A1 |
20120209134 | Morita et al. | Aug 2012 | A1 |
20120226130 | De Graff et al. | Sep 2012 | A1 |
20120265090 | Fink et al. | Oct 2012 | A1 |
20120293548 | Perez et al. | Nov 2012 | A1 |
20120302858 | Kidmose et al. | Nov 2012 | A1 |
20120323521 | De Foras et al. | Dec 2012 | A1 |
20130005303 | Song et al. | Jan 2013 | A1 |
20130020948 | Han et al. | Jan 2013 | A1 |
20130027341 | Mastandrea | Jan 2013 | A1 |
20130080794 | Hsieh | Mar 2013 | A1 |
20130127708 | Jung et al. | May 2013 | A1 |
20130165813 | Chang et al. | Jun 2013 | A1 |
20130191741 | Dickinson et al. | Jul 2013 | A1 |
20130198694 | Rahman et al. | Aug 2013 | A1 |
20130265229 | Forutanpour et al. | Oct 2013 | A1 |
20130265437 | Thürn et al. | Oct 2013 | A1 |
20130271292 | McDermott | Oct 2013 | A1 |
20130312256 | Wesselmann et al. | Nov 2013 | A1 |
20130317648 | Assad | Nov 2013 | A1 |
20130332196 | Pinsker | Dec 2013 | A1 |
20140020945 | Hurwitz et al. | Jan 2014 | A1 |
20140028539 | Newham | Jan 2014 | A1 |
20140028546 | Jeon et al. | Jan 2014 | A1 |
20140045547 | Singamsetty et al. | Feb 2014 | A1 |
20140049417 | Abdurrahman et al. | Feb 2014 | A1 |
20140094675 | Luna et al. | Apr 2014 | A1 |
20140121471 | Walker | May 2014 | A1 |
20140122958 | Greenberg et al. | May 2014 | A1 |
20140157168 | Albouyeh | Jun 2014 | A1 |
20140194062 | Palin et al. | Jul 2014 | A1 |
20140198034 | Bailey et al. | Jul 2014 | A1 |
20140198035 | Bailey et al. | Jul 2014 | A1 |
20140236031 | Banet et al. | Aug 2014 | A1 |
20140240103 | Lake et al. | Aug 2014 | A1 |
20140249397 | Lake et al. | Sep 2014 | A1 |
20140257141 | Giuffrida et al. | Sep 2014 | A1 |
20140285326 | Luna et al. | Sep 2014 | A1 |
20140299362 | Park et al. | Oct 2014 | A1 |
20140334083 | Bailey | Nov 2014 | A1 |
20140334653 | Luna et al. | Nov 2014 | A1 |
20140337861 | Chang et al. | Nov 2014 | A1 |
20140340857 | Hsu et al. | Nov 2014 | A1 |
20140349257 | Connor | Nov 2014 | A1 |
20140375465 | Fenuccio et al. | Nov 2014 | A1 |
20140354528 | Laughlin et al. | Dec 2014 | A1 |
20140354529 | Laughlin et al. | Dec 2014 | A1 |
20140364703 | Kim et al. | Dec 2014 | A1 |
20150011857 | Henson et al. | Jan 2015 | A1 |
20150025355 | Bailey et al. | Jan 2015 | A1 |
20150051470 | Bailey et al. | Feb 2015 | A1 |
20150057506 | Luna et al. | Feb 2015 | A1 |
20150057770 | Bailey et al. | Feb 2015 | A1 |
20150065840 | Bailey | Mar 2015 | A1 |
20150084860 | Aleem et al. | Mar 2015 | A1 |
20150106052 | Balakrishnan et al. | Apr 2015 | A1 |
20150109202 | Ataee et al. | Apr 2015 | A1 |
20150124566 | Lake et al. | May 2015 | A1 |
20150141784 | Morun et al. | May 2015 | A1 |
20150148641 | Morun et al. | May 2015 | A1 |
20150160621 | Yilmaz | Jun 2015 | A1 |
20150182113 | Utter, II | Jul 2015 | A1 |
20150182130 | Utter, II | Jul 2015 | A1 |
20150182163 | Utter | Jul 2015 | A1 |
20150182164 | Utter, II | Jul 2015 | A1 |
20150185838 | Camacho-Perez et al. | Jul 2015 | A1 |
20150186609 | Utter, II | Jul 2015 | A1 |
20150216475 | Luna et al. | Aug 2015 | A1 |
20150230756 | Luna et al. | Aug 2015 | A1 |
20150234426 | Bailey et al. | Aug 2015 | A1 |
20150237716 | Su et al. | Aug 2015 | A1 |
20150261306 | Lake | Sep 2015 | A1 |
20150277575 | Ataee et al. | Oct 2015 | A1 |
20150296553 | DiFranco et al. | Oct 2015 | A1 |
20150325202 | Lake et al. | Nov 2015 | A1 |
20150370333 | Ataee et al. | Dec 2015 | A1 |
20160020500 | Matsuda | Jan 2016 | A1 |
20160150636 | Otsubo | May 2016 | A1 |
20160156762 | Bailey et al. | Jun 2016 | A1 |
20160199699 | Klassen | Jul 2016 | A1 |
20160202081 | Debieuvre et al. | Jul 2016 | A1 |
20160274758 | Bailey | Sep 2016 | A1 |
20160309249 | Wu et al. | Oct 2016 | A1 |
Foreign Patent Documents
Number | Date | Country
---|---|---
102246125 | Nov 2011 | CN |
44 12 278 | Jan 1995 | DE |
0 301 790 | Feb 1989 | EP |
2009-50679 | Mar 2009 | JP |
10-2012-0094870 | Aug 2012 | KR |
10-2012-0097997 | Sep 2012 | KR |
2011070554 | Jun 2011 | WO |
Other Publications
Communication pursuant to Rule 164(1) EPC, dated Sep. 30, 2016, for corresponding EP Application No. 14753949.8, 7 pages.
Costanza et al., "EMG as a Subtle Input Interface for Mobile Computing," Mobile HCI 2004, LNCS 3160, edited by S. Brewster and M. Dunlop, Springer-Verlag Berlin Heidelberg, pp. 426-430, 2004.
Costanza et al., "Toward Subtle Intimate Interfaces for Mobile Devices Using an EMG Controller," CHI 2005, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 481-489, 2005.
Ghasemzadeh et al., "A Body Sensor Network With Electromyogram and Inertial Sensors: Multimodal Interpretation of Muscular Activities," IEEE Transactions on Information Technology in Biomedicine, vol. 14, No. 2, pp. 198-206, Mar. 2010.
Gourmelon et al., "Contactless sensors for Surface Electromyography," Proceedings of the 28th IEEE EMBS Annual International Conference, New York City, NY, Aug. 30-Sep. 3, 2006, pp. 2514-2517.
International Search Report and Written Opinion, dated May 16, 2014, for corresponding International Application No. PCT/US2014/017799, 9 pages.
International Search Report and Written Opinion, dated Aug. 21, 2014, for corresponding International Application No. PCT/US2014/037863, 10 pages.
International Search Report and Written Opinion, dated Nov. 21, 2014, for corresponding International Application No. PCT/US2014/052143, 9 pages.
International Search Report and Written Opinion, dated Feb. 27, 2015, for corresponding International Application No. PCT/US2014/067443, 10 pages.
International Search Report and Written Opinion, dated May 27, 2015, for corresponding International Application No. PCT/US2015/015675, 9 pages.
Morris et al., "Emerging Input Technologies for Always-Available Mobile Interaction," Foundations and Trends in Human-Computer Interaction 4(4):245-316, 2010. (74 total pages).
Naik et al., "Real-Time Hand Gesture Identification for Human Computer Interaction Based on ICA of Surface Electromyogram," IADIS International Conference Interfaces and Human Computer Interaction 2007, 8 pages.
Picard et al., "Affective Wearables," Proceedings of the IEEE 1st International Symposium on Wearable Computers, ISWC, Cambridge, MA, USA, Oct. 13-14, 1997, pp. 90-97.
Rekimoto, "GestureWrist and GesturePad: Unobtrusive Wearable Interaction Devices," ISWC '01, Proceedings of the 5th IEEE International Symposium on Wearable Computers, 2001, 7 pages.
Saponas et al., "Making Muscle-Computer Interfaces More Practical," CHI 2010, Atlanta, Georgia, USA, Apr. 10-15, 2010, 4 pages.
Sato et al., "Touché: Enhancing Touch Interaction on Humans, Screens, Liquids, and Everyday Objects," CHI '12, May 5-10, 2012, Austin, Texas.
Ueno et al., "A Capacitive Sensor System for Measuring Laplacian Electromyogram through Cloth: A Pilot Study," Proceedings of the 29th Annual International Conference of the IEEE EMBS, Cité Internationale, Lyon, France, Aug. 23-26, 2007.
Ueno et al., "Feasibility of Capacitive Sensing of Surface Electromyographic Potential through Cloth," Sensors and Materials 24(6):335-346, 2012.
Xiong et al., "A Novel HCI based on EMG and IMU," Proceedings of the 2011 IEEE International Conference on Robotics and Biomimetics, Phuket, Thailand, Dec. 7-11, 2011, 5 pages.
Xu et al., "Hand Gesture Recognition and Virtual Game Control Based on 3D Accelerometer and EMG Sensors," Proceedings of the 14th International Conference on Intelligent User Interfaces, Sanibel Island, Florida, Feb. 8-11, 2009, pp. 401-406.
Zhang et al., "A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors," IEEE Transactions on Systems, Man, and Cybernetics—Part A: Systems and Humans, vol. 41, No. 6, pp. 1064-1076, Nov. 2011.
Brownlee, "Finite State Machines (FSM): Finite state machines as a control technique in Artificial Intelligence (AI)," Jun. 2002, 12 pages.
Janssen, "Radio Frequency (RF)," 2013, retrieved from https://web.archive.org/web/20130726153946/https://www.techopedia.com/definition/5083/radio-frequency-rf, retrieved on Jul. 12, 2017, 2 pages.
Merriam-Webster, "Radio Frequencies," retrieved from https://www.merriam-webster.com/table/collegiate/radiofre.htm, retrieved on Jul. 12, 2017, 2 pages.
Related Publications
Number | Date | Country
---|---|---
20160313899 A1 | Oct 2016 | US |
Provisional Applications
Number | Date | Country
---|---|---
62152151 | Apr 2015 | US |