A portion of the disclosure of this patent document contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever.
Mobile computing devices, such as smartphones and tablets, are increasingly being utilized in lieu of standalone cameras for capturing photographs of whiteboards, blackboards (i.e., writing surfaces having a colored background) and documents in association with various productivity scenarios in the workplace (e.g., meetings comprising slide presentations, brainstorming sessions and the like). The captured photographic images may then be utilized in one or more productivity applications for generating electronic documents. The aforementioned capturing of photographic images, however, suffers from a number of drawbacks. For example, many photographs must be taken at an angle (which may be due to the physical dimension limitations of the room in which a user is located) as well as in less than ideal lighting conditions (e.g., due to glare from incident lights in a meeting room). As a result, captured photographic images often contain unwanted perspective skews as well as unwanted regions (e.g., walls outside a whiteboard frame or table surfaces outside a document page boundary) which must be rectified prior to utilizing the images in other applications (e.g., productivity application software). It is with respect to these considerations and others that the various embodiments of the present invention have been made.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended as an aid in determining the scope of the claimed subject matter.
Embodiments provide a user experience for processing and cropping images. A menu of image processing modes may be displayed by a computing device. A selection of one of the image processing modes from the menu may then be received. An image may then be received by the computing device. The computing device may then process the received image based on the selected image processing mode. The computing device may then display user controls overlaying the processed image. A selection of one or more of the user controls may then be received to re-frame one or more boundaries of the processed image. The computing device may then send the processed and re-framed image to a productivity application.
These and other features and advantages will be apparent from a reading of the following detailed description and a review of the associated drawings. It is to be understood that both the foregoing general description and the following detailed description are illustrative only and are not restrictive of the invention as claimed.
Embodiments provide a user experience for processing and cropping images. A menu of image processing modes may be displayed by a computing device. A selection of one of the image processing modes from the menu may then be received. An image may then be received by the computing device. The computing device may then process the received image based on the selected image processing mode. The computing device may then display user controls overlaying the processed image. A selection of one or more of the user controls may then be received to re-frame one or more boundaries of the processed image. The computing device may then send the processed and re-framed image to a productivity application.
In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown by way of illustrations specific embodiments or examples. These embodiments may be combined, other embodiments may be utilized, and structural changes may be made without departing from the spirit or scope of the present invention. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present invention is defined by the appended claims and their equivalents.
Referring now to the drawings, in which like numerals represent like elements throughout the several figures, various aspects of the present invention will be described.
The user interface may include the user controls 15, 17 and 19 (discussed above with respect to
In accordance with an embodiment, the document 20 may comprise a physical document (e.g., paper) containing information discussed during a meeting or presentation in an office, meeting room, school classroom or other work environment. The whiteboard 24 may comprise a physical markerboard, dry-erase board, dry-wipe board or pen-board utilized for recording notes, sketches, etc. during a meeting or presentation in an office, meeting room, school classroom or other work environment.
As will be described in greater detail below, the productivity application 30, in accordance with an embodiment, may comprise a free-form information gathering and multi-user collaboration application program configured for capturing notes (handwritten or typed) and drawings from the document 20 and/or the whiteboard 24 as images, and which is further configured for processing the images so that they may be utilized by the productivity application 30 and/or the other applications 40. In accordance with an embodiment, the productivity application 30 may comprise the ONENOTE note-taking software from MICROSOFT CORPORATION of Redmond, Wash. It should be understood, however, that other productivity applications (including those from other manufacturers) may alternatively be utilized in accordance with the various embodiments described herein. It should be understood that the other applications 40 may include additional productivity application software which may receive the processed images from the productivity application 30. For example, the other applications 40 may include, without limitation, word processing software, presentation graphics software, spreadsheet software, diagramming software, project management software, publishing software and personal information management software. It should be appreciated that the aforementioned software applications may comprise individual application programs or, alternatively, may be incorporated into a suite of applications such as the OFFICE application program suite from MICROSOFT CORPORATION of Redmond, Wash.
The routine 900 begins at operation 905, where the productivity application 30, executing on the computing device 10, may display an image processing mode menu to a user. For example, the image processing mode menu may include options for selecting a whiteboard processing mode (i.e., for whiteboard images) and a document processing mode (i.e., for document images).
From operation 905, the routine 900 continues to operation 910, where the productivity application 30, executing on the computing device 10, may receive a selection of an image processing mode from the menu. For example, the menu may comprise graphical user interface buttons from which a user may select either a whiteboard processing mode or a document processing mode by making either a tap gesture or a swipe gesture. It should be understood that, in one embodiment, if a user selects the whiteboard processing mode for a non-whiteboard image (e.g., a blackboard object), the productivity application 30 may be configured to automatically classify the image as a blackboard object and utilize the document image processing mode thereon.
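By way of illustration and not limitation, the mode selection and automatic blackboard fallback described above may be sketched in Python as follows. The ProcessingMode enumeration, the classifier object and its is_blackboard method are hypothetical names introduced solely for this sketch and do not correspond to any particular implementation of the embodiments.

    from enum import Enum


    class ProcessingMode(Enum):
        WHITEBOARD = "whiteboard"
        DOCUMENT = "document"


    def resolve_processing_mode(requested_mode, image, classifier):
        """Return the mode actually applied to the received image.

        If the whiteboard mode is requested but the image is classified as a
        blackboard object, fall back to the document processing mode, as
        described above.
        """
        if requested_mode is ProcessingMode.WHITEBOARD and classifier.is_blackboard(image):
            return ProcessingMode.DOCUMENT
        return requested_mode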
From operation 910, the routine 900 continues to operation 915, where the productivity application 30, executing on the computing device 10, may receive an image to be processed. For example, the productivity application 30 may receive an image captured by a user via an image capture device (e.g., a camera) or retrieved from an image library.
From operation 915, the routine 900 continues to operation 920, where the productivity application 30, executing on the computing device 10, may receive another selection of an image processing mode from the menu displayed at operation 905. In particular, it should be understood that in some embodiments, a user may select image processing modes both before and after capturing images.
From operation 920, the routine 900 continues to operation 925, where the productivity application 30, executing on the computing device 10, may process the image received at operation 915. In particular, the productivity application 30 may be configured to execute one or more image processing and cropping algorithms to enhance the quality of whiteboard and document images (e.g., providing color balance and removing background noise, stains and glare that may be present in a raw image) and attempt to correct any skew that may be present.
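One conventional, non-limiting way to perform the kind of enhancement and skew correction described at operation 925 is sketched below using the OpenCV library. This sketch is not the algorithm employed by the productivity application 30; it merely illustrates normalizing illumination (to reduce glare and stains) and warping a quadrangle, whose four corners are assumed to be known, into a rectangle. The kernel size and output dimensions are arbitrary choices for the sketch.

    import cv2
    import numpy as np


    def enhance(image_bgr):
        """Even out illumination so marker strokes or text stand out."""
        gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
        background = cv2.GaussianBlur(gray, (51, 51), 0)       # estimate slow-varying lighting
        normalized = cv2.divide(gray, background, scale=255)    # suppress glare and stain gradients
        return normalized


    def deskew(image, corners, out_w=1280, out_h=960):
        """Warp the quadrangle defined by four corner points into a rectangle."""
        # corners: top-left, top-right, bottom-right, bottom-left
        src = np.array(corners, dtype=np.float32)
        dst = np.array([[0, 0], [out_w, 0], [out_w, out_h], [0, out_h]], dtype=np.float32)
        matrix = cv2.getPerspectiveTransform(src, dst)
        return cv2.warpPerspective(image, matrix, (out_w, out_h))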
From operation 925, the routine 900 continues to operation 930, where the productivity application 30, executing on the computing device 10, may receive another image while processing the previous image received at operation 915. In particular, it should be understood that in some embodiments, the productivity application 30 may be configured to allow a user to receive and process multiple images simultaneously (i.e., the productivity application 30 may receive a new image while a previously received image is being processed).
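The concurrent behavior described at operation 930 may be modeled, for purposes of illustration only, with a background worker pool as in the following Python sketch. The class name, pool size and the processing function passed in (for example, the enhance function sketched above) are hypothetical.

    from concurrent.futures import ThreadPoolExecutor


    class ImageProcessingQueue:
        """Accept new images while previously received images are still processing."""

        def __init__(self, process_fn, max_workers=2):
            self._process_fn = process_fn                       # e.g. the enhance() sketch above
            self._pool = ThreadPoolExecutor(max_workers=max_workers)
            self._pending = []

        def submit(self, image):
            # Returns immediately; processing of the image continues in the background.
            future = self._pool.submit(self._process_fn, image)
            self._pending.append(future)
            return future

        def results(self):
            # Collect the processed images once all background work has completed.
            return [future.result() for future in self._pending]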
From operation 930, the routine 900 continues to operation 935, where the productivity application 30, executing on the computing device 10, may display user controls which may be selected by a user to re-frame a received image (e.g., the image which was processed at operation 925). For example, as discussed above with respect to
From operation 935, the routine 900 continues to operation 940, where the productivity application 30, executing on the computing device 10, may receive a selection of the user controls displayed at operation 935 to re-frame a processed image. In particular, a user may re-frame one or more boundaries of the processed image by selecting border and/or edge controls to crop the sides of a quadrangle framing the processed image. In one embodiment, tapping and dragging an edge of the quadrangle may move two sides of the quadrangle simultaneously. For example, tapping and dragging a right bottom edge of the quadrangle moves the right and bottom sides, while tapping and dragging a right top edge of the quadrangle moves the right and top sides. In one embodiment, tapping and dragging the aforementioned user controls may change a color at a point of impact. For example, tapping and dragging a right bottom edge of the quadrangle moves the right and bottom sides, with the right bottom edge and the right and bottom sides of the quadrangle changing color. In one embodiment, tapping and dragging a side of the quadrangle proportionally moves two adjacent sides. For example, tapping and dragging a bottom side of the quadrangle proportionally moves the left and right sides. In one embodiment, the productivity application 30 may be configured to allow a user to tap and drag the side of the quadrangle in order to adjust the quadrangle when an edge of the quadrangle extends beyond an image boundary.
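The re-framing behavior described at operation 940 may be illustrated with the following simplified Python model of a quadrangle having draggable corner and side handles. The data representation and method names are hypothetical, and the clamping of control points stands in for the boundary adjustment described above.

    def clamp(point, width, height):
        """Keep a control point within the image boundary."""
        x, y = point
        return (min(max(x, 0), width), min(max(y, 0), height))


    class Quadrangle:
        """Four corner handles: 'tl', 'tr', 'br', 'bl' (clockwise from top-left)."""

        SIDES = {"top": ("tl", "tr"), "bottom": ("bl", "br"),
                 "left": ("tl", "bl"), "right": ("tr", "br")}

        def __init__(self, corners, image_size):
            self.corners = dict(corners)
            self.width, self.height = image_size

        def drag_corner(self, name, new_point):
            # Dragging a corner handle moves the two sides that meet at it, e.g.
            # dragging 'br' moves the right and bottom sides simultaneously.
            self.corners[name] = clamp(new_point, self.width, self.height)

        def drag_side(self, side, dx, dy):
            # Dragging a side moves both of its endpoint corners, which in turn
            # proportionally adjusts the two adjacent sides.
            for name in self.SIDES[side]:
                x, y = self.corners[name]
                self.corners[name] = clamp((x + dx, y + dy), self.width, self.height)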
From operation 940, the routine 900 continues to operation 945, where the productivity application 30, executing on the computing device 10, may send a re-framed processed image to the productivity application 30 or the other applications 40 for sharing and archiving purposes. From operation 945, the routine 900 then ends.
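Taken together, operations 905 through 945 may be summarized by the following non-limiting Python sketch of the overall flow. The ui, camera, processor and productivity_app objects and their methods are hypothetical placeholders for the corresponding functionality described above.

    def run_capture_flow(ui, camera, processor, productivity_app):
        """Illustrative end-to-end flow: select mode, capture, process, re-frame, send."""
        # Operations 905/910: display the mode menu and receive a selection.
        mode = ui.show_mode_menu()
        # Operation 915: receive an image captured by the camera or chosen from a library.
        image = camera.capture()
        # Operation 920: optionally receive another mode selection after capture.
        mode = ui.show_mode_menu(default=mode)
        # Operation 925: process the received image based on the selected mode.
        processed = processor.process(image, mode)
        # Operations 935/940: overlay user controls and receive a re-framed quadrangle.
        quadrangle = ui.show_reframe_controls(processed)
        reframed = processor.crop(processed, quadrangle)
        # Operation 945: send the processed, re-framed image to a productivity application.
        productivity_app.receive(reframed)
        return reframed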
The computing device 1000 may have additional features or functionality. For example, the computing device 1000 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, solid state storage devices (“SSD”), flash memory or tape. Such additional storage is illustrated in
Furthermore, various embodiments may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, various embodiments may be practiced via a system-on-a-chip (“SOC”) where each or many of the components illustrated in
The term computer readable media as used herein may include computer storage media. Computer storage media may include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 1004, the removable storage device 1009, and the non-removable storage device 1010 are all computer storage media examples (i.e., memory storage). Computer storage media may include RAM, ROM, electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 1000. Any such computer storage media may be part of the computing device 1000. Computer storage media does not include a carrier wave or other propagated or modulated data signal.
Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
Mobile computing device 1150 incorporates output elements, such as display 1125, which can display a graphical user interface (GUI). Other output elements include speaker 1130 and LED 1180. Additionally, mobile computing device 1150 may incorporate a vibration module (not shown), which causes mobile computing device 1150 to vibrate to notify the user of an event. In yet another embodiment, mobile computing device 1150 may incorporate a headphone jack (not shown) as another means of providing output signals.
Although described herein in combination with mobile computing device 1150, alternative embodiments may be utilized in combination with any number of computer systems, such as in desktop environments, laptop or notebook computer systems, multiprocessor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers and the like. Various embodiments may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network; programs may be located in both local and remote memory storage devices. To summarize, any computer system having a plurality of environment sensors, a plurality of output elements to provide notifications to a user and a plurality of notification event types may incorporate the various embodiments described herein.
Application 1167 may be loaded into memory 1162 and run on or in association with an operating system 1164. The system 1102 also includes non-volatile storage 1168 within the memory 1162. Non-volatile storage 1168 may be used to store persistent information that should not be lost if system 1102 is powered down. The application 1167 may use and store information in the non-volatile storage 1168. The application 1167, for example, may comprise functionality for performing routines including, for example, processing and cropping images as described above with respect to the operations in routine 900 of
A synchronization application (not shown) also resides on system 1102 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage 1168 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may also be loaded into the memory 1162 and run on the mobile computing device 1150.
The system 1102 has a power supply 1170, which may be implemented as one or more batteries. The power supply 1170 might further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries.
The system 1102 may also include a radio 1172 (i.e., radio interface layer) that performs the function of transmitting and receiving radio frequency communications. The radio 1172 facilitates wireless connectivity between the system 1102 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio 1172 are conducted under control of OS 1164. In other words, communications received by the radio 1172 may be disseminated to the application 1167 via OS 1164, and vice versa.
The radio 1172 allows the system 1102 to communicate with other computing devices, such as over a network. The radio 1172 is one example of communication media. The embodiment of the system 1102 is shown with two types of notification output devices: the LED 1180 that can be used to provide visual notifications and an audio interface 1174 that can be used with speaker 1130 to provide audio notifications. These devices may be directly coupled to the power supply 1170 so that when activated, they remain on for a duration dictated by the notification mechanism even though processor 1160 and other components might shut down for conserving battery power. The LED 1180 may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 1174 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to speaker 1130, the audio interface 1174 may also be coupled to a microphone (not shown) to receive audible (e.g., voice) input, such as to facilitate a telephone conversation. In accordance with embodiments, the microphone may also serve as an audio sensor to facilitate control of notifications. The system 1102 may further include a video interface 1176 that enables an operation of on-board camera 1140 to record still images, video streams, and the like.
A mobile computing device implementing the system 1102 may have additional features or functionality. For example, the device may also include additional data storage devices (removable and/or non-removable) such as, magnetic disks, optical disks, or tape. Such additional storage is illustrated in
Data/information generated or captured by the mobile computing device 1150 and stored via the system 1102 may be stored locally on the mobile computing device 1150, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio 1172 or via a wired connection between the mobile computing device 1150 and a separate computing device associated with the mobile computing device 1150, for example, a server computer in a distributed computing network such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 1150 via the radio 1172 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
Content developed, interacted with, or edited in association with the application 1200 may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 1222, a web portal 1224, a mailbox service 1226, an instant messaging store 1228, or a social networking site 1230.
The application 1200 may use any of these types of systems or the like for enabling data utilization, as described herein. The server 1220 may provide the application 1200 to clients. As one example, the server 1220 may be a web server providing the application 1200 over the web. The server 1220 may provide the application 1200 over the web to clients through the network 1215. By way of example, the computing device 10 may be implemented as the computing device 1203 and embodied in a personal computer, the tablet computing device 1205 and/or the mobile computing device 1210 (e.g., a smart phone). Any of these embodiments of the computing devices 1203, 1205 and 1210 may obtain content from the store 1216.
Various embodiments are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products. The functions/acts noted in the blocks may occur out of the order shown in any flow diagram. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
The description and illustration of one or more embodiments provided in this application are not intended to limit or restrict the scope of the invention as claimed in any way. The embodiments, examples, and details provided in this application are considered sufficient to convey possession and enable others to make and use the best mode of the claimed invention. The claimed invention should not be construed as being limited to any embodiment, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate embodiments falling within the spirit of the broader aspects of the general inventive concept embodied in this application that do not depart from the broader scope of the claimed invention.