Electronic devices such as desktop computers, laptop computers, and various other types of computing devices provide information to users. The present disclosure relates generally to the field of such electronic devices, and more specifically, to electronic devices that may facilitate the capture, retrieval, and use of mobile access information and/or other data.
Referring to
As shown in
Display 18 may comprise a capacitive touch screen, a mutual-capacitance touch screen, a self-capacitance touch screen, a resistive touch screen, a touch screen using cameras and light such as a surface multi-touch screen, proximity sensors, or other touch screen technologies. Display 18 may be configured to receive inputs from finger touches at a plurality of locations on display 18 at the same time. Display 18 may be configured to receive a finger swipe or other directional input, which may be interpreted by a processing circuit to control certain functions distinct from a single touch input. Further, a gesture area 30 may be provided adjacent to (e.g., below, above, to a side of, etc.) or incorporated into display 18 to receive various gestures as inputs, including taps, swipes, drags, flips, pinches, and so on. One or more indicator areas 39 (e.g., lights, etc.) may be provided to indicate that a gesture has been received from a user.
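By way of illustration only, the following Python sketch shows one way a processing circuit might distinguish a single touch (tap) from a directional swipe using the start and end points of a touch track; the thresholds and names are hypothetical and are not taken from the disclosure.

```python
import math

# Hypothetical thresholds; the disclosure does not specify exact values.
TAP_MAX_DISTANCE = 10    # pixels of movement still treated as a tap
SWIPE_MIN_DISTANCE = 40  # pixels of movement required for a swipe

def classify_gesture(start, end):
    """Classify a single touch track as a tap, a swipe, or indeterminate.

    `start` and `end` are (x, y) tuples for the first and last samples
    reported by the touch screen controller for one touch track.
    """
    dx, dy = end[0] - start[0], end[1] - start[1]
    distance = math.hypot(dx, dy)
    if distance <= TAP_MAX_DISTANCE:
        return "tap"
    if distance >= SWIPE_MIN_DISTANCE:
        # Reduce the swipe to one of four directions for simple controls.
        if abs(dx) >= abs(dy):
            return "swipe-right" if dx > 0 else "swipe-left"
        return "swipe-down" if dy > 0 else "swipe-up"
    return "indeterminate"

print(classify_gesture((100, 100), (103, 98)))   # tap
print(classify_gesture((100, 100), (220, 110)))  # swipe-right
```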
According to an exemplary embodiment, housing 12 is configured to hold a screen such as display 18 in a fixed relationship above a user input device such as user input device 20, in a substantially parallel or same plane. In this fixed embodiment, the fixed relationship excludes a hinged or movable relationship between the screen and the user input device (e.g., a plurality of keys).
Device 10 may be a handheld computer, i.e., a computer small enough to be carried in a hand of a user; this category includes such devices as typical mobile telephones and personal digital assistants but excludes typical laptop computers and tablet PCs. The various input devices and other components of device 10 described below may be positioned anywhere on device 10 (e.g., the front surface shown in
According to various exemplary embodiments, housing 12 may be of any size and shape and may have a variety of length, width, thickness, and volume dimensions. For example, width 13 may be no more than about 200 millimeters (mm), 100 mm, 85 mm, or 65 mm, or alternatively, at least about 30 mm, 50 mm, or 55 mm. Length 15 may be no more than about 200 mm, 150 mm, 135 mm, or 125 mm, or alternatively, at least about 70 mm or 100 mm. Thickness 17 may be no more than about 150 mm, 50 mm, 25 mm, or 15 mm, or alternatively, at least about 10 mm, 15 mm, or 50 mm. The volume of housing 12 may be no more than about 2500 cubic centimeters (cc) or 1500 cc, or alternatively, at least about 1000 cc or 600 cc.
Device 10 may provide voice communications functionality in accordance with different types of cellular radiotelephone systems. Examples of cellular radiotelephone systems may include Code Division Multiple Access (CDMA) cellular radiotelephone communication systems, Global System for Mobile Communications (GSM) cellular radiotelephone systems, third generation (3G) systems such as Wide-Band CDMA (WCDMA), or other cellular radiotelephone technologies.
In addition to voice communications functionality, device 10 may be configured to provide data communications functionality in accordance with different types of cellular radiotelephone systems. Examples of cellular radiotelephone systems offering data communications services may include GSM with General Packet Radio Service (GPRS) systems (GSM/GPRS), CDMA/1xRTT systems, Enhanced Data Rates for Global Evolution (EDGE) systems, Evolution Data Only or Evolution Data Optimized (EV-DO) systems, Long Term Evolution (LTE) systems, etc.
Device 10 may be configured to provide voice and/or data communications functionality in accordance with different types of wireless network systems. Examples of wireless network systems may further include a wireless local area network (WLAN) system, wireless metropolitan area network (WMAN) system, wireless wide area network (WWAN) system, and so forth. Examples of suitable wireless network systems offering data communication services may include the Institute of Electrical and Electronics Engineers (IEEE) 802.xx series of protocols, such as the IEEE 802.11a/b/g/n series of standard protocols and variants (also referred to as “WiFi”), the IEEE 802.16 series of standard protocols and variants (also referred to as “WiMAX”), the IEEE 802.20 series of standard protocols and variants, and so forth.
Device 10 may be configured to perform data communications in accordance with different types of shorter range wireless systems, such as a wireless personal area network (PAN) system. One example of a suitable wireless PAN system offering data communication services may include a Bluetooth system operating in accordance with the Bluetooth Special Interest Group (SIG) series of protocols, including Bluetooth Specification versions v1.0, v1.1, v1.2, v2.0, v2.0 with Enhanced Data Rate (EDR), as well as one or more Bluetooth Profiles, and so forth.
Referring now to
In various embodiments, memory 42 may be configured to store one or more software programs to be executed by processor 40. Memory 42 may be implemented using any machine-readable or computer-readable media capable of storing data such as volatile memory or non-volatile memory, removable or non-removable memory, erasable or non-erasable memory, writeable or re-writeable memory, and so forth. Examples of machine-readable storage media may include, without limitation, random-access memory (RAM), dynamic RAM (DRAM), Double-Data-Rate DRAM (DDRAM), synchronous DRAM (SDRAM), static RAM (SRAM), read-only memory (ROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), flash memory (e.g., NOR or NAND flash memory), or any other type of media suitable for storing information.
In one embodiment, processor 40 can comprise a first applications microprocessor configured to run a variety of personal information management applications, such as email, a calendar, contacts, etc., and a second, radio processor on a separate chip or as part of a dual-core chip with the applications processor. The radio processor is configured to provide telephony functionality.
Device 10 comprises a receiver 38, which comprises analog and/or digital electrical components configured to receive and transmit wireless signals via antenna 22 to provide cellular telephone and/or data communications with a fixed wireless access point, such as a cellular telephone tower, in conjunction with a network carrier, such as Verizon Wireless, Sprint, etc. Device 10 can further comprise circuitry to provide communication over a local area network (e.g., Ethernet or an IEEE 802.11x standard) or a personal area network (e.g., Bluetooth or infrared communication technologies).
Device 10 further comprises a microphone 36 (see
Device 10 further comprises a location determining application, shown in
Device 10 may be arranged to operate in one or more location determination modes including, for example, a standalone mode, a mobile station (MS) assisted mode, and/or an MS-based mode. In a standalone mode, such as a standalone GPS mode, device 10 may be arranged to autonomously determine its location without real-time network interaction or support. When operating in an MS-assisted mode or an MS-based mode, however, device 10 may be arranged to communicate over a radio access network (e.g., UMTS radio access network) with a location determination entity such as a location proxy server (LPS) and/or a mobile positioning center (MPC).
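As a non-limiting sketch, the mode selection described above might be expressed as a simple fallback policy. The function and its inputs are hypothetical simplifications; a real implementation would also weigh carrier policy, power budget, and required fix accuracy.

```python
from enum import Enum

class LocationMode(Enum):
    STANDALONE = "standalone"    # autonomous GPS, no real-time network support
    MS_ASSISTED = "ms-assisted"  # the network entity computes the position fix
    MS_BASED = "ms-based"        # device computes the fix with network assistance

def choose_mode(gps_available: bool, network_available: bool) -> LocationMode:
    """Pick a location determination mode from the resources at hand."""
    if gps_available and not network_available:
        return LocationMode.STANDALONE
    if network_available and not gps_available:
        return LocationMode.MS_ASSISTED
    # With both available, an MS-based fix can combine satellite
    # measurements with assistance data from an LPS or MPC.
    return LocationMode.MS_BASED
```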
Referring now to
Various embodiments disclosed herein generally relate to capturing visual data (e.g., data displayed on a display screen, data viewed while using a camera/camera application, etc.), storing the data, and providing an easy and intuitive way for users to retrieve and/or process the data via a desktop computer, mobile computer, or other computing device (e.g., by way of an “electronic corkboard,” a “card deck,” or similar retrieval system). The captured data (e.g., “mobile access information,” “mobile access data,” etc.) may be data the user is able to see (e.g., via a display, camera, etc.) and/or data that the user is likely to need or wish to view at a later time (e.g., directions, a map, a recipe, instructions, a name, etc.). However, the user may not want to permanently store the data or have to re-open an application such as a mapping program, etc., at a later date in order to access the data. As such, mobile access information may be information for which the user typically only needs to view a “snapshot” of visual data, such as an intersection on a map, a recipe, information related to a parking spot in a parking structure, etc.
Referring to
Referring to
Device 10 and/or computing device 50 may be configured to enable a user to select all or a portion of screen data provided on a display (step 74). In some embodiments, a designated “hot key” or “hot button” may be preprogrammed to enable a user to capture all of the displayed data or information. Alternatively, a user may use a mouse, touchscreen (e.g., utilizing one or more fingers, a stylus, etc.), input buttons, or other input device to identify a portion of the information or data being displayed. It should be noted that images may be captured via device 10 in a variety of ways, including via a camera application, by user interaction with a touchscreen, by download from a remote source such as a remote server or another mobile computing device, etc.
In response to a user identifying all or a portion of data or information to be captured, device 10 and/or device 50 stores the data (e.g., as an image file such as JPEG, GIF, PNG, etc.) (step 76). In some embodiments, the captured data is stored as an image file regardless of the type of underlying data displayed (e.g., image files, messaging data such as emails, text messages, etc., word processing documents, spreadsheets, etc.). According to other embodiments, the data may be stored using other file types. Multiple image files may be stored in a single location (e.g., a “mobile access folder,” an “electronic corkboard,” etc.), which may be represented, for example, by an icon or other visual indicator on a user's main screen or other screen display (e.g., a “desktop,” a “today” screen, etc.).
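A minimal sketch of steps 74 and 76 follows, assuming a desktop environment where the Pillow imaging library's ImageGrab is available; the folder name and file naming scheme are hypothetical.

```python
from datetime import datetime
from pathlib import Path

from PIL import ImageGrab  # Pillow; assumed available for this sketch

CORKBOARD = Path.home() / "mobile_access_folder"  # hypothetical store location

def capture_region(bbox=None):
    """Grab all of the screen, or the portion given by
    bbox = (left, upper, right, lower), and store it as an image file."""
    CORKBOARD.mkdir(exist_ok=True)
    image = ImageGrab.grab(bbox=bbox)  # step 74: select all or part of the screen
    name = datetime.now().strftime("capture_%Y%m%d_%H%M%S.png")
    path = CORKBOARD / name
    image.save(path, format="PNG")     # step 76: store the selection as an image
    return path
```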
In some embodiments, in response to a user saving an image (e.g., on a desktop PC such as device 50), the image is automatically (e.g., in response to or based on saving and/or capturing the image, without requiring input from a user, etc.) transmitted for downloading to a second device or other remote location (e.g., a mobile device such as device 10, a server such as server 54, etc.) (step 78). For example, in one embodiment, images may be transmitted (e.g., via Bluetooth, Wi-Fi, or other wireless or wired connection) from device 50 to device 10 immediately upon saving. Alternatively, device 50 may transmit the image to a server such as server 54, such that device 10 may query server 54 to request that the image(s) be transmitted from server 54 to device 10. In the case where an image is captured using device 10, further transfer of the data may not be necessary because the data is already on the user's mobile device. In other embodiments, device 10 may transmit (either automatically or in response to a user input) an image to device 50, server 54, or another remote device after capturing the image.
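The transfer of step 78 might look like the following sketch, in which a save-side hook pushes the image to a server and the mobile side later queries for new captures. The endpoint URL and its query parameters are hypothetical.

```python
import urllib.request
from pathlib import Path

SYNC_URL = "https://example.com/corkboard"  # hypothetical server endpoint

def upload_capture(path: Path) -> None:
    """Push a newly saved capture to the server (server 54) so a
    mobile device (device 10) can download it later (step 78)."""
    request = urllib.request.Request(
        SYNC_URL + "/upload?name=" + path.name,
        data=path.read_bytes(),
        headers={"Content-Type": "image/png"},
        method="POST",
    )
    urllib.request.urlopen(request)

def fetch_new_captures(since: str) -> bytes:
    """Device-side query: ask the server for captures newer than `since`."""
    with urllib.request.urlopen(SYNC_URL + "/new?since=" + since) as resp:
        return resp.read()  # the server's listing of new captures
```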
According to one embodiment, in addition to capturing and saving screen images as image files, other data may be stored, or other types of data storage may be utilized. For example, in one embodiment, one or more links to the original data (e.g., a web page, an email, word processing document, etc.) may be generated and saved in order to enable a user to access the original data if desired. Device 10 and/or device 50 may further be configured to store metadata associated with image files, such as data type, text columns, graphic images or regions, and the like, for later use by device 10 and/or device 50.
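One way to keep such links and metadata with each capture is a sidecar file written next to the image, as in this sketch; the field names are hypothetical.

```python
import json
from pathlib import Path

def write_sidecar(image_path: Path, source_url: str, data_type: str,
                  regions: list) -> Path:
    """Store metadata alongside a captured image: the type of the
    underlying data, a link back to the original, and any regions
    (e.g., text columns, graphics) identified at capture time."""
    meta = {
        "image": image_path.name,
        "source": source_url,  # link back to the original data
        "type": data_type,     # e.g., "web page", "email"
        "regions": regions,    # e.g., [{"kind": "text", "bbox": [0, 0, 80, 40]}]
    }
    sidecar = image_path.with_suffix(".json")
    sidecar.write_text(json.dumps(meta, indent=2))
    return sidecar
```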
Referring now to
Referring to
Referring further to
Referring further to
In one embodiment, images 120 may be thumbnail-sized images representing larger images, such that upon receiving a selection of one of images 120 (e.g., via a tap, input key, etc.), a full-sized image is displayed (step 86) (see
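A sketch of generating the thumbnail-sized stand-ins follows, again assuming Pillow; the dimensions and naming scheme are hypothetical.

```python
from pathlib import Path

from PIL import Image  # Pillow; assumed available for this sketch

THUMB_SIZE = (96, 96)  # hypothetical thumbnail dimensions

def build_thumbnails(folder: Path) -> list:
    """Create thumbnail-sized stand-ins for each stored capture so a
    retrieval screen can present many images (images 120) at once."""
    thumbs = []
    for full in sorted(folder.glob("*.png")):
        if full.name.startswith("thumb_"):
            continue  # skip thumbnails produced by an earlier run
        with Image.open(full) as img:
            img.thumbnail(THUMB_SIZE)  # shrinks in place, keeping aspect ratio
            thumb = full.with_name("thumb_" + full.name)
            img.save(thumb)
            thumbs.append(thumb)
    return thumbs
```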
It should be noted that the various embodiments discussed herein provide many benefits to users. For example, one or more of the features described herein may be implemented as part of a desktop application that permits easy capture of data/information and transfer of the data/information to a mobile device. Metadata may also be stored that may identify the type or source of the underlying data and/or enable an image to be converted back to the original data type. Metadata may also enable smart zooming/snapping to appropriate areas of images. Furthermore, saved images can be easily browsed by way of a user interface that utilizes fast image searching/retrieval/deletion features. Further yet, according to various exemplary embodiments, device 10 may provide data in a “context aware” fashion such that the presentation of images may be based on contextual factors such as time of day, day of year, location of the user, and so on (e.g., such that “map” images are displayed first when a user is located near his or her car, etc.). Additionally, users may set up one or more accounts (e.g., password-protected accounts), and users may direct images to specific accounts (e.g., for uploading).
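The “context aware” ordering might reduce to a scoring function over stored metadata, as in this deliberately simplified sketch; the context keys and weights are hypothetical.

```python
def rank_images(images, context):
    """Order saved captures by contextual relevance, e.g., show 'map'
    images first when the user is near his or her car.

    `images` is a list of metadata dicts; `context` is a dict of
    current contextual factors (time of day, location, etc.).
    """
    def score(meta):
        s = 0
        if context.get("near_car") and meta.get("type") == "map":
            s += 2  # maps are most useful when the user is about to drive
        if context.get("mealtime") and meta.get("type") == "recipe":
            s += 1
        return -s  # higher-scoring images sort first

    return sorted(images, key=score)
```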
As discussed above, various types of data from various data sources may be captured utilizing techniques described in one or more of the various embodiments described herein. Referring to
In some embodiments, a single application (e.g., a camera application) running on processing circuit 46 of device 10 may enable a user to provide both image capture commands and image processing commands either pre- or post-capture (e.g., one or both of the image capture command(s) and the image processing command(s) may be received prior to a user taking a picture with device 10). Consolidating these functions into a single application may minimize the number of inputs required to direct device 10 to properly capture an image and later process and take action regarding the image, such as uploading the image to a remote site, utilizing one or more recognition technologies (e.g., bar code recognition, facial recognition, text/optical character recognition (OCR), image recognition, and the like), and so on.
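The consolidation described above might be sketched as a single object that accepts both kinds of command before or after the shutter; the command names match the capture commands discussed below, and everything else is hypothetical.

```python
class CameraApp:
    """Single application accepting capture and processing commands."""

    CAPTURE_COMMANDS = ("business card", "barcode", "macro")

    def __init__(self):
        self.capture_command = None   # configures targeting aids, focus, etc.
        self.pending_processing = []  # actions to run on the next picture

    def receive_command(self, command: str) -> None:
        # One input path handles both command types, pre- or post-capture.
        if command in self.CAPTURE_COMMANDS:
            self.capture_command = command
        else:
            self.pending_processing.append(command)

    def take_picture(self) -> dict:
        image = {"mode": self.capture_command or "default", "actions": []}
        image["actions"].extend(self.pending_processing)
        self.pending_processing.clear()
        return image
```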
According to various exemplary embodiments, a number of different recognition technologies may be utilized by device 10, both to receive and execute commands provided by users. For example, device 10 may utilize voice recognition technology to receive image capture and/or image processing commands from a user. Any suitable voice recognition technology known to those skilled in the art may be utilized. According to alternative embodiments, device 10 may be configured to display a menu of command options (e.g., image capture command options, image processing command options, etc.) to a user, and the user may be able to select one or more options utilizing an input device such as a touchscreen, keyboard, or the like. Other means of receiving commands from users may be used according to various other exemplary embodiments.
According to various exemplary embodiments, a number of different image capture commands may be received by device 10. For example, the image capture commands may include a “business card” command, which may indicate to device 10 that a user is going to take a photograph of a business card. Another command may be a “barcode” command, which indicates to device 10 that a user is going to take a photograph of a barcode (e.g., a Universal Product Code (UPC) symbol, barcodes associated with product prices, product reviews, books, DVDs, CDs, catalog items, etc.). A wide variety of other image capture commands may be provided by users and received by device 10, including a “macro” command (indicating that a close-up photograph will be taken). Other image capture commands may be utilized according to various other embodiments, and the present application is not limited to those commands discussed herein.
Similarly, according to various exemplary embodiments, a number of different image processing commands may be received by device 10. For example, the image processing commands may include a “translate” command, which may indicate to device 10 that a user wishes for a portion of text (e.g., a document, web page, email, etc.) to be translated (e.g., into a specified language such as English, etc.). Another image processing command may be an “upload” command, which may indicate to device 10 that the user wishes to upload the picture to a website, etc. (e.g., Flickr, Facebook, Yelp, etc.). A wide variety of other image processing commands may be provided by users and received by device 10, including a “restaurant” command (e.g., to recognize the logo or name of a restaurant and display a search option, a restaurant home page, a map, etc.); a “guide” command (e.g., to recognize a landmark and display tourist information such as a tour guide, etc.); a “people”/“person” command (e.g., to utilize facial recognition to identify a person and cross-reference a contacts directory on device 10, a web-based database, etc.); a “safe” or “wallet” command (e.g., to encrypt an image and/or limit access using a password, etc.); a “document” command (e.g., to utilize text recognition, etc.); a “scan” command (e.g., to convert an image to a PDF file, etc.); a “search” command (e.g., to utilize text recognition and subsequently perform a search (e.g., a global search, web-based search, etc.) based on identified text); and the like. Each image processing command directs device 10 to take one or more particular actions with respect to (i.e., to “process”) captured images. Other image processing commands may be utilized according to various other embodiments, and the present application is not limited to those commands discussed herein.
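Dispatching such commands could be a simple table from command name to handler, as in this sketch; the handlers are hypothetical stubs standing in for the recognition and upload steps named above.

```python
# Hypothetical handlers; each would invoke the relevant recognition
# technology and then act on the result.
def translate(image):
    return "translated text"  # OCR, then machine translation

def upload(image):
    return "uploaded"         # push the picture to a photo-sharing site

def search(image):
    return "search results"   # OCR, then a web search on the identified text

PROCESSING_COMMANDS = {
    "translate": translate,
    "upload": upload,
    "search": search,
}

def process(image, command: str):
    """Dispatch a received image processing command to its handler."""
    handler = PROCESSING_COMMANDS.get(command.lower())
    if handler is None:
        raise ValueError(f"unknown image processing command: {command}")
    return handler(image)
```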
In some embodiments, image capture commands may be definable by a user of device 10, such that a user may define various parameters of a camera application (e.g., data type, desired targeting aids, orientation, etc.) and associate the parameters with a particular image capture command. Similarly, device 10 may be configured to enable users to define image processing commands. For example, device 10 may enable a user to configure a “contacts” command that directs processing circuit 46 to upload data (e.g., name, address, phone, email, etc.) captured from a business card to a contacts application running on device 10. Furthermore, the image processing commands and image capture commands may be combined into a single command, such as a single word or phrase to be voiced by a user (e.g., such that the phrase “business card” acts to instruct device 10 to provide a proper targeting aid for a business card, capture the text on the business card, and save the contact information to a contacts application).
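A sketch of such user-defined, combined commands follows, in which one phrase bundles capture parameters with post-capture processing steps; all names and parameters are hypothetical.

```python
USER_COMMANDS = {}

def define_command(name, capture_params, processing_steps):
    """Register a single command that bundles camera configuration
    with the processing to run after the shutter."""
    USER_COMMANDS[name] = {
        "capture": capture_params,    # e.g., targeting aid, orientation
        "process": processing_steps,  # actions applied to the new image
    }

# One voiced phrase, "business card", drives the whole pipeline:
# show a card-shaped targeting aid, capture close up, OCR the text,
# and save the parsed fields to the contacts application.
define_command(
    "business card",
    capture_params={"targeting_aid": "card outline", "macro": True},
    processing_steps=["ocr", "parse contact fields", "save to contacts"],
)
```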
Referring to
According to one embodiment, a command such as “corkboard” may be used to indicate that a captured image should be saved in accordance with the features described in the various embodiments of
Referring now to
Referring now to
In one embodiment, processing circuit 46 may be configured to predict or determine the image capture options based on a user's past picture-taking behavior (e.g., by tracking the types of pictures the user takes most often, such as pictures of people, bar codes, business cards, etc., the camera settings utilized by a user, location of the user, and so on). Alternatively, processing circuit 46 may utilize one or more recognition technologies to process a current image being viewed via camera 28 and predict what image capture commands may be most appropriate. For example, processing circuit 46 may determine that the current image is of a text document, and that a text recognition mode may be most appropriate. Device 10 may then suggest a text recognition command to the user. In yet another embodiment, device 10 may be configured to receive user preferences that define what image capture commands should be provided. For example, a user may specify that he or she always wants a “people” command, a “business card” command, and a “text” command displayed.
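The behavior-based prediction might be as simple as tracking command frequency, as in this sketch; a real implementation could also weigh location, time of day, and the recognized content of the current viewfinder image.

```python
from collections import Counter

class CommandPredictor:
    """Suggest image capture commands from the user's past behavior."""

    def __init__(self):
        self.history = Counter()

    def record(self, command: str) -> None:
        self.history[command] += 1  # track what the user shoots most often

    def suggest(self, n: int = 3) -> list:
        # Offer the n most frequently used commands as on-screen options.
        return [cmd for cmd, _ in self.history.most_common(n)]

predictor = CommandPredictor()
for cmd in ["barcode", "people", "barcode", "business card", "barcode"]:
    predictor.record(cmd)
print(predictor.suggest())  # ['barcode', 'people', 'business card']
```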
Referring further to
It should be noted that the various embodiments disclosed herein may be utilized alone, or in any combination, to suit a particular application. For example, the various features described with respect to capturing and processing photographs or images in
Various embodiments disclosed herein may include or be implemented in connection with computer-readable media configured to store machine-executable instructions therein, and/or one or more modules, circuits, units, or other elements that may comprise analog and/or digital circuit components configured or arranged to perform one or more of the steps recited herein. By way of example, computer-readable media may include RAM, ROM, CD-ROM, or other optical disk storage, magnetic disk storage, or any other medium capable of storing and providing access to desired machine-executable instructions.
While the detailed drawings, specific examples and particular formulations given describe exemplary embodiments, they serve the purpose of illustration only. The hardware and software configurations shown and described may differ depending on the chosen performance characteristics and physical characteristics of the computing devices. The systems shown and described are not limited to the precise details and conditions disclosed. Furthermore, other substitutions, modifications, changes, and omissions may be made in the design, operating conditions, and arrangement of the exemplary embodiments without departing from the scope of the present disclosure as expressed in the appended claims.
This application is a divisional of co-pending, commonly assigned, patent application Ser. No. 12/732,077 entitled, “SYSTEM AND METHOD FOR DATA CAPTURE, STORAGE, AND RETRIEVAL”, filed on Mar. 25, 2010, the disclosure of which is hereby incorporated herein by reference.
Related U.S. Application Data:

| | Number | Date | Country |
| --- | --- | --- | --- |
| Parent | 12/732,077 | Mar. 2010 | US |
| Child | 15/726,923 | | US |