A large and growing population of users employs various mobile electronic devices or other touch devices, with a variety of screen real estate configurations, to perform an ever-increasing number of functions. Among these functions, as one example, is consuming and analyzing large data sets, such as filtering them to reduce the number of records returned based on predetermined filters. Among these electronic or touch devices are smartphones, phones, PDAs, gaming devices, tablets, commercial devices for industrial applications, electronic book (eBook) reader devices, portable computers, portable players, and the like.
The way that users interact with mobile devices continues to evolve. Often, users engage with mobile devices while in transit, thereby reducing the time and attention they can devote to interacting with the mobile device to accomplish tasks. Concomitantly, the mobility of users has increased the use of easily portable and readily available devices, such as smartphones and tablets, which may present limited screen real estate. In addition, consuming and analyzing large data sets, where some of the processing includes the transmission of records and their related fields over networks between the device and remote servers, has created a need to reduce the number of user interactions required to refine data sets. Thus, there is a need for new techniques and interfaces for electronic devices that will improve the refinement of filtered results and user interactions.
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identifies the figure in which the reference number first appears. The use of the same reference numbers in different figures indicates similar or identical components or features.
Described herein is a mobile or other touch device having a user interface to support a dynamic refinement of filtered results in a search and discovery process. The mobile or touch device may be implemented in various form factors, such as a smartphone, a portable digital assistant, a tablet computing device, a gaming device, or an electronic reader device, etc. This disclosure further describes example graphical user interfaces (UIs) that may be used to interact with an electronic device, including touch-screen options, to transition the device between multiple modes during a dynamic refinement of filtered results.
This disclosure describes, in part, electronic or touch devices, interfaces for electronic or touch devices and techniques that improve the ability of the computer to display information and enhance user interaction experiences. For instance, this disclosure describes example electronic devices that include a user interface implementing a filter mode in which previously generated search results of records are visually presented and dynamically refined to smaller sets of records. Users may interact with the device using touch-screen interactions, representing one example of a user interaction. The user interface may support a user in both entering filter options and applying them to records, while dynamically viewing and scrolling through the resulting filtered records, with multiple windows on the user interface remaining operable for user interaction and display of results at the same time.
In one illustrative example, the filtered results may be a visual identification field for each record in order to optimize the number of records that may be displayed within limited screen real estate. One example of a visual identification field is a thumbnail containing a photograph, drawing or other image or rendering associated with a record (hereinafter referred to as an “image” and/or “visual identification field (VIF)”). The screen real estate may dictate the number of images that may be displayed, and additional images may be revealed by the user scrolling through the displayed images presented in, for example, a banner configuration on the side or bottom/top of the mobile device display.
The mobile electronic device may operate in a filter mode to support the dynamic refinement of filtered results. In the filter mode, the user interface may display multiple windows, such as a results window (which can also be described as an interface or a portion of an interface) and a filter window (which can also be described as an interface or a portion of an interface), in which the user may dynamically interact with the filtered records within the results window, or select filter options in the filter window. In this manner, the user may iteratively refine filtered records, and both interactions may be operable concurrently, with the filtered records being updated at optimized speed to approximate a real-time basis. As a result, the user may dynamically refine the filtered results to arrive at a consideration set that is a manageable size of records for the user's search and discovery process.
In some examples, upon entering the filter mode, the search results may be generated for display to the user based solely on the visual identification field, with one example being an image (such as for example, a thumbnail), displayed as a single field associated with each record. Additional examples of visual identification fields are icons, numbers, colors, shapes or other textual or non-textual identification, or as a combination of one or more individual fields within the record or an associated record. In addition to the visual identification field, each record may also include additional fields, such as, for example, a title description, a summary description, a third party review field, etc.
In the filter mode, a user may interact with the user interface to select which filter options to apply to the records to produce the resulting filtered records. When, for example, a filter option is selected, processing time is optimized: the mobile electronic device sends a transmission call to remote content server(s) to generate the filtered records, and the server(s) respond with a reduced transmission of a single field per record, for example, the image field, back to the mobile device. The reduction in processing time based on the transmission of limited data may approach a real-time processing time for the user experience.
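The reduced-payload exchange described above can be sketched as follows. This is a minimal illustration under assumed names (`Record`, `apply_filters`, `filter_response` are hypothetical and not part of the disclosure), using a simple in-memory record store in place of the content server(s).

```python
from dataclasses import dataclass

@dataclass
class Record:
    record_id: int
    image_vif: str          # visual identification field, e.g. a thumbnail URL
    title: str = ""
    summary: str = ""

def apply_filters(records, selections):
    """Keep records whose fields match every selected filter option."""
    return [r for r in records
            if all(getattr(r, name, None) == value
                   for name, value in selections.items())]

def filter_response(records, selections):
    """Build the reduced payload: one image VIF per matching record,
    plus a total count for the results statistics section."""
    matched = apply_filters(records, selections)
    return {"count": len(matched),
            "image_vifs": [r.image_vif for r in matched]}
```

Transmitting only `image_vifs` and `count`, rather than full records with title and summary fields, is what keeps the response small and the round trip fast.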
In one example, the filter mode may be entered after an initial search of records is displayed in a search mode of the mobile electronic device. To continue with this example, the user interface, while in search mode, presents an option for the user to select the filter mode after an initial search of the data set has produced search results. Or, in an alternative example, the filter mode and the techniques presented herein may encompass the entirety of a search strategy, so that a filter window (which also may be referred to as a search window in this example) is maintained throughout a search of records.
Starting the filter mode may initiate the rendering of multiple windows on the user interface. One window may be a results window, presenting filtered records based on displaying image fields of the filtered records, in order to optimize the number and transmission speed of filtered records within the limited screen real estate. The results window may be limited, due to limitations on screen real estate, to displaying no more than a predefined set of the image fields, such as, for example, 12 records on the display at any given time. The results window may be operable to enable a user to scroll through image field results. Another window may be a filter window, presenting multiple predefined (or user-defined) filter options from which a user may select in order to refine the filtered records in the results window. Both the results window and the filter window may be maintained as operable during processing initiated based on user interaction with either window. When a user interacts with the filter window by selecting filter options, the results window automatically updates the filtered results and remains active during subsequent user interaction with the filter window. In addition, when a user interacts with the results window, for example, by scrolling through the images representing the filtered records, the filter window remains active and available for the user to select further filter options, thereby iteratively refining the filtered records. As a result, the user may dynamically select filter options in the filter window as input to initiate processing of the results window, and concurrently view the results of the filter refinement in the results window without disabling the filter window.
At any point during the iterative refinement or, for example, when the filtered records number a manageable results set, such as fewer than 20, or fewer than the number of images that may be viewed without scrolling on a given user interface (with the number dependent upon the individual user's needs and standards in the search and discovery process, so that in some cases hundreds or even thousands of records may be considered a manageable results set), the filter mode may be exited by the user selecting one of the images in the results window. One example of how to execute the selection is, for example, executing a double-tap touch input on the image. Exiting the filter mode may return the mobile electronic device user interface to the search mode. This execution approach may result in closure of the filter window so that filter options are no longer available to the user. Further, a transition to the search mode may, in one example, initiate an expansion of the results window to occupy the full screen real estate, thereby providing more data about the filtered records. For example, the filtered records may expand to the presentation format of the search results prior to the filter mode or, in the example presented above, when the mobile electronic device was in the search mode prior to the filter mode. This may include, for example, an expansion of the number of fields associated with the filtered records. More specifically, for example, title, summary and detailed description fields of the record may be added to the display, as well as other fields relevant to a given search and discovery subject matter, such as specification, availability in inventory, etc.
A return to or initiation of the search mode may be undertaken when the considered set of records is reduced to a manageable number, so that the user benefits from displaying additional information about the records resulting from the use of the filter mode. In alternative examples, the filter mode may be exited based on a toggle button, selection field or any type of button or interaction component (including, for example, verbal or audio interactions without the user contacting the interface display) to indicate a user's selection of exiting the filter mode. In one alternative, for instance, exiting the filter mode may, for example, expand the results window but maintain a portion of the filter window in order to provide the user with the option of viewing the previous filter option selections, or to enable a return to one or more iterations of previous filtered results.
According to the techniques described herein, and by way of example and not limitation, the starting state of the device may be the search mode, which is maintained until the user executes a filter selection on the user interface, such as a button labeled “Filter.” In other examples, users may interact with the device through other touch interactions (e.g., tap, multi-tap, touch, touch and hold, swipe, and so forth) or through audible verbal interactions, etc. Upon activation of the filter button, the user interface may transition to the filter mode, and the user may be visually informed of the search results extant upon entry into the filter mode based on the display of the images in the results window of the filter mode. In addition, the filter mode may trigger rendering the filter window to display the filter options in another area of the user interface. In the filter mode, in some examples, the results window and filter window are operable for touch screen entry by the user, and processing may occur on a near-real-time basis.
These and numerous other aspects of the disclosure are described below with reference to the drawings. The electronic devices, interfaces for electronic devices, and techniques for interacting with such interfaces and electronic devices as described herein, may be implemented in a variety of ways and by a variety of electronic devices. Among these electronic or touch devices are smartphones, phones, PDAs, gaming devices, tablets, commercial devices for industrial applications, electronic book (eBook) reader devices, portable computers, portable players, and the like.
The display 102 may be formed using any of a number of display technologies (e.g., LCD, OLED, eInk, and so forth). The display 102 may also include a touch screen operated by a touch sensor 104, as well as other buttons and user interaction features found on smartphones. Other user interaction techniques may be used to interact with the display 102 (e.g., pen, audio, keyboard, and so forth), as well as interactions based on the length of time the user 103 interacts with the screen, such as, for example, depressing a physical or virtual button for a predetermined length of time, etc.
The device 101 shown in
During the filter mode 111 of operation for the device 101, a set of records 250 created in the search mode 113 may be made available for application of a filter operation. For example, the user 103 may have entered search criteria in the search entry section 126 that are applied in the filter mode 111 to the records 250 to produce a set of filtered records 114. The results window provides for the presentation of the records 114. The results window may present the filtered records 114 based on a single visual identification field. In one example, the single visual identification field 115 may be an image, as for example described above as a thumbnail (hereinafter referred to as an image visual identification field (VIF) or “image VIF”).
In this way, the results window may optimize a visual identification of the filtered records 114 within the screen real estate based on a visual presentation of micro content with which a user may efficiently visually review the filtered records 114 within the limitations of the screen real estate. The use of micro content for a visual presentation also supports user interaction with the images VIF 115 based on the device 101 being held in the palm of a user's 103 hand, shown in
In addition, the
As illustrated, the memory 108 may store a user interface (UI) module 116 (including a filter mode module 122 and a search mode module 124), a filtered records database 152 (to store filtered records 114), an images database 154 (to store images VIF 115), a filter options database 156 (to store filter options 118), a filter option selection(s) database 158 (to store filter option selection(s) 120), and an images and filter option selection(s) database 159 (to store images and filter option selection(s) 120).
The UI module 116 may present various user interfaces on the display 104 to implement the results window and the filter window 112 to support a dynamic refinement of filtered results in a search and discovery process. For instance, the UI module 116 may initiate the filter mode module 122 to trigger the results window module 124 in order to process the results window and the filter window module 126 in order to process the filter window 112. In one example, processing of the results window module 124 may include managing the rendering of the results window and the interaction by the user 103 with the window, such as scrolling or selection of the images VIF 115(1)-(x). The images VIF 115(1)-(x) on the user interface 104 may comprise an interactive list that is scrollable by the user 103 of the device 101, such as by touch gestures on the display 102. In addition, there are any number of gestures which can be applied to effectuate user interaction with the results window 110, such as for example a scrolling gesture, a swiping gesture, a dynamic sliding gesture, a selection of one of the images VIF 115(1)-(x), or a physical or audio gesture without the user contacting the display screen.
This is shown in
In addition, processing of the filter window module 126 may include rendering the filter window 112, its contents such as the filter option(s) 118 and filter option selection(s) 119, and the interaction by the user 103 with the window 112, such as selecting filter option selection(s) 119 (“filter option selections” may also be referred to as “filter options” herein, with both terms describing a list of items which may be selected by the user in the filter window 112).
The UI module 116 may further initiate the search mode module 128 to process the search mode 113 when the user 103 interacts with the user interface 104 of the device 101 to activate a search operation or to initiate the search mode 113, as further described regarding
As illustrated, the computing architecture 105 of device 101 may further include an operating system 146, a network interface 145 and the touch sensor 104 (as described above). The operating system 146 functions to manage interactions between and requests from different components of the device 101. The network interface 145 serves as the communication component for the device 101 to interact with a network(s) 160. The network(s) 160 may facilitate communications and/or interactions via any type of network, such as a public wide-area-network (WAN, e.g., the Internet), which may utilize various different technologies including wired and wireless technologies. The network interface 145 may communicate with content server(s) 150 through the network(s) 160 to send and receive requests that support the search and discovery process.
Embodiments may be provided as a computer program product including a non-transitory machine-readable medium having stored thereon instructions (in compressed or uncompressed form) that may be used to program a computer (or other electronic device) to perform processes or methods described herein. The machine-readable storage medium may include, but is not limited to, hard drives, floppy diskettes, optical disks, CD-ROMs, DVDs, read-only memories (ROMs), random access memories (RAMs), EPROMs, EEPROMs, flash memory, magnetic or optical cards, solid-state memory devices, or other types of media/machine-readable medium suitable for storing electronic instructions. Further, embodiments may also be provided as a computer program product including a transitory machine-readable signal (in compressed or uncompressed form). Examples of machine-readable signals, whether modulated using a carrier or not, include, but are not limited to, signals that a computer system or machine hosting or running a computer program may be configured to access, including signals downloaded through the Internet or other networks.
In some examples, the computer-readable media 220 may store a filter mode module 222 and a search mode module 224, as well as one or more network interface(s) 230. The modules are described in turn. The content server(s) 150 may also have access to a records database 251, a filtered record(s) database 252, an images database 254, a filter options database 256, a filter option selection(s) database 258 and an image and filter option selection(s) database 259. The modules may be stored together or in a distributed arrangement. The network interface(s) 248, similar to the network interface(s) 145 for device 101, serves as the communication component for the content server(s) 150 to interact with the network(s) 160. The network interface(s) 248 may communicate with the device 101 through the network(s) 160 to receive requests and provide responses that support the search and discovery process.
The filter mode module 222 and the search mode module 224 are now described. The filter mode module 222 processes the filtering operations at the content server(s) 150 based on the receipt of calls from the device 101 during the initiation of the filter mode 111 or during operation of the device 101 in the filter mode 111. The filter mode module 222 may process the receipt of input from the user 103 on the device 101 to initiate the filter mode 111. The filter mode 111 initialization operations may include: generating the initial set of records 114 based on the records 250 to transmit to the device 101 for display on the results window, and generating the filter options 118 based on the records 250 to transmit to the device 101 for display within the filter window 112. The filter mode module 222 may also continue processing filter operations based on the receipt by the processor(s) 206 of the user 103 input while the device 101 is in the filter mode 111.
The filter mode module 222 processing will now be described. In one example, the filter mode 111 is activated from the search mode 113 so that, when the filter mode module 222 initiates processing, the records 250 in the records database 251 already have been populated based on a search process. In alternative embodiments, the filter mode 111 may be activated based on other approaches, as is further described regarding the search mode 113 and the search mode module 224 below. To continue with one illustrative example, as shown in
Upon receipt by the device 101 of the user's 103 selection of the filter mode 111 from the search mode 113, the device 101 sends a call to the content server(s) 150 to initialize the filter mode 111. The processor(s) 206 activates the filter mode module 222 to store the records 250 generated by the previous search mode 113 as the filtered records 114 in the filtered records database 252.
Once the search mode 113 results are displayed, a “Filter” button 308 may be presented to enable the user 103 to select the filter mode 111 (as shown and described regarding
After the initial setup of the filter mode 111, based on execution of the filter mode module 222, the filter mode module 222 may then continue processing filter operations based on receipt by the processor(s) 206 of the user 103 input on the device 101 while in the filter mode 111. In one example, in which a user 103 selection of any of the filter options 118(1)-(x) activates a filter operation, such as the user 103 selecting filter option 118(3) “Select Style” and 119(9) “Knee Band,” the device 101 may send a call to the content server(s) 150 to process the filter operation. Upon receipt at the processor(s) 206, the filter mode module 222 stores the filter option selection 120(9) in the filter option selections database 258 and applies the filter option selection 120(9) to the records 250 to produce the filtered records 114 and to generate and transmit back to the device 101 the images VIF 115(1)-(x) for each of the records 114 for display in the results window (as well as storage in the images database 254). In addition, the total number of records 114 is calculated and transmitted to the device 101 for presentation in the results statistics section 125. By limiting the results of the filter mode module 222 transmitted to the device 101 to the image VIF 115 field of the records 114, the data for transmission is reduced and, therefore, the transmission speed is increased for receipt of the images VIF 115(1)-(x) at the device 101.
In another example, the filter option selection(s) 120(1)-(x) may be iteratively applied based on each successive user 103 selection of filter option selection(s) 119(1)-(x) so that, for example, each filter option selection 120(1)-(x) produces a new set of filtered records 114, smaller than the preceding set, until, continuing with the example, the total number of filtered records 114 is small enough that the images VIF 115(1)-(x), representing the filtered records 114, may be displayed in the results window without scrolling. In this manner, there may be multiple filter option selection(s) 120(1)-(x) which are aggregated and transmitted to the content server(s) 150 over the course of multiple user 103 selections of filter options 118(1)-(x). Upon receipt of each filter option selection(s) 120(1)-(x), the filter mode module 222 stores the filter option selection(s) 120(1)-(x) in the filter option selections database 258, accumulating multiple filter option selection(s) 119(1)-(x) over multiple user 103 selections and calls to the content server(s) 150, and applies the filter option selection(s) 119(1)-(x) to the filtered records 114 stored in the database from the previous filter operation. With each call to the content server(s) 150, the filter mode module 222 generates a new set of filtered records including the image VIF 115(1)-(x) for each of the records 114 and a total number of records for presentation in the results statistics section 125 on the device 101. In this manner, the total number of filtered records 114 displayed in the results statistics section 125 becomes smaller with every additional filter option selection 120(1)-(x). In a related example, as the total number of records 114 becomes small enough to be processed at the device 101, the filtering processing may be executed by the computing architecture 105 at the device 101 so that additional speed is gained in obviating a call to the content server(s) 150. In the
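The iterative accumulation of selections described above might be sketched as follows. Under AND semantics, re-applying the accumulated selections to the full record set yields the same result as applying each new selection to the previous filtered set. The `FilterSession` name and dict-based records are assumptions for illustration, not part of the disclosure.

```python
class FilterSession:
    """Accumulates filter option selections and re-applies them so each
    successive selection can only narrow (never widen) the result set."""

    def __init__(self, records):
        self.records = records      # the initial search results
        self.selections = {}        # accumulated {field: value} selections

    def select(self, name, value):
        """Record one more filter option selection; return the refined set."""
        self.selections[name] = value
        return self.apply()

    def apply(self):
        # A record survives only if it matches every accumulated selection.
        return [r for r in self.records
                if all(r.get(n) == v for n, v in self.selections.items())]
```

Each call to `select` corresponds to one round trip in which the accumulated selections are applied and a smaller set of image VIFs is returned for display.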
In a related example, the filter mode module 222 may process iterative user 103 selections in which the device 101 enables the user 103 to both select or clear filter option selection(s) 119(1)-(x). There are multiple approaches to enabling these options, such as adding a clear filter option 118(x) which may clear all selected filter option selection(s) 119(1)-(x), adding a clear filter option selection 120(x) which may clear the selected filter options for the associated filter option 118(x), providing a button or other interactive approach for each filter option selection 120(1)-(x) that may be toggled for selection or clearing the selection, etc. In the event that the user 103 deselects filter option selection(s) 119(1)-(x), the filter mode module 222 processing may result in the filtered records 114 total number increasing relative to the preceding result set. Therefore, the filter mode module 222 iterative processing of filter operations may also increase the number of filtered records 114.
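A hedged sketch of the select/clear behavior: each (filter option, value) pair toggles on or off, and clearing a selection can enlarge the result set relative to the preceding one. All names here are illustrative assumptions.

```python
from collections import defaultdict

class ToggleableFilters:
    def __init__(self):
        self.selected = set()       # active (field, value) pairs

    def toggle(self, name, value):
        """Select the option if it is clear; clear it if it is selected."""
        key = (name, value)
        if key in self.selected:
            self.selected.discard(key)
        else:
            self.selected.add(key)

    def clear_all(self):
        self.selected.clear()

    def apply(self, records):
        # Group active values by field: a record matches when, for every
        # field with active selections, its value is among those selected.
        by_field = defaultdict(set)
        for name, value in self.selected:
            by_field[name].add(value)
        return [r for r in records
                if all(r.get(name) in values
                       for name, values in by_field.items())]
```

Toggling an option off removes its constraint, so the next `apply` call can return a larger set, matching the behavior described for deselection.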
In other examples, the device 101 may operate to initiate the filter mode 111 upon activation by the user 103 of the “Refine” button 108 (shown in
In addition, the content server(s) 150 may also support the storage of a data structure composed of multiple data fields from separate databases, such as the image VIF 115 field of the record 114 (from the images database 254) combined with the filter option selection(s) 120(1)-(x) (from the filter option selections database 258) associated with the image VIF 115. The data structure, hereinafter referred to as images and filter option selections 120, may be stored in separate database(s) in either or both of the content server(s) 150 (not shown) or the device 101 (shown in
Filter mode module 222 processing to reduce the filtering operation result for filtered records 114 to the single data field of the image VIF 115(1)-(x) for transmission to the device 101 increases the transmission speed of the results to the device 101. In addition, the reduced data of the image VIF 115 for display increases the amount of data that may be displayed for the user 103.
The search mode module 224 processes the search operations at the content server(s) 150 based on the receipt of calls from the device 101 during the search mode 113. Upon receipt by the processor(s) 206 of a search operation while in the search mode 113, the search mode module 224 applies the search entry in the search entry section 128, such as, for example, “Knee Brace” as shown in
Upon completion of a search operation in the search mode 113, the records 250 are displayed on the device 101, such as is shown in
In this illustrative example, the filter mode 111 is triggered based on the device 101 operating in the search mode 113. The device 101 also may transition from the filter mode 111 to the search mode 113. One example of one of the multiple approaches described herein for this transition is the user 103 selecting one of the images VIF 115 in the results window. The device 101 may then send a call to the content server(s) to request a transition from the filter mode 111 to the search mode 113. Upon receipt by the processor(s) 206 of the user's activation of the search mode 113, the search mode module 224 may process the filtered records 114 to identify additional fields 116, 117, etc., associated with each filtered record 114 and transmit the filtered records 114 including the additional fields to the device 101 for presentation in the search mode 113 on the display 102.
In other examples of the use of the content server(s) 150, the filtered records 114, including all record fields 115, 116, 117, etc., may be transmitted back to the device 101, and then the filter mode module 122 of the computing architecture 105 may process the filtered records 114 to identify for display the images VIF 115. In addition, the filter option selection(s) 120 need not be processed and appended to the filtered records 114. Rather, the filter option selection(s) 120 may be maintained based on the user 103 input at the device 101 for retrieval and processing by the filter mode module 122 locally. In other examples, the data structures of the filtered records 114, filter option selection(s) 120 and images VIF 115 may be stored at both or either of the device 101 and the content server(s) 150. Where they are stored at the content server(s) 150, the data may be transmitted for use by the device 101 with reduced transmission time based on transmitting a limited amount of data, such as the images VIF 115. Upon receipt of the images VIF 115 at the device 101, the filter mode module 122 may associate the images with the filter option selection(s) 120 entered by the user 103 on the display 102 of the device 101, thereby combining transmitted data, the images VIF 115, with local data, the filter option selection(s) 120, to reduce transmission time. In some examples, the data may be stored at both the device 101 and the content server(s) 150 and the processing may occur either locally, at the device 101, or remotely based on a call to the content server(s) 150. The processing distribution may also be a function of the volume of the filtered records 114 so that, as the volume of these records is reduced such that the processing capacity at the computing architecture 105 is fast enough to support dynamic processing, the processing may be executed by the filter mode module 122 at the device 101.
The processor(s) 130 and operating system 146 may manage the distribution of processing, local storage at the device 101 and/or remote storage at the content server(s) 150 to optimize transmission speeds depending upon the volume of data being processed and transmitted.
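The local-versus-remote distribution described above could be expressed as a simple threshold check. The cutoff value and function names are assumptions for illustration; an actual policy would depend on device capacity and network conditions.

```python
# Assumed cutoff below which the device refines records locally rather than
# calling the content server(s); in practice this would be tuned per device.
LOCAL_PROCESSING_THRESHOLD = 500

def refine(filtered_records, selections, remote_filter, local_filter,
           threshold=LOCAL_PROCESSING_THRESHOLD):
    """Route a filter operation locally or remotely based on record volume."""
    if len(filtered_records) <= threshold:
        return local_filter(filtered_records, selections)   # no network call
    return remote_filter(filtered_records, selections)      # call the server(s)
```

As iterative refinement shrinks the filtered set, successive calls cross the threshold and subsequent operations run on the device, avoiding the round trip entirely.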
In the starting UI display 310 of the filter mode 111, the display 310 presents the two primary windows: the results window and the filter window 112. For this example, the results window is on the left vertical side of the device 101. The records 114 which resulted from the search inquiry executed in UI display 302 continue to be displayed. However, the presentation of the records 114 has transitioned from the display in the search mode 113 to the display in the filter mode 111 based on reducing the number of fields rendered for each record 114 to the images VIF 115(1)-(6) corresponding to each of the records 114. With the use of solely a visual identification field 115, and the reduced screen real estate occupied by the images VIF 115, a larger number of total records 114 may be displayed than in the search mode. More particularly, in this example, the UI display 310 results window displays six records 114(1)-(6) based on the single field of a visual identification image VIF 115(1)-(6). In this manner, and in contrast to the search mode 113 where records 114(1)-(3) are displayed, the filter mode 111 provides an increase in the volume of images VIF 115(1)-(6) (one for each record 114(1)-(6)) displayed with UI interface 310. In addition, the reduction in the fields for each record 114 displayed may reduce the processing time, latency and data volume for the transmission of records 114 when the device 101 calls the content server(s) 150 to execute a filter operation and return filtered records 114, as described in more detail regarding
In the starting UI display 310 of the filter mode 111, the display 310 also presents on the right vertical side of the device 101 the filter window 112. As described regarding
The UI display 310 also provides the following sections: the search subject matter section 121, labeled “Medical Devices”; the search entry section 123, with “Knee Brace” entered by the user 103; and the search results statistics section 125, with an initial volume of “6,011” results or records for the “Knee Brace” filter operation.
Two buttons 108 and 109 may be presented, which in this example are virtual and displayed on the UI display 310. Both of the buttons 108 and 109 may be constructed as physical or virtual and may be activated in various ways, such as, for example, through touch screen interaction with the user 103. In this example, button 108 is labeled “Refine.” This button 108 may be activated by the user 103 to trigger a filter operation using one or more filter option(s) 118 selected in the filter window 112. As further described below, the filter operation applies the filter option selection(s) 120 from the filter window 112 to further refine the records 114 presented in the results window. By enabling multiple selections by the user 103 of filter option selection(s) 120 and thereafter initiating the filter mode 111 filter operation upon selection of the “Refine” button 108, the user 103 may select multiple filter option selections 120(1)-(4) et seq. to be aggregated as a filter set in the filter operation to produce a refined set of records 114. In another example, a filter operation may be applied upon a single filter option selection 120 within the filter window 112, so that any data entry into the filter window 112 initiates a refinement of the images VIF 115 shown in the results window. In the example of the
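The aggregation of multiple filter option selections into a single filter set, applied together when the “Refine” button 108 is activated, may be sketched as follows. This is a minimal illustration in Python; the record fields and values are hypothetical assumptions, not part of the described embodiment.

```python
# Hypothetical sketch of the "Refine" operation: filter option selections are
# accumulated as a filter set and applied together on button activation.

def refine(records, filter_set):
    """Return only the records matching every (field, value) selection."""
    return [r for r in records
            if all(r.get(field) == value for field, value in filter_set)]

# Example: the user selects two filter options before pressing "Refine".
records = [
    {"id": 1, "inventory": "High", "clinical_review": "High"},
    {"id": 2, "inventory": "Low",  "clinical_review": "High"},
    {"id": 3, "inventory": "High", "clinical_review": "Low"},
]
filter_set = [("inventory", "High"), ("clinical_review", "High")]
filtered = refine(records, filter_set)
# Only record 1 satisfies the aggregated filter set.
```

In this sketch the filter set is applied conjunctively; a disjunctive combination of selections within a single filter option is an equally plausible design choice.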
In addition, the “Sort” button 109 may serve the function of enabling the user 103 to change the sorting options for the records in the results window. With the limited screen real estate, however, until the filtered records 114 are of a manageable number of results which may be analyzed by the user 103, changes in the display of the images VIF 115(1)-(x) may not be made at each refinement. For example, where the volume of filtered records 114 is large, such as in the 100,000s, the refinement of the filtered records 114 may not result in any change to the visible images VIF 115(1)-(6) shown on the UI display 310. The impact of filtered records 114 refinement becomes more meaningful as the filtered records 114 are reduced to a manageable volume, such as, for example, in the 100s or less, or within a reasonable range given the original database 251 to which the search is applied. The number is subjective for any user 103, but the likelihood of the images VIF 115(1)-(x) changing increases as the number of records approximates the number of visible images VIF 115 on the user interface 104 of the device 101.
The location and orientation of the results window and the filter window 112 are additional examples of different approaches to presenting the filtered records 114 on the screen real estate. In other examples, the windows may be aligned in a horizontal orientation or stacked on top of one another, records from the windows may be intermixed, and/or the filter window 112 filter options 118 and filter option selection(s) 120 may be in a central location surrounded by the results window 110 images VIF 115.
In addition, as also described regarding
In this way, the user 103 is supported in the filter mode 111 both to select filter option(s) 118 in order to execute iterative filter operations on the filtered records 114 and, at the same time, dynamically and in approximately real time, to view and scroll through the resulting filtered records 114 in the results window. In this manner, in the filter mode 111, both the results window and the filter window 112 are maintained as operable during processing initiated based on user 103 interaction with either window 110 or 112. When a user interacts with the filter window 112 by selecting one or more filter option(s) 118, the results window 110 automatically updates the images VIF 115(1)-(6) and remains active during subsequent user 103 interaction with the filter window 112. In addition, when a user 103 interacts with the results window, for example, by scrolling through the images VIF 115 representing the filtered records 114(1)-(6), the filter window 112 remains active and available for the user to input further filter option(s) 118, thereby refining the resulting filtered records 114. As a result, the user 103 may dynamically select filter option(s) 118 in the filter window 112 as input to initiate processing of the results window and concurrently view the results of the filter refinement in the results window without disabling the filter window 112. The records 114 therefore may be visually presented and dynamically refined to produce smaller results sets, thereby arriving at a consideration set of a manageable size for the user's 103 dynamic refinement of filtered results in a search and discovery process.
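The dual-window interoperability described above may be illustrated with a simplified sketch, assuming a hypothetical `FilterModeUI` object; in the described device 101, actual window management would be handled by the filter mode module and its results window and filter window modules.

```python
# Simplified sketch: input to either window triggers processing without
# deactivating the other window. Field names are illustrative assumptions.

class FilterModeUI:
    """Both windows remain operable during processing triggered by either."""

    def __init__(self):
        self.selections = []       # filter option selections entered so far
        self.visible_images = []   # image VIF fields shown in the results window

    def on_filter_selected(self, selection, records):
        # A selection in the filter window refines the results window; the
        # filter window stays available for further input.
        self.selections.append(selection)
        field, value = selection
        self.visible_images = [r["image_vif"] for r in records
                               if r.get(field) == value]
        return self.visible_images

ui = FilterModeUI()
records = [{"image_vif": "vif-1", "inventory": "High"},
           {"image_vif": "vif-2", "inventory": "Low"}]
ui.on_filter_selected(("inventory", "High"), records)
# ui.visible_images now holds only "vif-1"; scrolling the results window
# would not clear ui.selections, so both windows remain interoperable.
```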
From the UI display 310, the user 103 also may return to the search mode 113 by selecting any of the images VIF 115(1)-(6). The user's 103 selection may return the device 101 to the search mode 113, such as is presented on the UI display 302, based on pre-determined sorting criteria. In alternative examples, different pre-determined sorting criteria may be applied for results window selections where the search mode 113 is restarted after the device 101 has been in the filter mode 111. In the event that the UI display 310 is navigated back to the search mode 113, in one example, the filter window 112 is deactivated and therefore not available for selection by the user 103 in the search mode 113 (as shown, for example, in the search mode 113 UI display 302 of
The
In response to the user 103 selecting the “Inventory” and “High” filter option 118(1) on a second UI display 310, another filter operation may be triggered. As a result, as shown in the third UI display 320, the filtered records 114 may be iteratively filtered to generate a subsequent set of refined filtered records 114, which may change the images VIF 115(1)-(6) displayed in the results window, as well as the volume of records presented in the results statistics section 125, with, in this example, “60 results” being displayed. The UI display 320 shows the device 101 in the filter mode 111, with both the results window and the filter window 112 operable to receive further input or interactions by the user 103, such as a further selection of another filter option 118(1)-(4) or scrolling through the images VIF 115, respectively. This UI display 320 further presents the user 103 selecting filter option 118(2) “Clinical Review” and filter option selection 120(6) “High.”
In
In addition, where the total number of records 114 is within a range for which processing of the records 114, including an application of a filter operation to the records 114, may be completed within an acceptable time period in which the images 115 may be rendered on the results window 110, all of the processing may be executed locally at the device 101. In this alternative example, the device 101 does not send a request to the content server(s) 150, as the completion of processing may be managed by the computing architecture 105 of the device 101. An acceptable time period and a reasonable number of records 114 may be determined based on the specifications of the device 101 and standard processing times for local processing versus the use of computer network transmissions.
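The local-versus-remote processing decision described above may be sketched as a simple threshold test. The threshold and timing constants below are illustrative assumptions; a real device 101 would derive them from its own specifications and measured network conditions.

```python
# Minimal sketch of deciding whether to filter locally or call the server.
# All constants are assumptions for illustration only.

LOCAL_RECORD_LIMIT = 1_000          # assumed capacity for on-device filtering
PER_RECORD_LOCAL_COST_MS = 0.05     # assumed per-record local processing cost
NETWORK_ROUND_TRIP_MS = 150.0       # assumed fixed cost of a server call

def process_locally(record_count: int) -> bool:
    """Decide whether to filter on the device or call the content server(s)."""
    if record_count > LOCAL_RECORD_LIMIT:
        return False  # too many records for dynamic on-device processing
    estimated_local_ms = record_count * PER_RECORD_LOCAL_COST_MS
    # Prefer local processing whenever it beats a network round trip.
    return estimated_local_ms < NETWORK_ROUND_TRIP_MS

# A small filtered set (e.g. 60 records) is handled on the device, while a
# very large set (e.g. 100,000 records) is sent to the content server(s).
```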
A fifth UI display 340 results from the user 103 selecting filter option 118(3) and filter option selection 120(7), in which, based on the data entry to the filter window 112, the filter operation may be executed and updates to the display of images VIF 115(1)-(4) and the volume of filtered records 114 are presented. With the total volume of filtered records 114 being four (4), less than the predetermined screen capacity of images VIF 115(1)-(6), in this example, the images VIF 115 presented in the results window do not occupy the entire results window. This may be beneficial to the user 103 in visually presenting that the set of filtered records 114 is indeed small enough for consideration in a single UI display 340 without a need to scroll the images VIF 115(1)-(4). In an alternative example, the four images VIF 115(1)-(4) may be reformatted by increasing the dimensions of the images VIF 115(1)-(4) themselves to magnify each image VIF 115(1)-(4), as well as to fill the capacity of the results window. In this example, the UI display 340 may provide a different indication that the images VIF 115(1)-(4) are the final filtered records 114 and no scrolling is needed; on the other hand, scrolling may continue to be enabled for the reduced number of images VIF 115(1)-(4), such as by earlier or similar results being added to the UI display 340 and indicated as such with some indication on the images VIF 115(5)-(x) of the basis for their addition.
In addition, in an alternative example, the UI display 340 may be used to illustrate an additional interoperability of the results window and the filter window 112. In this alternative example, the results window may be operable to support the user 103 indicating one of the images VIF 115(1)-(4), such as image VIF 115(3), in order to prompt the filter window 112 to display the filter option(s) 118 and filter option selection(s) 120(1)-(x) corresponding to the indicated image VIF 115(3). In this way, the dynamic operability of the multiple windows, the results window and the filter window 112, is presented in an additional manner to provide the user 103 with filter option(s) 118 and filter option selection(s) 120(1)-(x) which correspond to a selected image VIF 115(3). One example of the indication of the image VIF 115(3) is a single tap of the touch screen 104 on the UI display 340, while a double tap may trigger a selection of the image VIF 115(3) in a manner that triggers a return by the device 101 to the search mode 113 from the filter mode 111. There are alternative approaches to supporting indications by the user 103 for the operations in these examples, such as swipe movements, hold movements, etc. In addition, in this example, there are multiple approaches to presenting the filter option selection(s) 120(1)-(x) associated with the selected image VIF 115(3), including every one of the filter option selection(s) 120(1)-(x) stored at the content server(s) 150 or a more limited set including only those filter option selection(s) 120(1)-(x) which have been activated by the user 103 previously during the user's 103 interaction with the device 101 in the filter mode 111. In either case, the device 101 may send a request to the content server(s) 150 to identify the filter option selection(s) 120(1)-(x) associated with the selected image VIF 115(3).
The result of this additional interoperability in the filter mode 111 is shown in the UI display 350, with the image VIF 115(3) shown as selected and the filter window 112 presenting the one or more filter option(s) 118(x) and filter option selection(s) 120(1)-(x) which correspond to the selected image VIF 115(3). The filter options 118(1), (2) and (3) are shown as associated with the image VIF 115(3), which has been indicated by the user 103, based on a highlighting of the image VIF 115(3) with a border around the pictorial. In this example, the filter option(s) 118(1), (2) and (3) indicated are the accumulation of the filter option selection(s) 120(3), (6) and (7) in the preceding UI displays 320, 330 and 340 presented in
The content server(s) 150 processor(s) 206 then determine, in step 416, whether there are additional filtered records 114 to process. If there are additional filtered records 114, then the filter mode module 222 continues to process the additional filtered records 114 in step 406. If the complete set of filtered records 114 is processed, i.e., the processor(s) 206, in step 416, determine that there are no more filtered records 114 to be processed, then the filter mode module 222 processes the images and filter option selection(s) 120(1)-(x) in step 418. The processing may include storage, or caching, of the images and filter option selection(s) 120 at the content server(s) 150 images and filter option selection(s) database 259, as one example. In an alternative example, the images and filter option selection(s) 120 need not be stored at the content server(s) 150 images and filter option selection(s) database 259 and may instead be transmitted solely for use by the device 101. In step 420, the images and filter option selection(s) 120(1)-(x) may be transmitted to the device 101 in response to the call by the device 101, with the transmission occurring over the network(s) 160.
In an alternative example, the content server(s) 150 may provide a set of images VIF 115(1)-(x) based on a predetermined filter option selection 119(x) to the device 101 in order to avoid the processing time of transmitting data over the computer network only upon the user 103 selection of a filter option selection 119(x). The content server(s) 150 processing may identify a filter option selection 119(x) that is commonly selected following a preceding filter option selection 119(y). Then, the content server(s) 150 may apply the predetermined filter option selection 119(x) to the records 114 in order to generate a filtered set of images VIF 115(1)-(x). This filtered set of images VIF 115(1)-(x) may then be transmitted to the device 101 so that, in the event that the user 103 chooses the filter option selection 119(x), the resulting images VIF 115(1)-(x) for presentation in the results window 110 are local at the device 101. Therefore, the processing time for the local resulting images VIF 115(1)-(x) corresponding to the predetermined filter option selection 119(x) will be shorter than the processing time where the resulting images VIF 115(1)-(x) are generated after the user 103 selection and a subsequent call to the content server(s) 150 to provide a response. In another example of this pre-processing approach, the device 101 may transmit to the content server(s) 150 a request for a pre-fetch of images VIF 115(1)-(x) associated with filtered records based on a predetermined one of the plurality of filter option selections 119(y). The device 101 may identify the predetermined one of the plurality of filter option selections 119(y), or the content server(s) 150 may transmit the predetermined one of the plurality of filter option selections 119(y) to the device 101 to make it available to the device when the user 103 selects the predetermined one of the plurality of filter option selections 119(y).
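The pre-fetch approach described above may be sketched as follows: after one filter option selection is applied, the server predicts a commonly-following selection, applies it in advance, and ships the resulting image VIF fields to the device. The co-occurrence table and record fields below are hypothetical assumptions for illustration.

```python
# Illustrative sketch of server-side pre-fetch of a predicted next selection.

# Assumed statistics: preceding selection -> most commonly following selection.
NEXT_SELECTION = {
    ("inventory", "High"): ("clinical_review", "High"),
}

def predict_next(current_selection):
    """Return the selection most commonly chosen after the current one."""
    return NEXT_SELECTION.get(current_selection)

def prefetch_images(records, current_selection):
    """Apply the predicted next selection and return its image VIF fields."""
    predicted = predict_next(current_selection)
    if predicted is None:
        return None, []
    field, value = predicted
    images = [r["image_vif"] for r in records if r.get(field) == value]
    return predicted, images

records = [
    {"image_vif": "vif-1", "clinical_review": "High"},
    {"image_vif": "vif-2", "clinical_review": "Low"},
]
predicted, images = prefetch_images(records, ("inventory", "High"))
# The device caches `images`; if the user then chooses `predicted`,
# the results are already local and no server round trip is needed.
```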
In addition, in another example of a pre-fetch operation, the device 101 can transmit a request to the content server(s) 150 for the plurality of records 114 associated with the current filter option selection 119(y) and/or its associated images VIF 115(1)-(x) in order to receive and store at the device 101 the additional information of the plurality of records 114. In this way, if the user 103 indicates a transition from the filter mode 111 and/or a selection of one of the images VIF 115(1)-(x), the device 101 may have in its local memory 107 the associated one of the plurality of records 114 for that selected one of the images VIF 115(1)-(x) and can readily, with minimal processing time, display the additional information in the results window 110.
In the context of software, the blocks represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processing units (such as hardware microprocessors), perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, components, data structures, and the like, that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation to embodiments of the invention. A client device, a remote content-item service, or both, may implement the described processes.
In one example of the process 600, at step 602, the device 101 processor(s) 106 determines the mode of the device 101 based on input by the user 103, which is detected by the user interface module 116. That is, the user interface module 116 determines whether the device 101 is in the filter mode 111 or the search mode 113. When the device 101 is in the filter mode 111, the filter mode module 122 is activated at step 606. When the device 101 is in the search mode 113, the search mode module 128 is activated at step 604.
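The mode determination in step 602 may be sketched as a simple dispatch. The mode labels and return values below are illustrative stand-ins for the filter mode module 122 and search mode module 128 activations.

```python
# Minimal sketch of the step 602 mode dispatch described above.

FILTER_MODE = "filter"   # filter mode 111
SEARCH_MODE = "search"   # search mode 113

def activate_mode(mode: str) -> str:
    """Return which module is activated for the detected device mode."""
    if mode == FILTER_MODE:
        return "filter_mode_module_122"   # step 606
    if mode == SEARCH_MODE:
        return "search_mode_module_128"   # step 604
    raise ValueError(f"unknown mode: {mode}")
```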
The filter mode 111 operations are shown in steps 606 to 636. At steps 620 and 630, the filter mode module 122 activates both the results window module 124 and the filter window module 126 concurrently to support dynamic and mutual processing by the results window module 124 and the filter window module 126, respectively.
Continuing with the results window module 124 processing, at step 622, the module 124 enables the user 103 to execute a scrolling operation on the presentation of the filtered records 114, such as, for example, by interacting with the device 101 display 102 and touch screen 104 based on a number of options, including touch and swipe, touch and hold, etc., to move a given image VIF 115(x) along the path of presentation of the images VIF 115(1)-(x). In the illustration shown in
The filter mode module 122 also processes the results window to detect other user 103 interactions. For example, another option for the user 103 to interact with the images VIF 115(1)-(x) is to indicate, tap or double-tap on a given image VIF 115(x), with a double-tap being shown as one example in step 624. Continuing with this example, a double-tap may indicate that the image VIF 115 is selected in order to trigger a transition from the filter mode 111 to the search mode 113 of the device 101. There are alternative interactions which the user 103 may have with the images VIF 115, such as those described above for the scrolling operation. If the user 103 interaction is not a double-tap, then the processor(s) 106 may determine that a double-tap of the image VIF 115 (or record 114) has not occurred, in which case the processing 600 returns to the start of the filter mode in step 606. In further alternative embodiments, other user 103 interactions may trigger other pre-determined actions with respect to the images VIF 115, such as, for example, as is illustrated and described regarding FIG. 3B UI display 350. A number of approaches and operations may be associated with interactions by the user 103 with the images VIF 115.
The filter mode 111 activation in step 606 also triggers the filter mode module 122 to process the filter window 112 at the same time as the results window, as described regarding
In one example, where a double-tap is pre-determined to initiate a transition from the filter mode 111 to the search mode 113, the processing 600 may continue as shown in step 624, where the user interface module 116 may detect the double-tap and the processor(s) 106 may then determine that a double-tap of the image VIF 115 (or record 114) has occurred. Based on a double-tap occurrence in step 624, the processing 600 in step 628 then returns to the determination by the processor(s) 106, in step 602, as to the mode, either the filter mode 111 or the search mode 113, of the device 101.
From the step 602, where the device 101 is in the search mode 113, the search mode module 128 is activated at step 604. The search mode 113 operations are shown in steps 608 to 610. There are a number of approaches to initiating the search mode 113 for the device 101: for example, when the device 101 is initialized, the search mode 113 may be the default mode, or there may be multiple user 103 interactions which may prompt the search mode 113, such as the double-tap of an image VIF 115, as described above in step 624.
Upon activation of the search mode 113 in step 604, the search mode module 128 initiates the display of records 250 on the display 102 of the device 101. This is described in detail regarding
Then, in step 412, the filter mode module 222 combines multiple data fields from separate databases to generate in step 414 a new data structure, such as the image VIF 115 field of the record 114 (from the images database 254) combined with the filter option selection(s) 120(1)-(x) (from the filter option selections database 260) associated with the image VIF 115.
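The combination in steps 412 and 414 may be sketched as a join of two database fields into one transmit-ready structure. The database contents and key names below are illustrative assumptions standing in for the images database 254 and the filter option selections database 260.

```python
# Sketch of steps 412-414: for each record, the image VIF field is combined
# with its associated filter option selections into a new data structure.

images_db = {101: "vif-101.png", 102: "vif-102.png"}     # record id -> image VIF
selections_db = {101: [("inventory", "High")], 102: []}  # record id -> selections

def build_image_selection_records(record_ids):
    """Combine the two database fields into one transmit-ready structure."""
    return [
        {"record_id": rid,
         "image_vif": images_db[rid],
         "filter_option_selections": selections_db.get(rid, [])}
        for rid in record_ids
    ]

combined = build_image_selection_records([101, 102])
# Each entry pairs the image VIF with its filter option selections,
# ready for transmission to the device in step 420.
```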
Although the subject matter has been described in language specific to structural features, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features described. Rather, the specific features are disclosed as illustrative forms of implementing the claims.