This application is based upon and claims the benefit of priority of the prior Japanese Patent Application No. 2013-272064, filed on Dec. 27, 2013, the entire contents of which are incorporated herein by reference.
The embodiment discussed herein is related to a method for inter-gadget display cooperation and an information processing apparatus.
In recent years, Linked Data has been actively used as a technique for publishing data on the Web. Linked Data is a scheme that uses the Web as a global data space. While the current Web mainly functions as “a Web of documents for human readers,” Linked Data is often described as “a Web of data for machine processing.”
In screen generation on the Web, the contents of a screen are not only described directly in HTML and the like; a screen may also be generated so that data extracted from a database is displayed in the form of a graph. By using gadgets, pieces of data acquired from a plurality of databases associated by Linked Data can be displayed side by side on a single screen.
Thus, displaying pieces of data associated by Linked Data side by side can support information analysis and the like. In this regard, development of a platform for publishing data based on Linked Data is being pursued.
Non-patent Literature 1: Igata, Nishino, Kume, Matsuzuka, “Linked Data wo mochiita joho togo/katsuyo gijutsu (Information Integration and Utilization Technology Using Linked Data)”, FUJITSU. 64, 5 (September 2013)
Non-patent Literature 2: “Information Workbench” retrieved from the Internet on Dec. 4, 2013 <URL:http://www.fluidops.com/information-workbench/>
According to an aspect of an embodiment, a method for inter-gadget display cooperation includes acquiring a first query processing result using a first gadget to which first query processing is allocated; acquiring a second query processing result using a second gadget to which second query processing is allocated; and applying a common display mode, in a display corresponding to the first gadget and in a display corresponding to the second gadget, to objects that are included in the acquired first query processing result and second query processing result and whose display modes are to be made common between the first gadget and the second gadget.
The object and advantages of the invention will be realized and attained by means of the elements and combinations particularly pointed out in the claims.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory and are not restrictive of the invention, as claimed.
Preferred embodiments of the present invention will be explained with reference to accompanying drawings. It is to be noted that these embodiments are not intended to limit the disclosed technology.
In the conventional display technology, each gadget operates independently, which makes it difficult to grasp the correspondence relation among a plurality of gadgets in a display that uses these gadgets. The term “gadget” herein refers to a component that extracts data from a database and processes and displays the extracted data.
For example, in
The configuration of an information processing apparatus according to an embodiment will be described. The information processing apparatus herein is a Web client for browsing a Web screen.
The cooperation unit 10 has a control unit 10a that performs control and a storage unit 10b that stores data for use in control and the like. The control unit 10a includes a query execution unit 11, an adjustment part reception unit 13, an execution synchronization unit 14, an adjustment value determination unit 15, an adjustment value instruction unit 16, and a screen generation adjustment unit 17. The storage unit 10b includes an execution result temporary storage 12.
The query execution unit 11 executes a query and stores a query execution result in the execution result temporary storage 12. Specifically, the query execution unit 11 executes a query so as to acquire data from a database and to generate screen data. The query execution unit 11 then stores the generated screen data in the execution result temporary storage 12.
More specifically, the query execution unit 11 receives a focus and a standpoint from a user, identifies a screen template corresponding to the standpoint, and executes gadgets included in the identified screen template to acquire data from databases.
The focus herein refers to an entity (substance) of interest, such as a company name, a person's name, a technical term, and an event. The standpoint signifies how the entity is observed. There are various standpoints for one entity. For example, in procurement, an orderer and an order receiver are standpoints. In product information, a manufacturer is a standpoint. The screen template is information which defines positions of graphs and maps on a screen and gadgets that display the graphs and the maps.
The query execution unit 11 searches databases relating to the entity specified as the focus, on the basis of the specified standpoint, so as to acquire the data. A query for database retrieval is defined in association with each gadget. For example, if an organization name “X” is set as the focus, the “ordering organization” in procurement is set as the standpoint, and “retrieve order quantity per ordering company” is specified as the query, the query execution unit 11 searches databases relating to procurement by the organization “X” from the standpoint of the “ordering organization,” and acquires data regarding the order quantity per ordering company. For database retrieval, the query execution unit 11 makes a request via a network to an information processing apparatus that stores the databases.
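Purely as an illustration, the following TypeScript sketch shows one way a gadget-bound query could be parameterized by a focus and a standpoint before being sent to a database; the type and function names are hypothetical and are not taken from the embodiment.

```typescript
// Hypothetical types: a focus entity, a standpoint, and a gadget with an
// associated query template.
interface Focus { label: string }                  // e.g. the organization name "X"
interface Standpoint { id: string; name: string }  // e.g. "ordering organization" in procurement

interface Gadget {
  id: string;
  // Query template associated with the gadget; the concrete query language
  // (SQL, SPARQL, ...) is not prescribed by this sketch.
  queryTemplate: string;  // e.g. "retrieve order quantity per ordering company for {focus}"
}

// Build the concrete query for a gadget from the user-specified focus.
function buildQuery(gadget: Gadget, focus: Focus, standpoint: Standpoint): string {
  // The standpoint has already selected the screen template (and thus the
  // gadget set); it is appended here only as a comment for traceability.
  return gadget.queryTemplate.replace("{focus}", focus.label)
    + ` /* standpoint: ${standpoint.name} */`;
}
```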
The execution result temporary storage 12 stores screen data generated by the query execution unit 11. The adjustment part reception unit 13 receives from a user an adjustment part specification that specifies items subjected to inter-gadget adjustment, and stores the specification in the storage unit in an extensible markup language (XML) format. The adjustment part reception unit 13 may use JavaScript (registered trademark) object notation (JSON) format instead of the XML format.
The adjustment ID is an identifier which identifies an adjustment part specification. The adjustment name is a name of the adjustment part specification. The adjustment ID is provided for machine processing, while the adjustment name is provided for users to determine the contents of adjustment. The adjustment gadget is a gadget subjected to adjustment. The adjustment item is an object subjected to adjustment, such as colors of respective segments of a pie chart and a value range of graph axes.
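As a concrete illustration, an adjustment part specification carrying these four items could be expressed as follows; the embodiment stores the equivalent information in XML (or JSON), and the field names below are assumptions made for this sketch.

```typescript
// Illustrative shape of an adjustment part specification.
interface AdjustmentPartSpec {
  adjustmentId: string;                                // identifier for machine processing
  adjustmentName: string;                              // human-readable name of the adjustment
  adjustmentGadgets: string[];                         // gadgets subjected to adjustment
  adjustmentItem: "color" | "axisRange" | "mapScale";  // object subjected to adjustment
}

const spec: AdjustmentPartSpec = {
  adjustmentId: "adjust-001",
  adjustmentName: "Unify pie-chart segment colors",
  adjustmentGadgets: ["gadget-1", "gadget-2"],
  adjustmentItem: "color",
};
```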
In the example illustrated in
The execution synchronization unit 14 synchronizes query executions. Specifically, the execution synchronization unit 14 checks whether or not execution results of a plurality of gadgets specified in the adjustment part specification have been obtained, and synchronizes executions of the plurality of gadgets.
More specifically, the execution synchronization unit 14 performs this check by confirming whether or not the execution results have been stored in the execution result temporary storage 12. Alternatively, the execution synchronization unit 14 may check whether or not the execution results of the gadgets have been obtained by receiving from each of the gadgets a notification indicating whether or not query execution has been completed. In either case, the execution synchronization unit 14 checks, for the target adjustment, whether or not the retrievals by the gadgets subjected to adjustment have been completed, and waits until all of these retrievals are complete.
A user may interactively change or add gadgets, so some of the gadgets to be adjusted may be inactive. An inactive state is therefore provided in addition to the execution completion state and the executing state, and the execution synchronization unit 14 determines completion of execution only for the gadgets that are not in the inactive state and synchronizes them.
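A minimal sketch of this check, assuming a simple state model and a view of the result store (both of which are illustrative, not part of the embodiment), might look as follows:

```typescript
// Hypothetical state model: a gadget is executing, completed, or inactive.
type GadgetState = "executing" | "completed" | "inactive";

// Hypothetical view of the execution result temporary storage.
interface ExecutionStore {
  stateOf(gadgetId: string): GadgetState;
  resultOf(gadgetId: string): unknown | undefined;
}

// Executions are considered synchronized when every adjusted gadget that is
// not inactive has a stored execution result.
function allAdjustedGadgetsReady(adjustedGadgets: string[], store: ExecutionStore): boolean {
  return adjustedGadgets
    .filter((id) => store.stateOf(id) !== "inactive")  // ignore inactive gadgets
    .every((id) => store.resultOf(id) !== undefined);  // wait for all the others
}
```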
The adjustment value determination unit 15 determines colors to be adjusted and scales (axis value ranges) to be adjusted, on the basis of the adjustment part specification received by the adjustment part reception unit 13. The adjustment value determination unit 15 may also determine adjusted reduced scales of maps.
a) illustrates a query result of a first gadget. In
The adjustment value determination unit 15 adjusts colors of the items by collating the same items in two gadgets with each other.
The adjustment value determination unit 15 then allocates colors to the merged items in an appearance order as illustrated in
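For illustration only, allocating colors to the merged items in appearance order from a predetermined palette could be sketched as follows; the palette and function name are assumptions.

```typescript
// Allocate colors to the merged items in their appearance order from a
// predetermined palette (wrapping around if there are more items than colors).
function allocateByAppearance(mergedItems: string[], palette: string[]): Map<string, string> {
  const colors = new Map<string, string>();
  mergedItems.forEach((item, index) => {
    colors.set(item, palette[index % palette.length]);
  });
  return colors;
}

// Example: items merged from two gadgets in appearance order A, B, C, D, E.
// allocateByAppearance(["A", "B", "C", "D", "E"],
//   ["#1f77b4", "#ff7f0e", "#2ca02c", "#d62728", "#9467bd"]);
```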
The adjustment value determination unit 15 may perform color adjustment by using hash codes. For example, the adjustment value determination unit 15 allocates colors to the query result of the first gadget as follows:
A: 10 → color.hash(A)
B: 9 → color.hash(B)
Here, color.hash(A) represents the color corresponding to the hash code of item A.
The adjustment value determination unit 15 likewise allocates colors to the query result of the second gadget as follows:
B: 7 → color.hash(B)
E: 5 → color.hash(E)
For example, when values of RGB are expressed by 00 to FF in hexadecimals, color values are determined to be in the range of 000000 to FFFFFF in hexadecimals. Accordingly, the hash values are set to be within this range. To avoid white and black colors, conditions may be added to the color range. Moreover, the number of colors (for example, 32 colors) to be used may be determined in advance, and color allocation may be performed in this range.
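The following sketch shows one way to realize such a mapping in TypeScript: a hash value is folded into the 000000 to FFFFFF range while avoiding pure black and white, or allocation is restricted to a predetermined palette. The hash function here is a simple illustrative one, not the one used in the embodiment.

```typescript
// Simple illustrative hash (not the one in the embodiment).
function hashCode(item: string): number {
  let h = 0;
  for (let i = 0; i < item.length; i++) {
    h = (h * 31 + item.charCodeAt(i)) >>> 0;  // unsigned 32-bit accumulation
  }
  return h;
}

// Fold the hash into 000001..FFFFFE so that pure black and pure white are avoided.
function colorFromHash(item: string): string {
  const value = (hashCode(item) % 0xfffffd) + 1;
  return "#" + value.toString(16).padStart(6, "0");
}

// Alternative: allocate from a predetermined set of colors (for example, 32 colors).
function colorFromPalette(item: string, palette: string[]): string {
  return palette[hashCode(item) % palette.length];
}
```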
The adjustment value instruction unit 16 instructs to the gadgets the adjusted colors, the adjusted scales, the adjusted reduced scales of maps, and the like, which have been determined by the adjustment value determination unit 15. For example, when the colors have been adjusted by the adjustment value determination unit 15, the adjustment value instruction unit 16 instructs the determined colors to the gadgets, so that the colors are passed on to the graph plotting unit that plots the graphs.
As illustrated in
When the graph plotting unit has a default color order, the adjustment value instruction unit 16 instructs an order to the gadgets. The adjustment value instruction unit 16 sets the order of appearance of the items to be identical among the plurality of gadgets, and embeds an undefined value for items without a value; for example, 0 is used as the undefined value.
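A minimal sketch of this alignment, with hypothetical names, is given below; every gadget emits the merged items in the same order, and 0 is embedded where a gadget has no value.

```typescript
// Emit one gadget's values in the common item order; 0 is embedded for items
// the gadget has no value for, so the default color order stays aligned.
function alignToCommonOrder(
  commonOrder: string[],               // merged items in a fixed appearance order
  gadgetValues: Map<string, number>,   // this gadget's item -> value mapping
): number[] {
  return commonOrder.map((item) => gadgetValues.get(item) ?? 0);
}

// Example: common order A, B, E; this gadget only has B=7 and E=5.
// alignToCommonOrder(["A", "B", "E"], new Map([["B", 7], ["E", 5]])) yields [0, 7, 5].
```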
When colors are determined by using hash codes, the adjustment value instruction unit 16 instructs the colors determined by the hash codes to the gadgets.
The screen generation adjustment unit 17 adjusts screen data, such as graph representation and a display of maps, on the basis of the instruction by the adjustment value instruction unit 16.
A description will now be given of a flow of inter-gadget display cooperation processing performed by the cooperation unit 10.
Once the execution synchronization unit 14 synchronizes query executions (step S3) and achieves synchronization, the adjustment value determination unit 15 determines adjustment values on the basis of the adjustment part specification received by the adjustment part reception unit 13 (step S4).
Then, the adjustment value instruction unit 16 instructs adjustment of the values to the plurality of gadgets which are subjected to adjustment (step S5), and the screen generation adjustment unit 17 adjusts screen generation data on the basis of the instruction by the adjustment value instruction unit 16 (step S6).
Thus, the adjustment value instruction unit 16 instructs adjustment of the values to the plurality of gadgets on the basis of the values determined by the adjustment value determination unit 15. As a result, the cooperation unit 10 can achieve inter-gadget display cooperation.
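Purely as an illustration of this flow, the cooperation could be orchestrated as in the following sketch, where the unit interfaces are assumptions introduced for readability:

```typescript
// Assumed minimal interfaces for the units involved in steps S3 to S6.
interface AdjustmentSpec { adjustmentGadgets: string[] }

async function cooperate(
  spec: AdjustmentSpec,
  sync: { waitFor(gadgetIds: string[]): Promise<void> },
  determiner: { determine(spec: AdjustmentSpec): Map<string, string> },
  instructor: { instruct(gadgetIds: string[], values: Map<string, string>): void },
  screen: { adjustGeneration(): void },
): Promise<void> {
  await sync.waitFor(spec.adjustmentGadgets);           // step S3: synchronize query executions
  const values = determiner.determine(spec);            // step S4: determine adjustment values
  instructor.instruct(spec.adjustmentGadgets, values);  // step S5: instruct the adjusted gadgets
  screen.adjustGeneration();                            // step S6: adjust screen generation data
}
```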
A description will now be given of a flow of query execution processing performed by the query execution unit 11.
The query execution unit 11 determines a screen template associated with the standpoint (step S13), and determines a plurality of gadgets included in the screen template (step S14).
The query execution unit 11 then acquires a query for one of the gadgets (step S15), and executes the acquired query (step S16) to generate gadget screen data (step S17). The query execution unit 11 repeats the processing of steps S15 to S17 as many times as there are gadgets. The query execution unit 11 then generates page screen data (step S18), and stores the data in the execution result temporary storage 12.
Thus, the query execution unit 11 can generate the screen data about a screen displayed on the display apparatus by executing the plurality of gadgets included in the screen template associated with the standpoint.
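The following sketch outlines this per-gadget loop; the database, rendering, and storage interfaces are hypothetical stand-ins for the corresponding parts of the embodiment.

```typescript
// Hypothetical shapes for a screen template and its gadgets.
interface GadgetDef { id: string; query: string }
interface ScreenTemplate { gadgets: GadgetDef[] }

async function executeQueries(
  template: ScreenTemplate,
  db: { run(query: string): Promise<unknown> },               // database retrieval
  render: {
    gadgetScreen(gadget: GadgetDef, data: unknown): string;   // per-gadget screen data
    pageScreen(parts: string[]): string;                      // whole-page screen data
  },
  store: { save(pageData: string): void },                    // execution result temporary storage
): Promise<void> {
  const parts: string[] = [];
  for (const gadget of template.gadgets) {         // steps S15 to S17, repeated per gadget
    const data = await db.run(gadget.query);       // acquire and execute the gadget's query
    parts.push(render.gadgetScreen(gadget, data)); // generate gadget screen data
  }
  store.save(render.pageScreen(parts));            // step S18: generate and store page screen data
}
```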
A description will now be given of a flow of adjustment value determination processing performed by the adjustment value determination unit 15.
As illustrated in
If any of the execution results of the gadgets subjected to adjustment has not yet been read, the adjustment value determination unit 15 determines whether or not a target gadget is active (step S22). When the target gadget is not active, display by the gadget is not performed, and so the processing returns to step S21.
Contrary to this, if the target gadget is active, the adjustment value determination unit 15 reads the execution result of the gadget (step S23), and executes merge list addition processing configured to add the read execution result to a merge list (step S24). Then, the adjustment value determination unit 15 returns to step S21.
When there is no row to be merged, the adjustment value determination unit 15 ends the processing, whereas when a row to be merged remains, the adjustment value determination unit 15 determines whether or not the item to be merged has already appeared (step S33).
When the item to be merged has not yet appeared, the adjustment value determination unit 15 adds a row and sets the item (step S34). The adjustment value determination unit 15 then adds a value to the new column (step S35), and returns to step S32.
Thus, the adjustment value determination unit 15 can collate the same items with one another among the gadgets by merging the execution results of the gadgets.
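A compact sketch of this merge, with assumed data shapes, is given below; the execution results of the adjusted gadgets are merged into one list keyed by item, with one value column per gadget, so that the same items are collated with one another.

```typescript
// One gadget's execution result: rows of (item, value).
type ExecutionResult = Array<{ item: string; value: number }>;

// Merge the execution results into one list keyed by item; the insertion
// order of the Map preserves the appearance order of the items.
function buildMergeList(results: ExecutionResult[]): Map<string, number[]> {
  const merged = new Map<string, number[]>();
  results.forEach((result, gadgetIndex) => {
    for (const row of result) {
      if (!merged.has(row.item)) {
        // The item has not appeared yet: add a row, defaulting every column to 0.
        merged.set(row.item, new Array(results.length).fill(0));
      }
      merged.get(row.item)![gadgetIndex] = row.value;  // set the value in this gadget's column
    }
  });
  return merged;
}
```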
A description will now be given of a flow of adjustment value determination processing performed by using hash codes.
As illustrated in
If any of the execution results of the gadgets subjected to adjustment has not yet been read, the adjustment value determination unit 15 determines whether or not a target gadget is active (step S42). If the target gadget is not active, a display by the gadget is not performed, and the processing returns to step S41.
Contrary to this, if the target gadget is active, the adjustment value determination unit 15 reads the execution result of the gadget (step S43), and executes hash code color determination processing configured to determine the colors of the items included in the read execution result by using hash codes (step S44). Then, the adjustment value determination unit 15 returns to step S41.
When there is no row in the read execution result, the adjustment value determination unit 15 ends the processing, whereas when a row remains in the read execution result, the adjustment value determination unit 15 obtains a hash value from the item in that row (step S52).
The adjustment value determination unit 15 determines whether or not the obtained hash value has already appeared (step S53). If the hash value has already appeared, the adjustment value determination unit 15 determines whether or not the item is identical to the item that produced it (step S54). If the item is not identical, a hash collision has occurred, and therefore the adjustment value determination unit 15 performs rehashing (step S56), and the processing returns to step S53.
When the item is identical, the adjustment value determination unit 15 determines a color from the hash value (step S55), and the processing returns to step S51. Likewise, when the hash value has not appeared before, the adjustment value determination unit 15 determines a color from the hash value (step S55), and the processing returns to step S51.
Thus, the adjustment value determination unit 15 obtains hash values from the items and determines colors from the obtained hash values, so that the colors of the items can be unified among the gadgets.
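The following sketch illustrates this collision handling; the hash and rehash functions are simple assumptions chosen for the example, not the ones used in the embodiment.

```typescript
// Determine a color per item from its hash value, rehashing when two
// different items collide on the same hash value.
function determineColors(items: string[]): Map<string, string> {
  const owners = new Map<number, string>();  // hash value -> item that owns it
  const colors = new Map<string, string>();

  // Illustrative hash: fold the item (plus a salt for rehashing) into 000000-FFFFFF.
  const hash = (item: string, salt: number): number => {
    let h = salt;
    for (let i = 0; i < item.length; i++) h = (h * 31 + item.charCodeAt(i)) >>> 0;
    return h % 0x1000000;
  };

  for (const item of items) {
    let salt = 0;
    let h = hash(item, salt);
    while (owners.has(h) && owners.get(h) !== item) {
      h = hash(item, ++salt);  // collision with a different item: rehash
    }
    owners.set(h, item);
    colors.set(item, "#" + h.toString(16).padStart(6, "0"));
  }
  return colors;
}
```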
As described above, in the embodiment, the query execution unit 11 executes the plurality of gadgets that form a screen so as to retrieve data from databases, and stores the query results in the execution result temporary storage 12. The adjustment value determination unit 15 then reads the query results from the execution result temporary storage 12, and adjusts the colors of items, the scales of graphs, the reduced scales of maps, and the like, which are subjected to adjustment among the gadgets. Then, the adjustment value instruction unit 16 instructs the adjustment values to the gadgets, and the screen generation adjustment unit 17 adjusts the screen, including graphs, maps, and the like, on the basis of the adjustment values. Thus, the cooperation unit 10 can achieve display cooperation among the gadgets.
Although the cooperation unit 10 has been described in the embodiment, a program for inter-gadget display cooperation having the same functions may be obtained by implementing the configuration of the cooperation unit 10 in the form of software. Accordingly, a computer that executes the program for inter-gadget display cooperation will be described.
The main memory 31 is a memory that stores programs, intermediate results of executing the programs, and the like. The CPU 32 is a central processing unit that reads out a program from the main memory 31 and executes the program. The CPU 32 includes a chip set having a memory controller.
The LAN interface 33 is configured to connect the computer 3 to other computers via the LAN. The HDD 34 is a disk unit that stores programs and data. The super IO 35 is an interface for connecting input devices such as a mouse and a keyboard. The DVI 36 is an interface that connects a liquid crystal display. The ODD 37 is a device that performs read and write access to DVDs. A screen on which inter-gadget display cooperation has been achieved is displayed on the liquid crystal display.
The LAN interface 33 is connected to the CPU 32 through PCI Express (PCIe), while the HDD 34 and the ODD 37 are connected to the CPU 32 through serial advanced technology attachment (SATA). The super IO 35 is connected to the CPU 32 through a low pin count (LPC) interface.
The program for inter-gadget display cooperation executed in the computer 3 is stored on a DVD, read out from the DVD by the ODD 37, and installed in the computer 3. Alternatively, the program for inter-gadget display cooperation is stored in databases and the like of other computer systems connected via the LAN interface 33, read out from these databases, and installed in the computer 3. The installed program for inter-gadget display cooperation is stored in the HDD 34 and is read into the main memory 31 so as to be executed by the CPU 32.
Although the case where the gadgets display graphs and the like on the screen has been described in the embodiment, the present invention is not limited thereto. The present invention is similarly applicable to the case of achieving cooperation of other outputs, such as the gadgets outputting data to paper, i.e. achieving cooperation of outputs including display and printing among a plurality of gadgets.
Although the case of achieving display cooperation among a plurality of gadgets on one screen has been described in the embodiment, the present invention is not limited thereto. The present invention is similarly applicable to the case of achieving display cooperation among a plurality of gadgets on different screens.
According to one embodiment, it becomes possible to easily grasp the correspondence relation among a plurality of gadgets in a display that uses these gadgets.
All examples and conditional language recited herein are intended for pedagogical purposes of aiding the reader in understanding the invention and the concepts contributed by the inventor to further the art, and are not to be construed as limitations to such specifically recited examples and conditions, nor does the organization of such examples in the specification relate to a showing of the superiority and inferiority of the invention. Although the embodiment of the present invention has been described in detail, it should be understood that the various changes, substitutions, and alterations could be made hereto without departing from the spirit and scope of the invention.