The present application claims priority to Chinese Patent Application No. 201310140770.1, filed on Apr. 22, 2013 and entitled “SYSTEM AND METHOD FOR AUTOMATICALLY FORMING HUMAN-MACHINE INTERFACE”, the entire disclosure of which is incorporated herein by reference.
The present disclosure generally relates to computer software, and more particularly, to a method and a system for automatically generating a human-machine interface.
Nowadays, the Model-View-Controller (MVC) software design pattern has been widely used in various enterprise applications due to its fast deployment and low life-cycle cost.
The MVC design pattern separates the design of functional modules from that of display modules, such that some programmers (such as Java programmers) may focus on developing business logic, while interface programmers (such as HTML and JSP programmers) may focus on presentation.
As shown in the accompanying drawings, a typical MVC design pattern includes a model layer, a view layer and a controller layer.
Regarding the view layer, complex human-machine interfaces are required to facilitate the mass data input operations which commonly exist in typical enterprise applications. As a result, even with the help of the MVC design pattern, developers still have to put a lot of effort into designing these interfaces when developing an enterprise application. Further, human-machine interfaces developed in such a pattern may have drawbacks such as repeated manual labor, poor error correction, low reusability, high maintenance cost, and the like.
Therefore, methods for effectively reducing the workload of developing the view layer under the MVC design pattern are needed, such that programmers can focus more on developing business logic.
According to one embodiment, a method for automatically forming a human-machine interface is provided. The method may include:
defining a model object based on contents to be illustrated in the human-machine interface, where the model object may include at least one model component having a one-to-one mapping relationship with metadata in a database;
establishing a model-view corresponding to the defined model object, where the model-view may include at least one model-view element, each of which has a mapping relationship with one of the at least one model component; and
analyzing the defined model object and the established model-view based on a predetermined syntax rule to form a model object configuration file and a human-machine interface configuration file, where the model object configuration file is adapted to provide mapping between the defined model object and the database, and the human-machine interface configuration file is adapted to illustrate the model object using the corresponding model-view.
In some embodiments, the model-view element may include at least one of a model-view element for displaying, a model-view element for creating, a model-view element for updating, a model-view element for searching, a model-view element for listing and a model-view element for item illustration.
In some embodiments, defining the model object may further include:
defining an association relationship between model components, where an illustration pattern for illustrating the defined model object in the human-machine interface corresponds to the defined association relationship, and the association relationship is represented in the human-machine interface configuration file after the analysis.
In some embodiments, the association relationship between model components may include one or more of an aggregation relationship, a composition relationship and a multiplicity relationship.
In some embodiments, the illustration pattern of the human-machine interface may include one or more of link, pull-down list, data table, embedded form and form.
In some embodiments, the method may further include:
defining an operation button corresponding to a particular operation on a model component, where the particular operation corresponds to the model-view element; and
analyzing the operation button and representing, in the human-machine interface configuration file, the relationship between the operation button and the model component and the model-view element.
In some embodiments, the particular operation may include one or more of display, create, update, delete and search.
In some embodiments, establishing the model-view corresponding to the defined model object may further include:
describing a property of the model-view element, where an illustration pattern for illustrating the model-view element in the human-machine interface corresponds to the property, and the property is represented in the human-machine interface configuration file after the analysis.
In some embodiments, the property may include one or more of button layout, button grouping, group embedding, button pattern, read-only, data format and event expression format.
In some embodiments, defining the model object and establishing the model-view corresponding to the defined model object are implemented using a natural expression language script.
In some embodiments, the method may further include:
before defining the model object and establishing the model-view corresponding to the defined model object, determining a syntax rule of the natural expression language, where the predetermined syntax rule may include the syntax rule of the natural expression language.
In some embodiments, the human-machine interface configuration file may include at least one kind of the following: JAVA class code, an object relational mapping file and a JAVA document.
According to one embodiment, a system for automatically forming a human-machine interface is provided. The system may include:
a model object unit, adapted to define a model object based on contents to be illustrated in the human-machine interface, where the model object may include at least one model component having a one-to-one mapping relationship with metadata in a database;
a model-view unit, adapted to establish a model-view corresponding to the defined model object, where the model-view may include at least one model-view element, each of which has a mapping relationship with one of the at least one model component; and
an analysis configuration unit, adapted to analyze the defined model object and the established model-view based on a predetermined syntax rule to form a model object configuration file and a human-machine interface configuration file, where the model object configuration file is adapted to provide mapping between the defined model object and the database, and the human-machine interface configuration file is adapted to illustrate the model object using the corresponding model-view.
In some embodiments, the model-view element may include at least one of a model-view element for displaying, a model-view element for creating, a model-view element for updating, a model-view element for searching, a model-view element for listing and a model-view element for item illustration.
In some embodiments, the model object unit may further include:
an association unit, adapted to define an association relationship between model components, where an illustration pattern for illustrating the defined model object in the human-machine interface corresponds to the defined association relationship,
where the analysis configuration unit is further adapted to analyze the association relationship, and the association relationship is represented in the human-machine interface configuration file after the analysis.
In some embodiments, the system may further include:
an operation button unit, adapted to define an operation button corresponding to a particular operation on a model component, where the particular operation corresponds to the model-view element,
where the analysis configuration unit is further adapted to analyze the operation button and represent, in the human-machine interface configuration file, the relationship between the operation button and the model component and the model-view element.
In some embodiments, the model-view unit may further include:
a property describing unit, adapted to describe a property of the model-view element, where an illustration pattern for illustrating the model-view element in the human-machine interface corresponds to the property,
where the analysis configuration unit is further adapted to analyze the property and the property is represented in the human-machine interface configuration file after the analysis.
Embodiments of the present disclosure may have the following advantages.
By defining a model object and establishing a corresponding model-view, a configuration file for illustrating a view corresponding to the model object may be automatically formed based on analysis. Therefore, the amount of code requiring manual input may be reduced.
A model-view layer is introduced between a model layer and a view layer, such that a relatively high level of coupling between fields in the model layer and the view layer may be replaced by a relatively low level of coupling between the model layer and the model-view layer. Therefore, button universality and expansibility may be increased.
In some embodiments, operation buttons may be provided, such that the model layer and the view layer can be associated through actions. Each kind of the actions may be expressed using a corresponding model-view element. Therefore, the coupling level between the model and the view may be further reduced. Furthermore, difficulties in realizing universal illustration buttons may be reduced, thus universality and usability may be improved.
FIGS. 14a and 14b schematically illustrate rendered human-machine interfaces according to one embodiment of the present disclosure.
Detailed exemplary embodiments will be described hereinafter to provide a thorough understanding of the present disclosure. Nevertheless, the present disclosure may be implemented in embodiments other than those described hereinafter. Those skilled in the art can make variations and modifications without departing from the scope of the disclosure. Therefore, the present disclosure is not limited to the embodiments disclosed hereinafter.
Besides, embodiments of the disclosure will be described in detail in combination with the accompanying drawings. It should be noted that the accompanying drawings are merely examples for illustrating embodiments of the disclosure, and should not limit the scope of the present disclosure.
After analyzing a large number of cases of developing enterprise applications, the inventors found that remarkable labor may be wasted in repeatedly developing human-machine interfaces of enterprise applications using conventional design modes, the reason of which may lie in a high coupling degree between the view layer and the model layer in the conventional modes. Specifically, contents illustrated by the view layer are bound to particular fields in the model layer. As a result, even if different views illustrate the same contents, repeated binding with the field(s) in the model layer is still required, since the same contents are illustrated in different views. For example, a user may require a client record to be presented in a first page, and a new client record, updated by adding a new client, to be presented in a second page (such as illustrating the result of an adding operation using an Add button). Although the contents illustrated by both the first page and the second page correspond to a field containing the client name/ID in the model layer, a binding between the view layer and the client name/ID field in the model layer is necessary for each of the first page and the second page, since the second page is a new page triggered by the Add button.
In order to improve the reusability of resulting interfaces, the inventors attempted to reduce the coupling degree between the view layer and the model layer. By analyzing routine work in human-machine interface development, it can be concluded that human-machine interfaces are normally required to provide several operations, which may be summarized into the following six modes:
(1) Providing interface information for displaying a backend model object, which may be referred to as Display;
(2) Providing interface information for creating a new model object, which may be referred to as Create;
(3) Providing interface information for updating an existing model object, which may be referred to as Update;
(4) Providing interface information for searching for a model object, which may be referred to as Search;
(5) Providing interface information for listing a collection of model objects, which may be referred to as List; and
(6) Providing the most characteristic item information of a model object, which may be referred to as Item.
As described above, the number of operations required to be provided by interfaces is normally limited. Therefore, based on the different required operations, a model-view layer “Model-View” may be introduced into a conventional MVC design pattern and disposed between the view layer and the model layer, as shown in the accompanying drawings.
In some embodiments, to further reduce the workload of developing the view layer and to facilitate programmers focusing on business logic, configuration files of the view layer may be automatically formed using the newly introduced model-view layer. In some embodiments, a natural expression language (NEL) may be used to express contents in the model layer, the model-view layer and the view layer, which complies better with natural expression habits. Besides, NEL syntax rules may be defined so that the view layer configuration files can be formed automatically.
A method for automatically forming a human-machine interface is provided in the present disclosure.
In S101: define a model object based on contents to be illustrated in a human-machine interface.
Specifically, in some embodiments, defining the model object may include defining model components of the model object. The model components may be mapped to metadata in a database, respectively. Those skilled in the art could understand that, based on the mappings, the model layer may perform certain processing on the corresponding metadata in the database and then present the metadata in a view of the view layer.
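By way of a non-limiting illustration only, a model object may be sketched as a plain Java class in which each field is a model component assumed to map one-to-one to a column (metadata) of a database table; the class name, field names and column names below are hypothetical.

    // Hypothetical model object: each field is a model component, assumed to map
    // one-to-one to a column (metadata) of a database table, e.g. a CLIENT table.
    public class Client {
        private Long id;       // assumed to map to column CLIENT.ID
        private String name;   // assumed to map to column CLIENT.NAME
        private String phone;  // assumed to map to column CLIENT.PHONE

        public Long getId() { return id; }
        public void setId(Long id) { this.id = id; }
        public String getName() { return name; }
        public void setName(String name) { this.name = name; }
        public String getPhone() { return phone; }
        public void setPhone(String phone) { this.phone = phone; }
    }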
Further, in some embodiments, defining the model object may further include defining an association relationship between the model components. Different illustration patterns may be used in the human-machine interface corresponding to different association relationships. It should be noted that the association relationship between the model components may include an association relationship between model components in a same model object and an association relationship between model components in different model objects.
Specifically, in some embodiments, the association relationship between the model components may include at least one selected from aggregation relationship, composition relationship and multiplicity relationship. Illustration patterns corresponding to the association relationship may include at least one selected from link, pull-down list, data table, embedded form and form.
Among the association relationships, an aggregation relationship is a relatively weak association relationship. Model components having an aggregation relationship with each other have independent life cycles. Therefore, when a user defines an aggregation relationship between a model component A and a model component B, normally the particular contents of the model component B may be obtained from a link or a pull-down list of the model component A in a human-machine interface.
Some examples of defining association relationships are given hereunder. It should be noted that, in the present disclosure, names of model objects are not necessarily the same as the characters illustrated in the corresponding interfaces for presenting the model objects.
Among the association relationships listed above, a composition relationship is a relatively strong association relationship. The life cycle of a composed model component may depend on the life cycle of its parent model component. When a parent model component A is deleted, a model component B having a composition relationship with it may also be deleted. Therefore, if the user defines a composition relationship between the model component A and the model component B, normally the particular contents of the composition relationship may be represented using an embedded form in a human-machine interface.
Among the association relationships listed above, a multiplicity relationship represents a one-to-many association relationship between model components, which may be a one-to-many aggregation relationship or a one-to-many composition relationship. If the user defines a one-to-many aggregation relationship between the model components, normally the one-to-many aggregation relationship may be represented using a data table in a human-machine interface. If the user defines a one-to-many composition relationship between the model components, normally the one-to-many composition relationship may be represented using a form in a human-machine interface.
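By way of a non-limiting illustration only, the following sketch (with hypothetical class and field names loosely based on the meal-cost example described later) shows how the three kinds of association relationships might appear in a model object, together with the illustration pattern normally corresponding to each of them.

    // Hypothetical sketch of association relationships between model components,
    // with the illustration pattern that normally corresponds to each relationship.
    import java.util.ArrayList;
    import java.util.List;

    class Restaurant { }            // model component with an independent life cycle
    class SettlementDetail { }      // model component composed into its parent
    class Participant { }           // model component aggregated in a one-to-many manner

    class MealRecord {
        // Aggregation (weak association, independent life cycles):
        // normally obtained from a link or a pull-down list in the interface.
        Restaurant restaurant;

        // Composition (the detail is deleted together with its parent record):
        // normally represented using an embedded form.
        SettlementDetail detail = new SettlementDetail();

        // Multiplicity (one-to-many aggregation):
        // normally represented using a data table.
        List<Participant> participants = new ArrayList<>();
    }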
In S103: establish a model-view corresponding to the defined model object.
In some embodiments, establishing the model-view corresponding to the defined model object may include defining at least one model-view element of the model-view and establishing mapping relationships between the model-view elements and the model components of the defined model object. The model-view may have at least one model-view element, each of which corresponds to one of the at least one model component. In this way, mappings may be established between the model-view(s) and the model component(s).
In some embodiments, the model-view elements may be used for generating at least one kind of view selected from the DisplayView, the CreateView, the UpdateView, the SearchView, the ListView and the ItemView.
Specifically, the DisplayView may be adapted to express how to provide, for the user, interface information for displaying the model object.
The CreateView may be adapted to express how to provide, for the user, interface information for creating a new model object.
The UpdateView may be adapted to express how to provide, for the user, interface information for updating an existing model object.
The SearchView may be adapted to express how to provide, for the user, interface information for searching a model object.
The ListView may be adapted to express how to provide, for the user, interface information for listing a collection of model objects.
The ItemView may be adapted to express how to provide, for the user, most characteristic item information of a model object.
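By way of a non-limiting illustration only, the following sketch shows one possible way to represent the six kinds of model-views and the mapping between model-view elements and model components; the class names and the invoked component names are hypothetical.

    // Hypothetical sketch of the six kinds of model-views and of model-view
    // elements, each of which maps to one model component by name.
    import java.util.ArrayList;
    import java.util.List;

    enum ViewKind { DISPLAY, CREATE, UPDATE, SEARCH, LIST, ITEM }

    class ModelViewElement {
        String componentName;   // name of the mapped model component
        ModelViewElement(String componentName) { this.componentName = componentName; }
    }

    class ModelView {
        ViewKind kind;
        List<ModelViewElement> elements = new ArrayList<>();
        ModelView(ViewKind kind) { this.kind = kind; }
    }

    class ModelViewDemo {
        public static void main(String[] args) {
            // A CreateView invoking the model components "mealDate" and "payer".
            ModelView createView = new ModelView(ViewKind.CREATE);
            createView.elements.add(new ModelViewElement("mealDate"));
            createView.elements.add(new ModelViewElement("payer"));
        }
    }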
In some embodiments, establishing a model-view corresponding to the defined model object may further include: expressing at least one property of the model-view element. The at least one property may determine visual effects of the model-view element presented in a human-machine interface. In some embodiments, the property may include one or more of button layout, button grouping, group embedding, button pattern, read-only, data format and event expression.
Specifically, in some embodiments, the button layout may include elements for organizing and arranging various model-views in a human-machine interface, such as column number, width, line feed, etc.
In some embodiments, the button grouping may provide container control (such as defining a frame) in a human-machine interface. The button layout may be arranged within the container.
In some embodiments, the group embedding may define that not only can buttons be arranged in a container, but another container can also be arranged in the former container.
In some embodiments, the button pattern may include illustration patterns for illustrating various model-view elements, such as pull-down list, cascade list, etc.
In some embodiments, the data format may define that various model-view elements are illustrated in a human-machine interface with a particular data format. For example, if a mapping exists between a model-view element and a model component regarding a date, then the data format for illustrating the model component in a human-machine interface may be defined as “mm-dd-yy”, or the like.
In some embodiments, the event expression may provide interactions at the module level, such as a modifying event, a loading event, etc.
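By way of a non-limiting illustration only, the following sketch shows one possible way to attach such properties to a model-view element; the property keys and values are assumptions and do not define an actual NEL syntax.

    // Hypothetical sketch: properties attached to a model-view element; the
    // property keys and values are assumptions and not a defined NEL syntax.
    import java.util.LinkedHashMap;
    import java.util.Map;

    class ElementProperties {
        Map<String, String> values = new LinkedHashMap<>();

        ElementProperties set(String key, String value) {
            values.put(key, value);
            return this;
        }
    }

    class PropertiesDemo {
        public static void main(String[] args) {
            ElementProperties mealDateProperties = new ElementProperties()
                .set("columns", "2")           // button layout: two-column arrangement
                .set("rowbreak", "true")       // line feed after this element
                .set("group", "basicInfo")     // button grouping: container (frame) name
                .set("readonly", "false")      // read-only flag
                .set("format", "MM-dd-yy")     // data format for a date component
                .set("onchange", "recalc");    // event expression: module-level interaction
            System.out.println(mealDateProperties.values);
        }
    }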
In S105: analyze the defined model object and the established model-view based on a predetermined syntax rule, to form a model object configuration file and a human-machine interface configuration file.
Those skilled in the art could understand that both the model object configuration file and the human-machine interface configuration file may be static files. Since mappings are already established between the model-view elements and the model components, the above-described static files can be generated automatically based on syntax analysis.
It should be noted that the configuration files are automatically formed so as to reduce the workload of developing the view layer. However, in order to support specific illustration requirements, manually writing some parts of the code may be supported in some embodiments. Therefore, the user can have more options and a better experience.
It should be noted that, if the user defines the association relationship between the model components and/or expresses properties of the model-view element, the association relationship and/or the properties will be represented in the human-machine interface configuration file formed after the analysis.
Those skilled in the art could understand that, in some embodiments, defining the model object and establishing the model-view may be implemented through script editing. Therefore, in some embodiments, syntax rules and their meanings may be determined before the above-described processing is implemented. It should be noted that determining the syntax rules and meanings is not illustrated in the accompanying drawings.
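By way of a non-limiting illustration only, the following sketch outlines the analysis step as a simple generator that reads the edited script and writes the two static configuration files; the actual syntax analysis is reduced to a placeholder, and the file contents shown are hypothetical.

    // Hypothetical sketch of the analysis step: the edited script is read,
    // analyzed according to predetermined syntax rules (omitted here), and two
    // static files are written: a model object configuration file and a
    // human-machine interface configuration file.
    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;

    class ScriptAnalyzer {
        void analyze(Path script, Path modelConfig, Path interfaceConfig) throws IOException {
            String text = Files.readString(script);

            // Placeholder: a real implementation would parse "text" against the
            // predetermined syntax rules and emit the corresponding contents.
            String modelObjectConfiguration = "<!-- mapping between the model object and the database -->";
            String interfaceConfiguration = "<!-- illustration of the model object using its model-views -->";

            Files.writeString(modelConfig, modelObjectConfiguration);
            Files.writeString(interfaceConfig, interfaceConfiguration);
        }
    }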
In some embodiments, controller part development may be implemented to provide operation buttons in the human-machine interface to be formed, such that the user can implement particular operations on particular model components by triggering the operation buttons. In some embodiments, the process 100 may further include S107: define an operation button corresponding to a particular operation on a model component, where the particular operation corresponds to the model-view element.
In some embodiments, after S107, the process 100 may include S109: analyze the operation button, and represent, in the human-machine interface configuration file, relationship between the operation button and the model components and the model-view element.
Thereafter, the human-machine interface configuration file may be loaded in a user terminal using a rendering engine, such that a human-machine interface can be illustrated at the user terminal. By clicking the operation buttons provided in the human-machine interface, various operations may be performed on the data in the database corresponding to the model object. Operation results may be illustrated at the user terminal in a new human-machine interface or as an update of the formed human-machine interface.
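By way of a non-limiting illustration only, the following sketch shows one possible data structure for an operation button that ties a particular operation on a model component to the model-view element expressing the operation; the names used are hypothetical.

    // Hypothetical sketch: an operation button links a particular operation
    // (display, create, update, delete or search) on a model component with the
    // model-view element used to express the result of the operation.
    enum Operation { DISPLAY, CREATE, UPDATE, DELETE, SEARCH }

    class OperationButton {
        String label;           // text shown on the button, e.g. "Add" or "Search"
        Operation operation;    // particular operation triggered by the button
        String componentName;   // model component the operation applies to
        String viewKind;        // model-view element expressing the operation

        OperationButton(String label, Operation operation, String componentName, String viewKind) {
            this.label = label;
            this.operation = operation;
            this.componentName = componentName;
            this.viewKind = viewKind;
        }
    }

    class OperationButtonDemo {
        public static void main(String[] args) {
            // A Search button searching by mealDate, with the result expressed by a SearchView.
            OperationButton search =
                new OperationButton("Search", Operation.SEARCH, "mealDate", "SearchView");
            System.out.println(search.label);
        }
    }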
To clarify the present disclosure, an exemplary embodiment will be illustrated hereinafter, in which a particular human-machine interface is provided to illustrate an automatic settlement result for an equally shared meal cost.
Referring to the exemplary process 200, in S201: determine syntax rules and meanings of the natural expression language.
Thereafter, in S203: NEL script edit. A natural expression language (NEL) script is edited to define a model object and establish a model-view.
In the NEL script of this example, a model object is defined, which includes model components such as mealDate, recordrestaurent, payer, participants, cost and averageAmount, where the model component participants has a one-to-many aggregation relationship with the other model components.
Model-views corresponding to the defined model object are also established in the NEL script, including a CreateView and a SearchView, each of which invokes one or more of the model components.
Further, properties of the model-view elements are described. For example, in the CreateView invoking the model component mealDate, a two-column illustration pattern is defined, i.e., “columns=2”, and “rowbreak” is set for the second column. Besides, data format properties are expressed using “properties” in the NEL script. For example, all the model-view elements invoking the model component mealDate illustrate the model component mealDate in a format of “MM-dd-yy”.
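By way of a non-limiting illustration only, the model object edited in the NEL script of this example might correspond to a Java class such as the following; the field names follow the model components mentioned above, and the equal-share computation is an assumption inferred from the stated purpose of the interface.

    // Hypothetical Java counterpart of the model object edited in the NEL script.
    import java.util.ArrayList;
    import java.util.Date;
    import java.util.List;

    class MealSettlement {
        Date mealDate;                                  // illustrated in an "MM-dd-yy" format
        String recordrestaurent;                        // restaurant of the shared meal
        String payer;                                   // person who paid the bill
        List<String> participants = new ArrayList<>();  // one-to-many aggregation, shown as a data table
        double cost;                                    // total cost of the meal
        double averageAmount;                           // equally shared amount per participant

        // Assumed settlement rule: the cost is shared equally among the participants.
        void settle() {
            averageAmount = participants.isEmpty() ? 0 : cost / participants.size();
        }
    }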
The edited NEL script is then analyzed in a process 300 based on the determined syntax rules, to form a model object configuration file and a human-machine interface configuration file.
If an error occurs in the process 300, an error reminder may be generated to remind the user, such that the user can adjust the NEL script.
FIGS. 14a and 14b illustrate pages obtained by rendering the human-machine interface configuration file formed in the process 200.
FIG. 14a illustrates a visual effect of illustrating the CreateView in the human-machine interface, in which the model components mealDate, recordrestaurent, payer, participants, cost and averageAmount invoked by the CreateView are illustrated. The one-to-many aggregation relationship between the model component participants and the other model components is illustrated using a data table.
FIG. 14b illustrates a visual effect of illustrating the SearchView in the human-machine interface, in which an operation button Search is provided to implement a search function using the model component mealDate as a key word. The model component mealDate is illustrated in the human-machine interface in an “MM-dd-yy” format.
Those skilled in the art may appreciate that, based on the above description of the embodiments, part or all of the present disclosure may be implemented by software in combination with a general hardware platform. Therefore, embodiments of the present disclosure may be substantially embodied in a computer software product. The computer software product may include at least one computer readable medium having computer executable instructions stored therein. When the executable instructions are executed by at least one device selected from a computer, a computer network and other electronic devices, the at least one device can implement operations according to embodiments of the present disclosure. The computer readable medium may include, but is not limited to, a floppy disk, an optical disk, a Compact Disc-Read Only Memory (CD-ROM), a magneto-optical disc, a Read-Only Memory (ROM), a Random-Access Memory (RAM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a magnetic card, an optical card, a flash memory or any other medium/computer readable medium which is adapted to store computer executable instructions.
Embodiments of the present disclosure may be implemented in a plurality of general or special computer system environments or devices, such as a personal computer (PC), a server computer, a handheld device, a portable device, a pad, a multi-processor system, a microprocessor-based system, a set-top box, a programmable consumer electronic device, a network PC, a mini computer, a mainframe computer and a distributed computing environment containing any of the above systems or devices.
Embodiments of the present disclosure may be realized in a context including executable instructions which can be executed by a computer, such as a program module. Generally, a program module includes routines, programs, objects, components and data structures which can implement particular tasks or realize particular abstract data types. Embodiments of the present disclosure may also be implemented in a distributed computing environment, where tasks are implemented by remote processing devices coupled through a communication network. In a distributed computing environment, program modules may be located in local and remote computer storage mediums, such as a memory.
Those skilled in the art could understand that the above-described components may be programmable logic devices, such as one or more selected from a programmable array logic (PAL), a generic array logic (GAL), a field programmable gate array (FPGA) and a complex programmable logic device (CPLD), which is not limited here.
Embodiments of the present disclosure further provide a system for automatically forming a human-machine interface.
The model object unit U10 may be adapted to define a model object based on contents to be illustrated in the human-machine interface. The model object may include at least one model component having a one-to-one mapping relationship with metadata in a database. In some embodiments, the model object unit U10 may further include: an association unit U11, adapted to define an association relationship between the model components, where a pattern for illustrating the model components in the human-machine interface may correspond to the association relationship.
The model-view unit U20, connected to the model object unit U10, may be adapted to establish a model-view corresponding to the defined model object. In some embodiments, the model-view may include at least one model-view element which has a mapping relationship with at least one model component. The model-view element may be used for generating at least one of a DisplayView, a CreateView, an UpdateView, a SearchView, a ListView and an ItemView. In some embodiments, the model-view unit U20 may further include: a property describing unit U21, adapted to describe a property of the model-view element, where a pattern for illustrating the model-view element in the human-machine interface corresponds to the property.
The operation button unit U30, connected with the model-view unit U20, may be adapted to define an operation button. The operation button may correspond to a particular operation on the model component, and the particular operation may correspond to the model-view element.
The analysis configuration unit U40, connected with the operation button unit U30, may be adapted to analyze the defined model object and the established model-view based on a predetermined syntax rule to form a model object configuration file and a human-machine interface configuration file. The model object configuration file may be adapted to provide mapping between the model object and the database. The human-machine interface configuration file may be adapted to illustrate the model object using a corresponding model-view. In some embodiments, the analysis configuration unit U40 may be further adapted to analyze the association relationship defined by the association unit U11, where the analyzed association may be represented in the human-machine interface configuration file. In some embodiments, the analysis configuration unit U40 may be further adapted to analyze the property expressed by the property describing unit U21, where the analyzed property may be represented in the human-machine interface configuration file. In some embodiments, the analysis configuration unit U40 may be further adapted to analyze the operation button defined by the operation button unit U30, where the relationship between the operation button and the model component and the model-view element may be represented in the human-machine interface configuration file.
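By way of a non-limiting illustration only, the following sketch shows the four units wired in the described order; the class and method names are hypothetical and the method bodies are placeholders.

    // Hypothetical sketch of the system: the four units connected in the
    // described order (U10 -> U20 -> U30 -> U40); method bodies are placeholders.
    class ModelObjectUnit {            // U10, may contain the association unit U11
        void defineModelObject() { /* define model components and association relationships */ }
    }

    class ModelViewUnit {              // U20, may contain the property describing unit U21
        void establishModelView() { /* define model-view elements and their properties */ }
    }

    class OperationButtonUnit {        // U30
        void defineOperationButton() { /* bind a particular operation to a model component */ }
    }

    class AnalysisConfigurationUnit {  // U40
        void analyze() { /* form the model object and human-machine interface configuration files */ }
    }

    class HumanMachineInterfaceSystem {
        ModelObjectUnit u10 = new ModelObjectUnit();
        ModelViewUnit u20 = new ModelViewUnit();
        OperationButtonUnit u30 = new OperationButtonUnit();
        AnalysisConfigurationUnit u40 = new AnalysisConfigurationUnit();

        void formHumanMachineInterface() {
            u10.defineModelObject();
            u20.establishModelView();
            u30.defineOperationButton();
            u40.analyze();
        }
    }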
The present disclosure is described above through preferred embodiments, but is not limited thereto. Based on the present disclosure, those skilled in the art can make variations and modifications without departing from the scope of the disclosure. Therefore, any simple modification, variation and polishing based on the embodiments described herein falls within the scope of the present disclosure.