The present invention relates to a unified system and method combining a virtual reality technology that integrates various digital data created in planning and designing a product to operate a virtual product and visualize it at a photo-realistic level, an affective technology that organizes a customer's emotional evaluation of a product's design from an engineering viewpoint, an ergonomic technology that quantitatively measures and analyzes the bodily force activity involved in operating a product in a biomechanical manner, and a mixed reality technology that supports both a tangible interface capable of directly touching the digital data and photo-realistic visualization.
This work was supported by the IT R&D program of MIC/IITA [2005-S-604-03, Realistic Virtual Engineering Technology Development].
As well known in the art, a virtual product operation technology is widely used in CAD and applied computer graphics as a technology displaying an operation process of a virtual product in accordance with a change in time by adding kinematics information and animation information to product data (e.g., CAD data) using a 3D modeling tool on the product's appearance.
Further, technologies that render 3D model data into a photo-realistic image, such as HLSL (High Level Shader Language) based rendering, have been widely used in computer graphics since the 1990s and, in the 2000s, in Nvidia's Cg technology.
However, algorithms handling complicated light-path tracking, reflection depending on an object's properties and the like require high-performance computing power and resources in order to obtain a photo-realistic result of high quality. Therefore, in virtual reality research, where real-time processing is very important, the abovementioned technologies are being integrated, and research that enables demonstration of a realistic virtual product is in progress.
Meanwhile, in affective technology, a technology is being developed that derives the emotional responses (e.g., linguistic representations) a customer feels from a product into a relationship between input values (e.g., physical data and personal impressions of a product's design elements) and output values (e.g., an emotional satisfaction score of a product's design) using an organized method.
That is to say, in HCI-based affective technology, the design of a specific product is decomposed into detailed elements, and an emotional evaluation data collection test is performed on a product group combining such elements with respect to an experimentee group. Further, an algorithm (an estimation formula yielding an estimation score) capable of estimating an emotional evaluation index of a specific customer group for an arbitrary product in accordance with a constant rule (e.g., an evaluation criterion) is developed by establishing an equation that determines the weight of each input parameter through statistical analysis of the interrelationship between a user's emotional satisfaction and the design elements.
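As a non-limiting illustration, the weighted estimation formula described above can be sketched as a simple linear model mapping design-element parameters to an estimated satisfaction score. The element names, weights and bias below are hypothetical stand-ins, not values from the invention:

```python
# Sketch of an emotional-evaluation estimation formula: a weighted
# linear model over design-element parameters. Weights would come from
# statistical analysis of experimentee test data; these are illustrative.

def estimate_satisfaction(design, weights, bias=0.0):
    """Return an estimated emotional-satisfaction score for a design,
    given per-element weights determined by the modeling procedure."""
    return bias + sum(weights[k] * v for k, v in design.items() if k in weights)

# Hypothetical weights and design parameters.
weights = {"screen_ratio": 1.8, "button_size_mm": -0.1, "gloss_level": 0.6}
design = {"screen_ratio": 0.7, "button_size_mm": 8.0, "gloss_level": 0.5}

score = estimate_satisfaction(design, weights, bias=2.5)
```

A new or modified design can thus be scored immediately, without re-running the experimentee test, as long as the weights remain valid for the target customer group.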
However, the conventional art described above requires a test procedure and an analysis and estimation-formula modeling procedure by an experimentee group for a subject product already launched, which takes a long time. Since customers' design preference trends change very often, it is difficult to analyze in a short time the current market status against the products of various designs being launched.
In ergonomics, research has been conducted to resolve, with priority given to human convenience, usability problems induced when designing a feature-focused product appearance and user interface. Further, even for information appliances, user interfaces and product appearances are being improved by applying ergonomic analysis technologies. In particular, in the 21st century, hand (or finger) operations have increased due to the popularization of miniaturized personal information appliances, and research for analyzing the resulting side effects (e.g., VDT syndrome) and solving such problems is in progress.
However, ergonomic analysis technology developed to date (e.g., the JACK system developed at the University of Pennsylvania, U.S.A.) targets operations (e.g., factory production-line operations) focusing on whole-body actions. Accordingly, such technology is limited in simulating and analyzing situations induced by minute hand and finger actions. Moreover, an exclusive experiment device for a hand interface, designed with the convenience of the experimentee group in mind, needs to be developed for organized ergonomic experiments.
In addition, mixed reality technology is an improved technology in which a virtual reality technology having superior interaction features is supplemented with an augmented reality technology developed with a focus on image registration and synthesis. Most existing applications of mixed reality technology support interaction with a user by mixing a virtual entity into a low-resolution video image.
For example, as in research from HITLab in New Zealand, haptic feedback (a virtual reality technology simulating physical impact and contact phenomena) is considered important in interaction with a virtual entity, and various haptic interface technologies have been developed in the art. However, since these do not yet satisfy the requirement of completely simulating the feeling of a hand handling a real object (especially for such a small product as a cellular phone, which is the subject of the present invention), research is in progress to overcome these defects by applying a multimodal interaction technique using other stimuli (e.g., sound effects).
Therefore, current mixed reality technology remains insufficient both for the photo-realistic image expression and operation of a virtual product required in a virtual usability evaluation scenario, and for supporting interaction features identical to the experience of an actual product in a mixed reality environment.
Therefore, the following technical problems arise from the technical limits and development needs described above, and solutions to the four major technical problems are presented as follows:
The first technical problem is to easily modify a design in a software tool environment with photo-realistic visualization, and to simulate in real-time both the physical action of a product and the operation of the software embedded in the product, by integrating the various digital data produced in the individual operation processes of product visualization and action simulation.
The second technical problem is to present an embodied technology supporting the estimation of customer preference in accordance with a modification of a product's design, thereby solving the problem of fast updating of an emotional user-evaluation estimation model in affective evaluation support technology.
The third technical problem is to present a technique of designing and employing a glove-type interface device supporting quantitative analysis in a usability evaluation scenario of a hand-operated product, and to analyze user-interface evaluations and improvements of a product by utilizing such technology.
The fourth technical problem is to provide a technique of designing and operating a mixed reality based usability evaluation platform supporting the various aspects (e.g., vision, audition, tactility, cognitive results in accordance with product operation, etc.) sensible by a user of an end-product launched in the market, since a technology that enables an ideal virtual usability evaluation test is one embodying mixed reality technology.
In order to solve the technical problems above, it is an object of the present invention to provide a usability evaluation system and method for a virtual mobile information appliance which unites a virtual reality technology integrating various digital data created in planning and designing a product to operate a virtual product and visualize it at a photo-realistic level, an affective technology organizing a customer's emotional evaluation of a product's design from an engineering viewpoint, an ergonomic technology quantitatively measuring and analyzing the bodily force activity involved in operating a product in a biomechanical manner, and a mixed reality technology supporting both a tangible interface capable of directly touching the digital data and photo-realistic visualization.
In accordance with an aspect of the present invention, there is provided a system for evaluating usability of a virtual mobile information appliance, including:
a design evaluation unit for supporting emotional evaluation of a designed product in accordance with a component DB and a partially standardized guide in view of a customer and for real-time collecting design preference data based on a network online system;
a virtual product design modification and action simulation unit for uniting digital data related with the designed product to realize a photo-realistic visualization and a virtual operation;
an ergonomic based hand load evaluation unit for measuring an ergonomic based hand load and a fatigue using a hand interface based usability evaluation tool; and
a mixed reality usability evaluation unit for applying an augmented reality technology and a printing technology to the realized photo-realistic visualization and the virtual operation and for creating a usability evaluation situation based on the measured ergonomic based hand load and fatigue to provide the created usability evaluation situation to a user.
In accordance with another aspect of the present invention, there is provided a method for evaluating usability of a virtual mobile information appliance, including:
supporting an emotional evaluation in view of a customer on a product designed in accordance with a component DB and a partially standardized guide and real-time collecting design preference data based on a network online system;
realizing a photo-realistic visualization and a virtual operation by uniting digital data related with the designed product;
measuring an ergonomic based hand load and fatigue using a hand interface based usability evaluation tool; and
applying an augmented reality technology and a printing technology to the realized photo-realistic visualization and virtual operation, and creating a usability evaluation situation based on the measured ergonomic based hand load and fatigue to provide the created usability evaluation situation to a user.
The above and other objects and features of the present invention will become apparent from the following description of embodiments, given in conjunction with the accompanying drawings, in which:
Hereinafter, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The present system includes a design evaluation group 1000, a virtual product design modification and action simulation group 2000, an ergonomic based hand load evaluation group 3000 and a mixed reality usability evaluation platform group 4000. In this embodiment, the present invention further includes an online product design evaluation tool T1, an online product design creation/modification tool T2, a hand interface based usability evaluation tool T3 and a virtual prototyping based usability evaluation tool T4, each of which is an independently executable embodiment integrating some or all of the major features.
The design evaluation group 1000 provides an evaluation service on the emotional aspect of a product's design for a user and feeds the results back into the system. As depicted in
In addition, the design evaluation group 1000 applies a design evaluation methodology based on affective technology as used in industrial engineering. That is to say, in the design evaluation group 1000, the emotional responses a customer feels about a product's design are classified in the form of linguistic expressions (e.g., high-class, satisfying, etc.) and the interrelationship between those expressions and the product's design parameters is derived. Thus, the influence that changes in detailed design parameters have on the degree of emotional design satisfaction with the whole product is estimated, and the estimation is provided to the virtual product design modification and action simulation group 2000.
Further, the design evaluation group 1000 supports emotional evaluation, from a customer's viewpoint, of a product designed in accordance with a component DB and a partially standardized guide, through the processes performed from a product design evaluation element selection unit 1200 to a product design evaluation analysis result DB 1700, and collects design preference data in real-time based on a network online system. The collected design preference data are then stored in the product design evaluation analysis result DB 1700. The processes performed from the product design evaluation element selection unit 1200 to the product design evaluation analysis result DB 1700 are those generally used in design emotional evaluation testing and analysis; therefore, a detailed description thereof will be omitted.
After going through the above processes, a regression equation capable of estimating a customer's degree of emotional satisfaction with a kind of product design defined by constant design parameters is obtained. The regression equation is used as a model equation for estimating a design emotional score. It may be stored in the product design (component) united DB 2110, or may yield an immediate emotional satisfaction score for a product design newly added or modified by the online product design creation/modification tool T2.
The emotional satisfaction score may control the exposure priority of the various components stored in the product design (component) united DB 2110 and may immediately provide an evaluation feedback score to a user on the basis of the currently designed product's parameters. Therefore, the emotional satisfaction score can be used to support a user in achieving a design result that is satisfactory in the customer market.
The online product design evaluation tool T1 enables an experimental process that has conventionally been performed offline on restricted experimentees to be carried out by multiple experimentees who access an online system such as a web service, evaluate the product design, and then enter their evaluation scores. That is to say, the process of the design emotional evaluation estimation modeling unit 1400 is performed offline on a group of restricted experimentees and is based on experiment results for product designs currently launched in the market; therefore, the online product design evaluation tool T1 supplements the restrictions on estimating future designs inherent in the conventional offline experiment based model creation methodology.
The virtual product design modification and action simulation group 2000 serves to present an expanded platform. The virtual product design modification and action simulation group 2000, as depicted in
The digital data of the virtual product used in the virtual product design modification and action simulation group 2000 are stored in the product design (component) united DB 2110. Digital data produced through general CAD programs (e.g., CATIA, AutoCAD, 3DS Max, etc.), 2D design programs (e.g., Adobe Flash, etc.) and photo-realistic rendering programs (e.g., Nvidia's Cg code) are stored in the product design contents related DB 2100. The digital data produced by these various programs are converted and integrated into a unified data format (e.g., the COLLADA format) by a product design data format unification unit 2200 and are stored in the product design (component) united DB 2110.
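The format-unification step above can be sketched as a normalization of heterogeneous, tool-specific asset descriptions into one common record schema (standing in for a COLLADA-like interchange format). The field names, asset kinds and tool names below are illustrative assumptions, not the actual schema:

```python
# Minimal sketch of format unification: each tool-specific asset record
# is mapped to a unified schema before storage in the united DB.
# Asset kinds, field names and tools here are illustrative.

def unify(asset):
    """Map a tool-specific asset description to the unified schema."""
    converters = {
        "cad":    lambda a: {"geometry": a["solids"], "source": a["tool"]},
        "2d":     lambda a: {"geometry": a["shapes"], "source": a["tool"]},
        "shader": lambda a: {"geometry": [], "source": a["tool"],
                             "material": a["code"]},
    }
    record = converters[asset["kind"]](asset)
    record.setdefault("material", None)  # every record carries the same keys
    return record

unified = [unify(a) for a in (
    {"kind": "cad", "tool": "CATIA", "solids": ["body"]},
    {"kind": "2d", "tool": "Flash", "shapes": ["keypad_art"]},
)]
```

Once all assets share one schema, downstream units (structure organizing, action editing, visualization) can operate without per-tool special cases.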
A virtual product structure organizing unit 2300 constructs the design data of the virtual product in a format having a pre-defined detailed (subordinate) structure. The virtual product structure organizing unit 2300 utilizes techniques used in 3D computer graphics and virtual reality simulation to express the component structure of the virtual product hierarchically, e.g., in tree or graph form, and defines the structural movement information of each component and its movement forms, e.g., animation, in response to an external input event. Such structural information of the virtual product is stored in a product assembling information DB 2400. When a design parameter is modified by the online product design creation/modification tool T2, an automatic assembling support processing unit 2410 continuously sustains the interrelationship between components, e.g., by modifying 3D geometry information using position-movement restrictions, automatic size adjustment and CAD operations, pursuant to the predefined spatial interrelationships of the components (e.g., constraint information including parent-child subordinate relationships, group relationships, etc.).
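The hierarchical (parent-child) component structure described above can be sketched as a small tree of components in which a transformation applied to a parent propagates to its subordinate components. The class and 2D positions below are illustrative simplifications of the 3D case:

```python
# Sketch of the hierarchical component structure: a tree of components
# where translating a parent propagates the offset to all children,
# approximating a parent-child subordinate constraint. 2D for brevity.

class Component:
    def __init__(self, name, position=(0.0, 0.0)):
        self.name = name
        self.position = position
        self.children = []

    def add(self, child):
        """Attach a subordinate component and return it."""
        self.children.append(child)
        return child

    def translate(self, dx, dy):
        """Move this component; the constraint carries children along."""
        x, y = self.position
        self.position = (x + dx, y + dy)
        for c in self.children:
            c.translate(dx, dy)

body = Component("body")
button = body.add(Component("button", (1.0, 2.0)))
body.translate(0.5, 0.0)  # the button follows its parent
```

A graph form with richer constraints (group relationships, size coupling) would extend the same propagation idea.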
A virtual product (component) design adjustment unit 2420 adjusts physical design parameters between the automatically assembled components during design creation and modification, and modifies the design parameters automatically or manually to obtain a naturally assembled product. For example, when automatic modification is applied, the virtual product (component) design adjustment unit 2420 automatically converts the property information of a newly added component, e.g., a button, for unification with the surrounding tones and material information, or transforms the relationship between the geometry of the newly added component and the ambient geometry on the basis of a constant rule or constraint condition.
A virtual product action editing unit 2500 functions to insert mechanism information into each component stored in the product design (component) united DB 2110. A virtual product visualization property editing unit 2510 functions to correct the material and property information of the virtual product stored in the product design (component) united DB 2110. The functions performed in the virtual product action editing unit 2500 and the virtual product visualization property editing unit 2510 are those widely employed in general 2D or 3D design programs.
A user interface control unit 2700 enables a virtual product united model visualization unit 2530 to visualize, in a united form, the components into which the corrected material and property information and the mechanism information have been inserted, together with the corrected design parameters, as a virtual product action, by applying a real-time screen capturing method to a plurality of product design programs executed in parallel with the online product design creation/modification tool T2. That is to say, most product design programs only produce shape data of a physical user interface (PUI), i.e., the external appearance of the virtual product, and support only photo-realistic visualization. Conventionally, the embedded software executed in the virtual product is represented merely by attaching an image captured from a screen to the virtual product in the form of a texture map, or by simulating the movement of the product with a video file. However, the embedded software executed in the information appliance uses a GUI simulation program (e.g., interactive menu execution in Adobe Flash) to produce or test the virtual product.
Therefore, a visualization result output unit 2600 simulates the action of a virtual product in which the PUI and GUI portions are completely united, by capturing the 2D GUI information image and updating in real-time the texture map on the 3D virtual object expressing the PUI like a real product, under control of the user interface control unit 2700. Further, a user's input through a keyboard or a mouse is transmitted, by applying an interface hooking technology, to the PUI visualization program and the GUI visualization program executed in parallel.
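The PUI/GUI uniting loop just described can be sketched as follows: a hooked input event is forwarded to the GUI program, whose framebuffer is then "captured" and bound as the texture of the PUI model. The classes, method names and frame strings below are illustrative stand-ins for the actual programs:

```python
# Sketch of the united PUI/GUI simulation step: forward a hooked input
# event to the GUI program, capture its frame, and apply it as the
# texture map of the 3D PUI model. All names here are illustrative.

class GuiProgram:
    """Stand-in for a parallel GUI simulation (e.g., a menu program)."""
    def __init__(self):
        self.frame = "menu:home"

    def handle(self, event):
        if event == "down":
            self.frame = "menu:next"

    def capture(self):
        return self.frame  # stands in for a screen capture


class PuiModel:
    """Stand-in for the 3D model of the product's external appearance."""
    def __init__(self):
        self.texture = None


def step(gui, pui, event):
    gui.handle(event)            # input routed via interface hooking
    pui.texture = gui.capture()  # real-time texture-map update


gui, pui = GuiProgram(), PuiModel()
step(gui, pui, "down")
```

Each iteration of this loop keeps the screen area of the 3D product showing the live GUI state, so the virtual product behaves as one united object.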
Referring to
The ergonomic based hand load evaluation group 3000 measures an ergonomic based hand load and fatigue through a simulation tool and provides the same to the virtual product design modification and action simulation group 2000. The ergonomic based hand load evaluation group 3000, as depicted in
The real-time hand tracking interface 3100 includes sensors that track the shape of a hand in real-time and obtains in real-time the angles of all joints in the hand so that a virtual hand model is restored and visualized in real-time. For example, the angles between fingers and the posture information of the hand may be tracked using a CyberGlove having 22 sensors, available from Immersion Corporation. The attained joint angle information then goes through a series of data conversions and calibration filters by which a real-time virtual hand model control unit 3200 controls the virtual hand model conformed to the user.
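The per-user calibration filter mentioned above can be sketched as a linear mapping from each raw glove-sensor reading onto a joint-angle range recorded for a specific user. The raw ranges and the 0-90 degree span below are illustrative assumptions:

```python
# Sketch of a calibration filter for glove joint sensors: a raw reading
# is linearly mapped onto [0, angle_range] degrees using per-sensor
# (raw_min, raw_max) values recorded for the user. Ranges illustrative.

def calibrate(raw, raw_min, raw_max, angle_range=90.0):
    """Map a raw sensor value to a joint angle in degrees, clamping
    values outside the user's recorded range."""
    t = (raw - raw_min) / (raw_max - raw_min)
    return max(0.0, min(1.0, t)) * angle_range

# One sensor calibrated over a hypothetical raw range of 100..900.
angles = [calibrate(r, 100, 900) for r in (100, 500, 900, 1200)]
```

In practice one such mapping (with its own range) would be kept for each of the glove's sensors, applied before the virtual hand model is posed.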
The hand force measurement interface device 3500 includes various sensors for tracking the force load induced in the hand using the virtual product, e.g., pressure sensors and electromyogram (EMG) sensors. The sensor values obtained in real-time undergo adjustments such as a calibration filter, a sensibility adjustment and digital signal conversion by a hand force measurement interface control unit 3300.
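One common form of the signal conditioning applied to such digitized pressure/EMG channels is a small moving-average filter that smooths the stream before further adjustment; the window size below is an illustrative choice, not one specified by the invention:

```python
from collections import deque

# Sketch of signal conditioning for a digitized pressure/EMG channel:
# a moving-average filter smooths the sample stream. Window size is an
# illustrative assumption.

class MovingAverage:
    def __init__(self, window=4):
        self.buf = deque(maxlen=window)  # keeps only the last `window` samples

    def feed(self, sample):
        """Add one sample and return the smoothed value."""
        self.buf.append(sample)
        return sum(self.buf) / len(self.buf)

f = MovingAverage(window=2)
smoothed = [f.feed(s) for s in (0.0, 2.0, 4.0)]
```

A per-channel gain (the sensibility adjustment) could then be applied to the smoothed value before visualization.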
The virtual hand model control data obtained by the real-time hand tracking interface 3100 and the real-time virtual hand model control unit 3200, and the force load measurement on the hand obtained by the hand force measurement interface control unit 3300 and the hand force measurement interface device 3500, are then output in the form of the virtual hand model visualized together with the force load measurement by the force measurement result visualization unit 3400. That is to say, a user can objectively measure the feeling of the hand currently handling the virtual product by using the sensor values. Further, the virtual product model action visualization unit 3600 visualizes the current action status of the virtual product in the same manner as the visualization result output unit 2600 of
A force test result recording unit 3700 records a test currently in progress through video and audio recording devices and stores the control data for controlling the virtual hand model, the action status of the virtual hand model and the action situation of the virtual product, which will then be used in post-test analysis.
Moreover, the technology performed in the ergonomic based hand load evaluation group 3000 may support a quantitative usability evaluation test with respect to the interface operation of a current product or a product to be launched. For example, it is possible to compare a user's performance, or to measure hand fatigue, under different UMPC keyboard layouts. That is to say, the ergonomic based hand load evaluation group 3000 may be applied in a comparison between a UMPC model with a keyboard laid out on both sides and a UMPC model with a keyboard sliding out from the bottom of the screen.
Referring back to
The entity movement tracking unit 4400 tracks in real-time the movements of the virtual product and of the user's hand and head, and transmits the same to the mixed reality image control unit 4300.
The mixed reality image control unit 4300 controls parameters (e.g., rendering parameters of a virtual camera) in order to create a mixed reality image based on the mixed reality image display configuration and the tracked movement information. The mixed reality image creation unit 4200 creates the mixed reality image, and the mixed reality image is projected onto and overlapped with a real object using an overlay (image overlapping) method (e.g., an optical see-through method) through the mixed reality usability evaluation test device 4100.
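The overlay (image overlapping) step can be sketched as a per-pixel composition in which virtual-product pixels replace the real-scene pixels wherever the virtual image is opaque. The toy string-based frames below are an illustrative simplification of real framebuffers:

```python
# Sketch of the overlay method: composite a virtual-image frame over a
# real-scene frame, keeping the real pixel wherever the virtual pixel
# is transparent (None). Frames are toy 2x2 grids for illustration.

def overlay(real, virtual):
    """Return the mixed frame: virtual pixels win where opaque."""
    return [[v if v is not None else r
             for r, v in zip(real_row, virt_row)]
            for real_row, virt_row in zip(real, virtual)]

real = [["bg", "bg"],
        ["bg", "bg"]]
virtual = [[None, "phone"],
           [None, None]]

mixed = overlay(real, virtual)
```

In an optical see-through configuration the "real" layer is the user's direct view, so in effect only the opaque virtual pixels are emitted by the display.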
A major function performed in the mixed reality based usability evaluation platform group 4000, i.e., a linking function between the physical user interface (PUI) and the graphic user interface (GUI) is realized by the product design contents related DB 2100 and blocks 4500 to 4820 depicted in
As described above, the produced 3D printing data is employed to physically assemble a product in the 3D printing for the mixed reality 4700 together with an electronic component (e.g., a keypad), using a 3D printing device (e.g., the Z-Printer of Z Corp.).
The 3D printing control unit 4500 interfaces the electric/electronic signals of the actually operating PUI. An input value applied to the system goes through a mixed reality image/3D printing synchronization unit 4600 and is transmitted to the mixed reality image control unit 4300 for updating the action result of the virtual product. For example, in accordance with a direction button pressed by a user, the status of the GUI program executing in parallel in the user interface control unit 2700 of
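The synchronization path through blocks 4500 to 4600 can be sketched as a handler that applies a keypad signal from the printed mockup to the GUI state and then derives the updated mixed reality frame from that state. The signal name, state key and frame string below are illustrative stand-ins:

```python
# Sketch of the PUI-to-GUI synchronization: a keypad signal from the
# physical mockup updates the GUI state, and the mixed-reality frame is
# re-derived from that state. Names and formats are illustrative.

def on_keypad(signal, state):
    """Apply a PUI signal to the GUI state; return the new MR frame."""
    if signal == "direction_down":
        state["cursor"] += 1  # e.g., move the menu cursor
    return f"render(cursor={state['cursor']})"

state = {"cursor": 0}
frame = on_keypad("direction_down", state)
```

Because the same state drives both the parallel GUI program and the rendered image, a button press on the mockup and the overlaid image stay consistent.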
Referring to
In the embodiment of
In order to support the above functions, a software method (e.g., parameter adjustment of a graphics card driver) or a hardware method (e.g., mounting the display panel and its component panel turned over, to correspond to the reversal effect of the mirror) may be employed. Further, the display panel has a non-reflective coating and a polarized filter for preventing the multi-image focus phenomenon due to repetitive reflection (e.g., the phenomenon in which, when two mirrors face each other, a reflected image of the opposite mirror is reflected repetitively), which may be induced when the reflection unit faces the display panel. "An image reflection unit capable of adjusting light penetration" depicted in
On the other hand,
The exposure priority of the components is automatically adjusted by constant criterion information (e.g., an order based on the emotional satisfaction result of the currently selected component, an order based on the statistical frequency of use of the currently selected component, etc.). One criterion that can be presented is a record of using the online product design evaluation tool T1 of
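The exposure-priority adjustment just described can be sketched as an ordering of components by a weighted combination of their emotional-satisfaction score and usage frequency. The weights and component data below are illustrative assumptions:

```python
# Sketch of exposure-priority adjustment: order components by a weighted
# combination of emotional-satisfaction score and usage frequency.
# Weights and component records are illustrative.

def exposure_order(components, w_score=0.7, w_freq=0.3):
    """Return components sorted from highest to lowest priority."""
    return sorted(components,
                  key=lambda c: w_score * c["score"] + w_freq * c["freq"],
                  reverse=True)

components = [
    {"name": "keypad_a", "score": 3.0, "freq": 10.0},
    {"name": "keypad_b", "score": 4.5, "freq": 2.0},
]
ordered = exposure_order(components)
```

Changing the weights lets the tool emphasize either market satisfaction or actual usage statistics when deciding which components to expose first.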
For example, the embodiment of the hand interface based usability evaluation tool T3 shown in
To be specific, a sensor T3-1 measures the tension of the muscles in charge of finger movement and the force on the finger joints corresponding to the hand's movement, through sensors, e.g., EMG sensors, attached at specific positions in accordance with a criterion in an ergonomic test guide. An area T3-2 is implemented in the form of a pressure glove, as demonstrated in the present embodiment. Attached to the palm of the pressure glove are thin-film pressure sensors that minimally interfere with the interaction between the hand and the contact surface of the product.
Signals measured by both the EMG sensors and the pressure sensors are entered into the program through a sensibility adjustment area T3-3 having a calibration filter and an analogue-to-digital converter T3-4. Areas T3-1-1 and T3-2-1 are graphic user interfaces for monitoring in real-time the sensor values from the EMG sensors and the pressure sensors, and an area T3-6 is a graphic user interface allowing a user to intuitively observe the distribution of the pressure values by matching the numerical values to a tone spectrum. Further, an area T3-7 is a graphic user interface for displaying more intuitively the posture and pressure of the user's hand and the muscle fatigue information using a 3D hand model.
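The tone-spectrum matching in area T3-6 can be sketched as a bucketed mapping from each pressure value to a color, so the distribution is readable at a glance. The thresholds, maximum pressure and color names below are illustrative assumptions:

```python
# Sketch of the tone-spectrum mapping for pressure visualization: each
# pressure value is normalized and bucketed into a color. Thresholds,
# maximum pressure and color names are illustrative.

def pressure_to_tone(p, p_max=100.0):
    """Map a pressure value to a color bucket on a simple spectrum."""
    t = max(0.0, min(1.0, p / p_max))  # normalize and clamp to [0, 1]
    if t < 0.33:
        return "blue"    # low pressure
    if t < 0.66:
        return "green"   # medium pressure
    return "red"         # high pressure

tones = [pressure_to_tone(p) for p in (10, 50, 95)]
```

A continuous colormap interpolation would serve the same purpose; the bucketed form simply makes the mapping explicit.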
The embodiment of
The emulation program transmits the signal from the keypad to a program executed in parallel therewith, e.g., Adobe Flash simulating a GUI, e.g., a GUI for menu control on a screen, thereby simulating the changed state of the cellular phone on the screen when a user presses a button of the real product. This simulation result is completed as an image of the cellular phone via the mixed reality image creation unit 4200 having a photo-realistic rendering function, and the image is overlapped onto the 3D printing (a cellular phone mockup) held by the user by way of the optical image projection structure of the mobile mixed reality display device. When a user moves the cellular phone, its movement in 3D space is tracked by a six-degree-of-freedom tracking device and the mixed reality image is updated in real-time. When a user presses a button on the keypad, the corresponding portion of the GUI is updated accordingly.
On the other hand,
According to the present invention, design preference information of the current market (customers) and a prototype of a candidate product that assists in determining a new design may be visualized to a user during product planning and 2D styling design as shown in
Therefore, the present invention unites various digital data created during product planning and designing so as to operate the digital data as one virtual product, and unites a virtual reality technology which visualizes the virtual product at a photo-realistic level, an affective technology which organizes a customer's emotional evaluation of a product's design from an engineering viewpoint, an ergonomic technology which quantitatively measures and analyzes the dynamic bodily activity involved in the operation of the product from a biomechanical viewpoint, and a mixed reality technology which supports both a tangible interface capable of directly touching the digital data and photo-realistic visualization. Accordingly, a company manufacturing the product can find usability problems early, obtain improvements such as in the product's design, efficiently improve the overall quality of the product, and manage the product life cycle.
While the present invention has been described with respect to certain preferred embodiments, it will be apparent to those skilled in the art that various changes and modifications may be made without departing from the scope of the invention as defined in the following claims.
Number | Date | Country | Kind |
---|---|---|---
10-2007-0131904 | Dec 2007 | KR | national |