BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view of a conversational speech analysis system according to the invention;
FIG. 2 is an image of the conversational speech analysis system according to the invention;
FIG. 3 is a flow chart of the conversational speech analysis used in the invention;
FIG. 4 illustrates a speech/non-speech activity detection process and a corresponding flow chart;
FIG. 5 illustrates frame-based sound processing and a corresponding flow chart;
FIG. 6 illustrates a sensor activity detection process and a corresponding flow chart;
FIG. 7 illustrates a frame-based sensor analysis process and a corresponding flow chart;
FIG. 8 illustrates an interest level judgment process and a corresponding flow chart;
FIG. 9 illustrates a display process and a corresponding flow chart;
FIG. 10 illustrates a speech database for storing frame-based sound information;
FIG. 11 illustrates a database for storing frame-based sensor information;
FIG. 12 illustrates an interest level database (sensor) for storing sensor-based interest levels;
FIG. 13 illustrates an interest level database (microphone) for storing microphone-based interest levels;
FIG. 14 illustrates a customized value database for storing personal characteristics;
FIG. 15 illustrates a database used for speaker recognition;
FIG. 16 illustrates a database used for emotion recognition;
FIG. 16 is a database used for emotion recognition;
FIG. 17 is a time-based visualization of utterances by persons in the meeting; and
FIG. 18 is a time-based visualization of useful utterances in the meeting.