Contributions to Automatic Interaction Analysis of Meetings
Format: Paperback (flexible binding)
Psychological interaction analyses (IAs) of face-to-face meetings are typically based on a large number of audio and video recordings that must be carefully annotated by hand before the actual analysis can begin. Because this process is time-consuming and costly, it limits both the amount of data that can be taken into account and the level of detail that can be reached in the analysis. Automatic interaction analysis (AIA) of meetings has therefore become a vital research topic that aims to facilitate psychological interaction studies by developing methods for the automatic processing and analysis of meetings.
An AIA of meetings based on acoustic data requires high-quality audio recordings of each meeting participant. This is best achieved with multichannel audio recordings, in which each participant is equipped with a close-talk microphone and recorded in an individual target microphone channel. However, crosstalk is a common problem in such recordings: the speech of each participant leaks into the other participants' channels and disturbs their signals. As a result, crosstalk significantly complicates the processing of the audio signals and strongly degrades the performance of speech analysis methods for an AIA.
To address this problem, this thesis presents an overall meeting emotion analysis system (OMEAS) that is able to deal with high levels of crosstalk. It consists of three methods that, in combination, reduce the crosstalk in the target microphone signals, detect all utterances of the target speakers, and estimate the speakers' emotions in each detected utterance. The resulting output signals can then serve as a basis for further analyses, so that the OMEAS already facilitates and improves the time-consuming annotation process for psychological IAs.
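The three-stage structure described above (crosstalk reduction, utterance detection, emotion estimation) can be illustrated with a minimal sketch. All function names and the placeholder logic below are assumptions for illustration only; they are not the methods developed in the thesis, which the abstract does not specify.

```python
# Hypothetical sketch of a three-stage OMEAS-style pipeline.
# Each stage is a toy placeholder, not the thesis's actual method.

def reduce_crosstalk(target, interferers, leakage=0.1):
    """Toy crosstalk reduction: subtract a fixed fraction of each
    interfering channel from the target channel (assumed leakage)."""
    out = list(target)
    for ch in interferers:
        out = [t - leakage * i for t, i in zip(out, ch)]
    return out

def detect_utterances(signal, threshold=0.5, min_len=2):
    """Toy utterance detection: return (start, end) index pairs of
    runs where the signal magnitude exceeds a fixed threshold."""
    spans, start = [], None
    for i, x in enumerate(signal):
        if abs(x) > threshold:
            if start is None:
                start = i
        elif start is not None:
            if i - start >= min_len:
                spans.append((start, i))
            start = None
    if start is not None and len(signal) - start >= min_len:
        spans.append((start, len(signal)))
    return spans

def estimate_emotion(segment):
    """Toy stand-in for an emotion classifier: labels a segment
    by its mean energy (a real system would use trained models)."""
    energy = sum(x * x for x in segment) / len(segment)
    return "aroused" if energy > 1.0 else "calm"

def analyse_channel(target, interferers):
    """Chain the three stages for one target microphone channel."""
    clean = reduce_crosstalk(target, interferers)
    return [(s, e, estimate_emotion(clean[s:e]))
            for s, e in detect_utterances(clean)]
```

The point of the sketch is only the data flow: each target channel is first cleaned, utterances are then located in the cleaned signal, and an emotion label is attached per utterance, yielding annotated segments that downstream psychological IAs can build on.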