To enhance interaction between a user and a virtual agent through a speech interface, technology that detects user personality from speech signals needs to be studied. In this study, personality patterns were automatically classified as either extroverted or introverted. The patterns were recognized from non-verbal cues such as speech rate, energy, pitch, and silent intervals, together with their patterns of change. In experiments, a maximum classification accuracy of 86.3% was achieved. Using the same data, a second classification test was carried out manually by human judges to benchmark the automatic classification of personality traits. This manual test yielded an accuracy of 86.6%, showing that automatic classification of personality traits can reach a level of performance comparable to that of humans. In the automatic classification, the silent-intervals feature performed particularly well, whereas in the manual test by human judges, pitch was the key factor for higher accuracy. These findings will be useful and applicable in future studies. © 2013 The Authors.
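The abstract names the cues used (speech rate, energy, pitch, silent intervals) but not the classifier or feature definitions. The following is a minimal sketch, assuming librosa and scikit-learn, of how such prosodic features might be extracted and fed to a binary extrovert/introvert classifier; the silence-ratio and speech-rate proxies and the SVM choice are illustrative assumptions, not the authors' method.

```python
# Hedged sketch (not the paper's implementation): prosodic feature extraction
# with librosa and a binary extrovert/introvert classifier with scikit-learn.
import numpy as np
import librosa
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def prosodic_features(path, sr=16000):
    """Crude stand-ins for the cues named in the abstract:
    pitch, energy, silent intervals, and a speech-rate proxy."""
    y, sr = librosa.load(path, sr=sr)

    # Pitch (F0) statistics across frames.
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)
    f0 = f0[np.isfinite(f0)]

    # Short-time energy (RMS) statistics.
    rms = librosa.feature.rms(y=y)[0]

    # Silent intervals: fraction of the signal outside non-silent spans.
    voiced = librosa.effects.split(y, top_db=30)
    voiced_len = sum(end - start for start, end in voiced)
    silence_ratio = 1.0 - voiced_len / len(y)

    # Speech-rate proxy: voiced segments per second (illustrative assumption).
    rate_proxy = len(voiced) / (len(y) / sr)

    return np.array([f0.mean(), f0.std(), rms.mean(), rms.std(),
                     silence_ratio, rate_proxy])

def train_classifier(wav_paths, labels):
    # labels: 1 = extroverted, 0 = introverted, one per utterance.
    X = np.vstack([prosodic_features(p) for p in wav_paths])
    clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
    clf.fit(X, labels)
    return clf
```

In a setup like this, change patterns of the cues (e.g., frame-to-frame deltas of pitch and energy) could be appended to the feature vector in the same way.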
CITATION STYLE
Kwon, S., Yeon Choeh, J., & Lee, J. W. (2013). User-Personality Classification Based on the Non-Verbal Cues from Spoken Conversations. International Journal of Computational Intelligence Systems, 6(4), 739–749. https://doi.org/10.1080/18756891.2013.804143