APML, a Markup Language for Believable Behavior Generation

  • De Carolis B
  • Pelachaud C
  • Poggi I
  • Steedman M

Abstract

Developing an embodied conversational agent able to exhibit human-like behavior while communicating with other virtual or human agents requires enriching the agent's dialogue with nonverbal information. Our agent is defined as two components: a Mind and a Body. Her Mind reflects her personality, her social intelligence, and her emotional reactions to events occurring in the environment. Her Body corresponds to her physical appearance and is able to display expressive behaviors. We designed a Mind-Body interface that takes as input a specification of a discourse plan in an XML language (DPML), enriches this plan with the communicative meanings to be attached to it, and produces an input to the Body in a new XML language (APML). Moreover, we have developed a language for describing facial expressions; it combines basic facial expressions with operators to create complex ones. The purpose of this paper is to describe these languages and to illustrate our approach to generating the behavior of an agent able to act consistently with her goals and with the context in which the conversation takes place.
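
To give a concrete sense of the approach, a short sketch of what an APML-style annotation produced by the Mind-Body interface might look like is shown below. The element and attribute names here are illustrative assumptions for this summary, not the published APML schema described in the paper.

    <!-- Hypothetical APML-style markup: tag names and attributes are
         assumed for illustration and may differ from the actual language. -->
    <apml>
      <performative type="inform" affect="sorry-for">
        <theme>I am afraid that</theme>
        <rheme>
          you suffer from a mild form of
          <emphasis level="strong">angina</emphasis>.
        </rheme>
      </performative>
    </apml>

In this kind of pipeline, each communicative function carried by the tags (the performative, the affective state, the emphasis) would be mapped by the Body onto concrete signals such as facial expressions, gaze, and head movements.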

Citation (APA)

De Carolis, B., Pelachaud, C., Poggi, I., & Steedman, M. (2004). APML, a Markup Language for Believable Behavior Generation (pp. 65–85). https://doi.org/10.1007/978-3-662-08373-4_4
