Facial emotion recognition using context based multimodal approach

  • Authors: Priya Metri, Jayshree Ghorpade, Ayesha Butalia
  • Published in: IJIMAI, ISSN-e 1989-1660, Vol. 1, No. 4, 2011, pp. 12-15
  • Language: English
  • Abstract
    • Emotions play a crucial role in person-to-person interaction. In recent years, there has been growing interest in improving all aspects of interaction between humans and computers. The ability to understand human emotions, especially by observing facial expressions, is desirable for the computer in several applications. This paper explores a way of human-computer interaction that enables the computer to be more aware of the user's emotional expressions. We present an approach to emotion recognition from facial expression and from hand and body posture. Our model uses a multimodal emotion recognition system in which two different models, one for facial expression recognition and one for hand and body posture recognition, are trained separately; the results of both classifiers are then combined by a third classifier, which outputs the resulting emotion. A multimodal system gives more accurate results than a single-modal or bimodal system.
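The fusion scheme the abstract describes, two unimodal classifiers whose outputs a third classifier combines, can be sketched roughly as follows. The emotion labels, the weighted-sum fusion rule, and the weights are illustrative assumptions only; the paper's actual combiner is itself a trained classifier, not a fixed rule.

```python
# Hedged sketch of late fusion for multimodal emotion recognition:
# each modality (face; hand/body posture) yields a probability
# distribution over emotions, and a third combining stage fuses them.
# All names and the weighted-sum rule below are illustrative assumptions.

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

def fuse(face_probs, posture_probs, w_face=0.6, w_posture=0.4):
    """Combine the two modality scores with a fixed weighted sum."""
    return {
        e: w_face * face_probs.get(e, 0.0) + w_posture * posture_probs.get(e, 0.0)
        for e in EMOTIONS
    }

def predict(face_probs, posture_probs):
    """Third-stage combiner: pick the emotion with the highest fused score."""
    fused = fuse(face_probs, posture_probs)
    return max(fused, key=fused.get)

# Example: both modalities lean toward anger, so the fused decision is anger.
face = {"anger": 0.7, "happiness": 0.3}
posture = {"anger": 0.6, "sadness": 0.4}
print(predict(face, posture))  # -> anger
```

In practice the combining stage would be learned from data (e.g. stacked generalization over the two unimodal outputs), so that it can weight each modality according to how reliable it is for each emotion.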

