Dialnet


Multimodal Assessment of Shopping Behavior

  • Authors: Mirela Carmia Popa
  • Published in: ELCVIA. Electronic Letters on Computer Vision and Image Analysis, ISSN-e 1577-5097, Vol. 14, No. Extra 3, 2015 (Special Issue on Recent PhD Thesis Dissemination), pp. 1-3
  • Language: English
  • Abstract
    • Automatic understanding and recognition of human shopping behavior has many potential applications and is attracting increasing interest in the marketing domain. A first behavioral cue concerns human movement patterns; context information is then used to obtain a better overview of what is happening inside the environment. Further behavioral information can be extracted by analyzing interaction patterns with objects in the environment. Finally, facial expressions, which can be used to assess a person's reaction to an object (in our case study, a product), are employed as another informative behavioral cue. Each intermediary analysis stream (trajectory analysis, action recognition, the ROI detection module, and facial expression analysis) provides an input to the reasoning model, which, based on these observables, formulates a hypothesis about the most likely behavioral model. We integrated the different types of information at the semantic level by implementing a multi-level framework. Finally, we evaluated the system in the ShopLab, in a real supermarket, and assessed product appreciation in a laboratory setting. The results show the feasibility of the approach in the recognition of trajectories (93%), shopping actions (91.6%), action units (93%), facial expressions (84%), and the most important behavioral types (87%).
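The abstract describes semantic-level fusion: each analysis stream emits symbolic observables, and a reasoning model scores candidate behavioral types against them. A minimal sketch of that idea follows; the stream names, behavioral types, and scoring rule are illustrative assumptions, not the thesis's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Observable:
    stream: str       # e.g. "trajectory", "action", "roi", "expression"
    label: str        # symbolic output of that stream
    confidence: float # how sure the stream is about this label

# Hypothetical behavioral models, each described by the
# (stream, label) observables expected to support it.
BEHAVIOR_MODELS = {
    "goal_oriented": {("trajectory", "direct_path"), ("action", "grab_product")},
    "browsing":      {("trajectory", "wandering"), ("roi", "long_dwell")},
    "disoriented":   {("trajectory", "looping"), ("expression", "confusion")},
}

def most_likely_behavior(observables: list[Observable]) -> str:
    """Score each behavioral type by the total confidence of the
    observables that match its model, and return the best one."""
    scores = {}
    for behavior, expected in BEHAVIOR_MODELS.items():
        scores[behavior] = sum(o.confidence for o in observables
                               if (o.stream, o.label) in expected)
    return max(scores, key=scores.get)

obs = [
    Observable("trajectory", "wandering", 0.8),
    Observable("roi", "long_dwell", 0.7),
    Observable("expression", "confusion", 0.3),
]
print(most_likely_behavior(obs))  # prints "browsing" (1.5 vs. 0.3)
```

A real system would replace the additive scoring with the thesis's reasoning model, but the shape of the interface, symbolic observables in, behavioral hypothesis out, is the same.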

