
Facial Expression Recognition for Measuring Jurors’ Attention in Acoustic Jury Tests / Jamali, R.; Generosi, A.; Villafan, J. Y.; Mengoni, M.; Pelagalli, L.; Battista, G.; Martarelli, M.; Chiariotti, P.; Mansi, S. A.; Arnesano, M.; Castellini, P. - In: SENSORS. - ISSN 1424-8220. - 24:7 (2024). [10.3390/s24072298]

Facial Expression Recognition for Measuring Jurors’ Attention in Acoustic Jury Tests

Battista, G.
2024-01-01

Abstract

The perception of sound greatly impacts users' emotional states, expectations, affective relationships with products, and purchase decisions. Consequently, assessing the perceived quality of sounds through jury testing is crucial in product design. However, the subjective nature of jurors' responses may limit the accuracy and reliability of jury test outcomes. This research explores the utility of facial expression analysis in jury testing to enhance response reliability and mitigate subjectivity. Several quantitative indicators validate the research hypothesis: the correlation between jurors' emotional responses and valence values, the accuracy of jury tests, and the disparities between jurors' questionnaire responses and the emotions measured by facial expression recognition (FER). Specifically, analysis of attention levels across different juror states reveals a discernible decrease, with 70 percent of jurors exhibiting reduced attention in the 'distracted' state and 62 percent in the 'heavy-eyed' state. Moreover, regression analysis shows that the correlation between jurors' valence and their choices in the jury test increases when only data from attentive jurors are considered. This correlation highlights the potential of facial expression analysis as a reliable tool for assessing juror engagement. The findings suggest that integrating facial expression recognition can enhance the accuracy of jury testing in product design by providing a more dependable assessment of user responses and deeper insights into participants' reactions to auditory stimuli.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11381/2986734