Coding Small Group Communication with AI: RNNs and Transformers with Context

Cited by: 0
Authors
Pilny, Andrew [1 ,2 ]
Bonito, Joseph [3 ]
Schecter, Aaron [4 ]
Affiliations
[1] Univ Kentucky, Dept Commun, Lexington, KY USA
[2] Univ Kentucky, Dept Sociol, Lexington, KY USA
[3] Univ Arizona, Dept Commun, Tucson, AZ USA
[4] Univ Georgia, Terry Coll Business, Management Informat Syst, Athens, GA USA
Funding
U.S. National Science Foundation;
Keywords
communication; content analysis; meetings; interaction analysis; ARGUMENT;
DOI
10.1177/10464964251314197
Chinese Library Classification
B849 [Applied Psychology];
Discipline Code
040203;
Abstract
This study compares the performance of recurrent neural networks (RNNs) and transformer-based models (DistilBERT) in classifying utterances as dialogue acts. The results show that transformers consistently outperform RNNs, highlighting their usefulness in coding small group interaction. The study also examines the effect of incorporating context, in the form of preceding and following utterances, and finds that adding context yields only modest improvements in model performance and, in some cases, a slight decrease. The study discusses the implications of these findings for small group researchers employing AI models for text classification tasks.
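The context-augmentation strategy described in the abstract, feeding a classifier the preceding and following utterances alongside the target utterance, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the `[SEP]` separator (borrowed from BERT-style tokenizers such as DistilBERT's), and the window size are assumptions.

```python
def add_context(utterances, i, window=1, sep=" [SEP] "):
    """Build a context-augmented input string for utterance i by
    concatenating up to `window` preceding and following utterances,
    joined with a BERT-style [SEP] separator. Illustrative sketch only."""
    start = max(0, i - window)              # clamp at the transcript start
    before = utterances[start:i]            # preceding context
    after = utterances[i + 1:i + 1 + window]  # following context
    return sep.join(before + [utterances[i]] + after)

transcript = [
    "Shall we start?",
    "Yes, let's review the agenda.",
    "First item is the budget.",
]

# Target utterance (index 1) with one utterance of context on each side:
print(add_context(transcript, 1))
# → Shall we start? [SEP] Yes, let's review the agenda. [SEP] First item is the budget.
```

The resulting string would then be tokenized and passed to a sequence classifier (e.g., a fine-tuned DistilBERT) that predicts the dialogue-act label for the target utterance; an RNN baseline could consume the same augmented input.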
Pages: 30