Description
This repository contains code and data for the publication "Representations of emotion concepts: Comparison across pairwise, appraisal feature-based, and word embedding-based similarity spaces" by Kwon, M., Wager, T., & Phillips, J. (2022), published in the Proceedings of the Annual Meeting of the Cognitive Science Society, 44(44). The paper is available at https://escholarship.org/uc/item/8vj3d366. All code is written in R.
Updates
We have updated our analyses based on feedback received since the annual meeting. These changes do not alter our main findings or conclusions. The included analysis script (`EMOCON_cogsci2022_revised.Rmd`) reflects the following updates:
- Additional emotion concept pairs: Pairs involving `comfortableness`, `gratefulness`, `relaxedness`, `romanticness`, `sereneness`, and `protectiveness`, which were omitted from the previous version, have been added to the analysis. The revised script includes all pairs and reflects the resulting changes in the feature-based similarity matrix, the correlations between the similarity measures, the loadings from principal component analysis (PCA) on appraisal features, the regression coefficients from multiple regression with the affective feature components from PCA, and the difference scores of the affective feature components.
- Scaling before PCA: PCA was rerun after rescaling the data.
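The rescaling step above can be sketched in R. This is a minimal illustration of running PCA on centered and unit-variance data, not the repository's actual script; the data frame and its column names are hypothetical toy values.

```r
# Toy ratings on features measured on different scales (hypothetical data).
ratings <- data.frame(
  valence = c(1.2, 3.4, 2.1, 4.8, 2.9),
  arousal = c(10, 35, 22, 48, 30),
  control = c(0.1, 0.9, 0.4, 0.7, 0.5)
)

# scale. = TRUE rescales each feature to unit variance before PCA,
# so features on larger numeric scales do not dominate the components.
pca <- prcomp(ratings, center = TRUE, scale. = TRUE)

pca$rotation  # loadings for each component
head(pca$x)   # component scores per observation
```

Without `scale. = TRUE`, `prcomp()` only centers the columns, and a feature like `arousal` (rated on a much wider range here) would dominate the first component.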
`script`: The scripts used for analysis are included in an R Markdown file.
`data`: Data for the appraisal feature ratings, the pairwise similarity ratings, and word embeddings from word2vec models trained on Google News and Wikipedia are included. References for these data are listed below and in the paper. Word embeddings from GPT-3 can be obtained via [OpenAI](https://openai.com/)'s API; documentation on generating embeddings is available [here](https://beta.openai.com/docs/guides/embeddings/what-are-embeddings).
- Word embeddings from the W2V model trained on Google News: Mikolov, T., Sutskever, I., Chen, K., Corrado, G. S., & Dean, J. (2013). Distributed representations of words and phrases and their compositionality. Advances in Neural Information Processing Systems, 26.
- Word embeddings from the W2V model trained on Wikipedia: Fares, M., Kutuzov, A., Oepen, S., & Velldal, E. (2017). Word vectors, reuse, and replicability: Towards a community repository of large-text resources. Proceedings of the 21st Nordic Conference on Computational Linguistics, 271–276.
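A common way to turn word embeddings like these into an embedding-based similarity space is cosine similarity between concept vectors. The sketch below shows this computation in R under stated assumptions: the vectors are short toy examples, not real word2vec or GPT-3 embeddings, and the concept names are placeholders.

```r
# Cosine similarity: dot product of two vectors divided by the
# product of their Euclidean norms. Returns a value in [-1, 1].
cosine_sim <- function(a, b) {
  sum(a * b) / (sqrt(sum(a^2)) * sqrt(sum(b^2)))
}

# Toy embedding vectors (hypothetical; real embeddings have hundreds of dimensions).
v_joy   <- c(0.2, 0.8, 0.1, 0.5)
v_pride <- c(0.3, 0.7, 0.0, 0.6)

cosine_sim(v_joy, v_pride)
```

Computing this for every pair of emotion concepts yields a similarity matrix that can be compared against the pairwise-rating and appraisal-feature-based similarity spaces.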
(2023-01-25)
Related Publication
Kwon, M., Wager, T., & Phillips, J. (2022). Representations of emotion concepts: Comparison across pairwise, appraisal feature-based, and word embedding-based similarity spaces. Proceedings of the Annual Meeting of the Cognitive Science Society, 44(44). URL: https://escholarship.org/uc/item/8vj3d366