Multimodal Emotion Recognition Papers
- A snapshot research and implementation of multimodal information fusion for data-driven emotion recognition
- Combining Eye Movements and EEG to Enhance Emotion Recognition
- Personality-Aware Personalized Emotion Recognition from Physiological Signals
- Adapting BERT for Target-Oriented Multimodal Sentiment Classification
- End-to-End Multimodal Emotion Recognition using Deep Neural Networks
- Multimodal Sentiment Analysis using Hierarchical Fusion with Context Modeling
- Multimodal Local-Global Ranking Fusion for Emotion Recognition
- Visual-Textual Emotion Analysis with Deep Coupled Video and Danmu Neural Networks
- EmoBed: Strengthening Monomodal Emotion Recognition via Training with Crossmodal Emotion Embeddings
- Multimodal Intelligence: Representation Learning, Information Fusion, and Applications
- M3ER: Multiplicative Multimodal Emotion Recognition using Facial, Textual, and Speech Cues
- Context Based Emotion Recognition using EMOTIC Dataset
- Cooperative Multimodal Approach to Depression Detection in Twitter
- VistaNet: Visual Aspect Attention Network for Multimodal Sentiment Analysis
- Multi-Interactive Memory Network for Aspect Based Multimodal Sentiment Analysis
- Predicting Emotions in User-Generated Videos
- Emotion Recognition using Multimodal Residual LSTM Network
- A review of affective computing: From unimodal analysis to multimodal fusion
- Multimodal Deep Learning Framework for Mental Disorder Recognition
- A Multimodal Deep Regression Bayesian Network for Affective Video Content Analyses
- Emotion Recognition in Context
- Context-Aware Emotion Recognition Networks
- EmotiCon: Context-Aware Multimodal Emotion Recognition using Frege's Principle
- A survey of multimodal sentiment analysis