This project presents an innovative approach to human-computer interaction through a 3D flower avatar: a rose that visually represents human emotions through color and dynamic movement. The flower model is built, colored, and animated in the Blender environment. Using the pre-trained natural language processing model “Emotion English DistilRoBERTa-base” \cite{liu2019roberta}, text input from the user is analyzed to compute scores for seven distinct emotions. These scores then drive the transformation of the 3D flower model, altering its color and movement to convey the user's emotion. Both properties are predicted by machine learning models to generate expressiveness, and the datasets used to train these models were built from scratch. The flower's color is controlled through RGB values, while its movement is controlled through location, rotation, and scale parameters. A multioutput regression model predicts the RGB values, enabling dynamic adjustment of the flower's color in real time. Similarly, an ensemble of machine learning models is trained to predict the animation parameters, enabling the flower to exhibit movements that reflect the user's emotions. The project's novelty lies in its interdisciplinary approach, combining emotion detection, machine learning, and 3D modeling to create an empathetic, interactive flower avatar. The implementation showcases the potential of blending artistic expression with technological innovation to foster richer and more expressive forms of digital interaction.
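The multioutput RGB prediction described above can be sketched as follows. This is a minimal illustration, assuming scikit-learn and a synthetic stand-in for the project's color dataset; the feature layout (seven emotion scores per row) follows the description above, but the model choice and data here are hypothetical.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

# Hypothetical training data: each input row holds scores for the seven
# emotions (e.g. anger, disgust, fear, joy, neutral, sadness, surprise);
# each target row holds the RGB triple chosen for that score profile.
X = rng.random((200, 7))
y = rng.random((200, 3))  # RGB channels, normalized to [0, 1]

# RandomForestRegressor supports multioutput targets natively, so a
# single model predicts all three color channels at once.
model = RandomForestRegressor(n_estimators=50, random_state=0)
model.fit(X, y)

# Predict an RGB triple for a new emotion-score vector.
scores = rng.random((1, 7))
rgb = model.predict(scores)[0]
print(rgb.shape)  # one value per channel: (3,)
```

The predicted triple can then be written into the flower material's base color inside Blender to recolor the rose in real time.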
To get started, download the Blender environment, the 3D flower model (rose), and the animation parameter prediction model. Open the 3D model in Blender, paste the contents of project-script.py into Blender's built-in code editor, and run the script through Blender's bundled Python. You will then be prompted to enter a text input; once you do, the rose responds expressively, including its animation. You can also train your own animation parameter prediction model using the animation params dataset.
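Training your own animation parameter prediction model might look like the sketch below. This assumes scikit-learn and uses synthetic data as a stand-in for the animation params dataset; the nine-parameter target layout (location, rotation, and scale, each with x, y, z) follows the description above, while the specific estimator is a hypothetical choice.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(1)

# Stand-in for the animation params dataset: seven emotion scores per
# row; targets are location (x, y, z), rotation (x, y, z), and
# scale (x, y, z) -- nine animation parameters in total.
X = rng.random((300, 7))
y = rng.random((300, 9))

# GradientBoostingRegressor is single-output, so MultiOutputRegressor
# wraps it to fit one booster per animation parameter, forming a small
# ensemble across the nine outputs.
model = MultiOutputRegressor(GradientBoostingRegressor(random_state=1))
model.fit(X, y)

# Predict animation parameters for a new emotion-score vector and
# split them into the three transform groups used by Blender.
params = model.predict(rng.random((1, 7)))[0]
location, rotation, scale = params[:3], params[3:6], params[6:]
```

The three groups map directly onto a Blender object's location, rotation, and scale channels when keyframing the rose's movement.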