EmoEdit: Evoking Emotions through Image Manipulation

  • Jingyuan Yang
  • Jiawei Feng
  • Weibin Luo
  • Dani Lischinski
  • Daniel Cohen-Or
  • Hui Huang*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

3 Scopus citations

Abstract

Affective Image Manipulation (AIM) seeks to modify user-provided images to evoke specific emotions. This task is inherently complex due to its twofold objective: evoking the intended emotion while preserving the original image composition. Existing AIM methods primarily adjust color and style, and often fail to elicit precise, profound emotional shifts. Drawing on psychological insights, we introduce EmoEdit, which extends AIM by incorporating content modifications to enhance emotional impact. Specifically, we construct EmoEditSet, a large-scale AIM dataset of 40,120 image pairs built through emotion attribution and data construction. To make generative models emotion-aware, we design an Emotion Adapter and train it on EmoEditSet. We further propose an instruction loss to capture the semantic variation within each data pair. Our method is evaluated both qualitatively and quantitatively, demonstrating superior performance over state-of-the-art techniques. Additionally, we showcase the portability of our Emotion Adapter to other diffusion-based models, enriching their emotion knowledge with diverse semantics. Code is available at: https://github.com/JingyuanYY/EmoEdit
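The record above describes the method only at the level of the abstract. As a rough, purely hypothetical sketch of how an emotion-conditioned adapter and an auxiliary instruction loss could be attached to a frozen diffusion backbone, consider the following; the adapter architecture, embedding dimensions, the unet call signature, and the loss weight lambda_instr are all assumptions on our part, not details taken from the paper:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class EmotionAdapter(nn.Module):
        # Hypothetical adapter: maps an emotion embedding into the
        # text-conditioning space of a frozen diffusion backbone.
        # Dimensions are assumed, not from the paper.
        def __init__(self, emo_dim=512, cond_dim=768):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(emo_dim, cond_dim),
                nn.GELU(),
                nn.Linear(cond_dim, cond_dim),
            )

        def forward(self, emo_emb):
            return self.net(emo_emb)

    def training_step(adapter, unet, noisy_latents, timesteps, noise,
                      emo_emb, text_cond, instr_emb, lambda_instr=0.1):
        # Standard denoising objective plus an auxiliary "instruction loss"
        # that pulls the adapter output toward the embedding of the per-pair
        # edit instruction (one reading of the abstract). The unet call
        # signature and all tensor shapes here are assumptions.
        emo_tokens = adapter(emo_emb).unsqueeze(1)        # (B, 1, cond_dim)
        cond = torch.cat([text_cond, emo_tokens], dim=1)  # append emotion token
        pred = unet(noisy_latents, timesteps, cond)
        denoise_loss = F.mse_loss(pred, noise)
        instr_loss = F.mse_loss(adapter(emo_emb), instr_emb)
        return denoise_loss + lambda_instr * instr_loss

Keeping the backbone frozen and training only the adapter is one plausible reading of the abstract's claim that the Emotion Adapter is portable to other diffusion-based models.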

Original language: English
Pages (from-to): 24690-24699
Number of pages: 10
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
State: Published - 2025
Event: 2025 IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2025 - Nashville, United States
Duration: 11 Jun 2025 - 15 Jun 2025

Bibliographical note

Publisher Copyright:
© 2025 IEEE.

Keywords

  • affective computing
  • affective image manipulation
  • AIGC
