pp. 135-146
S&M3503 Research Paper of Special Issue
https://doi.org/10.18494/SAM4537
Published: January 24, 2024

Color Palette Generation of Mixed Color Images Using Autoencoder
Tzren-Ru Chou and Jie-Yun Shao
(Received May 31, 2023; Accepted January 9, 2024)

Keywords: color palette, color image, color filter, autoencoder
The color sensor, which is fundamental to the design of image filter algorithms for smartphones, is often developed on the basis of the demands of commercial imaging applications. The creation of a color palette is an essential step for designers when planning color schemes. In this study, we introduce a method for generating blended color imagery, referred to as the "color image" in some previous works,(8,9) to construct personalized palettes that cater to individual emotional needs. We modified the training approach of the autoencoder by incorporating emotion-related data into the training samples. This allowed us to establish a correspondence between color imagery and palettes on the basis of Kobayashi's color image scale. Through linear interpolation between different types of imagery, we derived the emotional coordinates of the blended color imagery and fed these coordinates into the trained autoencoder model to reconstruct the corresponding palette. We conducted a visual evaluation experiment, and the results showed that the emotions conveyed by the generated blended color imagery palettes are consistent with human perception. Additionally, we presented two sample applications: emotional filters and background frameworks. We anticipate that the findings of this study can offer a new perspective for the development and application of the color sensor.
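The abstract describes the pipeline only at a high level. The following minimal sketch illustrates the data flow it outlines, assuming a two-axis (warm-cool, soft-hard) emotion coordinate in the style of Kobayashi's color image scale and a five-color palette output; the placeholder decoder weights stand in for the paper's trained autoencoder, whose architecture is not specified here.

```python
import numpy as np

# Illustrative sketch only: axis conventions, palette size, and decoder
# weights are assumptions, not the paper's actual trained model.
rng = np.random.default_rng(0)

# Hypothetical emotional coordinates of two single color images.
coord_a = np.array([-0.6, 0.4])   # e.g., "cool, soft"
coord_b = np.array([0.5, -0.2])   # e.g., "warm, hard"

# Linear interpolation gives the coordinate of the blended color image.
alpha = 0.5
blended = (1 - alpha) * coord_a + alpha * coord_b

# Placeholder decoder: maps the 2-D emotion coordinate to a 5-color
# palette (15 RGB values). Random weights are used only to show shapes;
# real weights would come from the trained autoencoder.
W = rng.normal(size=(15, 2))
b = rng.normal(size=15)

def decode_palette(coord):
    """Map an emotion coordinate to a 5x3 RGB palette in [0, 1]."""
    rgb = 1 / (1 + np.exp(-(W @ coord + b)))   # sigmoid keeps values in [0, 1]
    return rgb.reshape(5, 3)

palette = decode_palette(blended)
print(np.round(palette, 3))
```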
Corresponding author: Tzren-Ru Chou

This work is licensed under a Creative Commons Attribution 4.0 International License.

Cite this article: Tzren-Ru Chou and Jie-Yun Shao, Color Palette Generation of Mixed Color Images Using Autoencoder, Sens. Mater., Vol. 36, No. 1, 2024, pp. 135-146.