SENSOR-INTEGRATED DIGITAL CANVASES ALLOWING ARTISTS TO MANIPULATE PAINTINGS THROUGH GESTURAL INPUTS

Authors

  • Al Yusra Sikander Assistant Professor, Department of Computer Science and Engineering (AIML), Noida Institute of Engineering and Technology, Greater Noida, Uttar Pradesh, India
  • Siddharth Sriram Centre of Research Impact and Outcome, Chitkara University, Rajpura- 140417, Punjab, India
  • Dr. Chetanaba G. Rajput Assistant Professor, Faculty of Arts, Gokul Global University, Sidhpur, Gujarat, India
  • Ponmurugan Panneerselvam Professor, Department of Research, Meenakshi College of Arts and Science, Meenakshi Academy of Higher Education and Research, Chennai, Tamil Nadu 600080, India
  • Uma S Associate Professor, Meenakshi College of Arts and Science, Meenakshi Academy of Higher Education and Research, Chennai, Tamil Nadu 600080, India
  • Dr. M. Sugadev Associate Professor, Department of Electronics and Communication Engineering, Sathyabama Institute of Science and Technology, Chennai, Tamil Nadu, India

DOI:

https://doi.org/10.29121/shodhkosh.v7.i4s.2026.7462

Keywords:

Sensor-Integrated Canvas, Gesture Recognition, Interactive Digital Art, Human-Computer Interaction, Real-Time Rendering

Abstract [English]

Sensor-integrated digital canvases mark a new direction in interactive visual art, allowing artists to control digital paintings through natural gestures. This paper presents a complete architecture for the construction and deployment of such systems, built on the integration of multimodal sensing technologies including motion sensors, depth cameras, inertial measurement units (IMUs), and touch-sensitive interfaces. The proposed system architecture combines a powerful gesture processing unit with a high-quality rendering unit, enabling artistic interaction in real time. Gesture recognition and classification are performed by an incrementally developed machine-learning algorithm that accurately interprets hand movements and body actions. Recognized gestures are dynamically mapped to artistic operations such as brush strokes, texture manipulation, color mixing, and geometric transformations. In addition, the system's adaptive learning capabilities tailor its responses to individual users' interaction patterns, broadening its expressive range for creative work. Experimental evaluation with a curated gesture set and calibrated hardware shows significant improvements in workspace accuracy, response rate, and control compared with traditional digital input methods. The results demonstrate the system's potential to reshape digital art production by providing a more interactive, natural, and responsive environment for creative work across a wide range of artists.
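The gesture-to-operation mapping the abstract describes can be illustrated with a minimal sketch. All names here (`Gesture`, `GESTURE_ACTIONS`, `dispatch`) are hypothetical and not taken from the paper; the machine-learning classifier that produces the gesture labels is assumed and not shown.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Gesture:
    """A recognized gesture, as a classifier might emit it (hypothetical)."""
    label: str          # e.g. "swipe", "pinch", "rotate"
    intensity: float    # normalized 0..1, e.g. derived from IMU acceleration

# Each artistic operation consumes a gesture and returns a canvas command.
def brush_stroke(g: Gesture) -> str:
    return f"brush stroke, width={g.intensity * 10:.1f}px"

def color_mix(g: Gesture) -> str:
    return f"color mix, blend={g.intensity:.2f}"

def geometric_transform(g: Gesture) -> str:
    return f"rotate canvas, angle={g.intensity * 90:.0f} deg"

# Dispatch table: dynamic mapping from gesture labels to operations.
GESTURE_ACTIONS: Dict[str, Callable[[Gesture], str]] = {
    "swipe": brush_stroke,
    "pinch": color_mix,
    "rotate": geometric_transform,
}

def dispatch(gestures: List[Gesture]) -> List[str]:
    """Map each recognized gesture to its painting operation, skipping unknowns."""
    return [GESTURE_ACTIONS[g.label](g)
            for g in gestures if g.label in GESTURE_ACTIONS]

if __name__ == "__main__":
    stream = [Gesture("swipe", 0.8), Gesture("pinch", 0.5)]
    for command in dispatch(stream):
        print(command)
```

The dispatch-table pattern keeps the mapping "dynamic" in the sense the abstract uses: an adaptive-learning layer could rebind entries of the table at run time as it learns a user's interaction patterns.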


Published

2026-04-11

How to Cite

Sikander, A. Y., Siddharth, Rajput, C. G., Panneerselvam, P., Uma S, & Sugadev, M. S. (2026). SENSOR-INTEGRATED DIGITAL CANVASES ALLOWING ARTISTS TO MANIPULATE PAINTINGS THROUGH GESTURAL INPUTS. ShodhKosh: Journal of Visual and Performing Arts, 7(4s), 229–237. https://doi.org/10.29121/shodhkosh.v7.i4s.2026.7462