SENSOR-INTEGRATED DIGITAL CANVASES ALLOWING ARTISTS TO MANIPULATE PAINTINGS THROUGH GESTURAL INPUTS
DOI: https://doi.org/10.29121/shodhkosh.v7.i4s.2026.7462
Keywords: Sensor-Integrated Canvas, Gesture Recognition, Interactive Digital Art, Human-Computer Interaction, Real-Time Rendering
Abstract
Sensor-integrated digital canvases mark a new development in interactive visual art, enabling artists to control digital paintings through natural gestures. This paper presents a complete architecture for building and deploying such systems, based on the integration of multimodal sensing technologies: motion sensors, depth cameras, inertial measurement units (IMUs), and touch-sensitive interfaces. The proposed architecture couples a robust gesture-processing unit with a high-quality rendering unit to support real-time artistic interaction. A machine-learning-based algorithm for gesture recognition and classification accurately interprets hand movements and body actions, and these gestures are dynamically mapped to artistic operations such as brush strokes, texture manipulation, color mixing, and geometric transformations. In addition, the system's adaptive learning capability tailors its responses to each user's interaction patterns, increasing its expressive range for creative work. Experimental evaluation with a curated gesture set and calibrated hardware shows significant improvements in workspace accuracy, response rate, and control compared with traditional digital input methods. The results demonstrate the system's potential to reshape digital art production by offering artists a more interactive, natural, and responsive creative environment.
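To illustrate the gesture-to-operation mapping described in the abstract, the sketch below shows a minimal dispatcher in Python. It is an assumption-laden illustration, not the authors' implementation: the `Canvas` operations, the rule-based `classify_gesture` function (standing in for the paper's machine-learning model), and all thresholds are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Canvas:
    # Illustrative canvas state: a log of applied operations.
    operations: list = field(default_factory=list)

    def brush_stroke(self, x, y, pressure):
        self.operations.append(("brush", x, y, pressure))

    def mix_color(self, amount):
        self.operations.append(("mix", amount))

def classify_gesture(speed, fingers):
    # Toy rule-based stand-in for the paper's ML classifier
    # (hypothetical features and thresholds).
    if fingers == 1:
        return "stroke"
    if fingers == 2 and speed < 0.5:
        return "mix"
    return "none"

# Dynamic mapping from recognized gestures to artistic operations.
GESTURE_MAP = {
    "stroke": lambda c: c.brush_stroke(10, 20, 0.8),
    "mix": lambda c: c.mix_color(0.3),
}

def handle(canvas, speed, fingers):
    # Recognize a gesture, then dispatch the mapped canvas operation.
    gesture = classify_gesture(speed, fingers)
    action = GESTURE_MAP.get(gesture)
    if action:
        action(canvas)
    return gesture
```

In a real system the classifier would consume sensor streams (depth camera frames, IMU readings) rather than two scalar features, but the dispatch structure stays the same.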
License
Copyright (c) 2026 Al Yusra Sikander, Siddharth Sriram, Dr. Chetanaba G. Rajput, Ponmurugan Panneerselvam, Uma S, Dr. M. Sugadev

This work is licensed under a Creative Commons Attribution 4.0 International License.