MULTI-SENSOR FUSION TECHNOLOGY FOR CREATING IMMERSIVE OUTDOOR VISUAL ART EXPERIENCES

Authors

  • Ms. Arpita A. Prajapati Lecturer, Faculty of Engineering, Gokul Global University, Sidhpur, Gujarat, India
  • Ponmurugan Panneerselvam Professor, Department of Research, Meenakshi College of Arts and Science, Meenakshi Academy of Higher Education and Research, Chennai, Tamil Nadu 600080, India
  • Bipin Sule Senior Professor, Department of DESH, Vishwakarma Institute of Technology, Pune, Maharashtra 411037, India
  • Dr. Sahaya Anselin Nisha A Professor, Department of Electronics and Communication Engineering, Sathyabama Institute of Science and Technology, Chennai, Tamil Nadu, India
  • Mr. Debanjan Ghosh Assistant Professor, Department of Computer Science and IT, Arka Jain University, Jamshedpur, Jharkhand, India
  • Mridula Gupta Centre of Research Impact and Outcome, Chitkara University, Rajpura- 140417, Punjab, India
  • Asha Rani G Assistant Professor, Meenakshi College of Arts and Science, Meenakshi Academy of Higher Education and Research, Chennai, Tamil Nadu 600080, India

DOI:

https://doi.org/10.29121/shodhkosh.v7.i4s.2026.7480

Keywords:

Multi-Sensor Fusion, Immersive Art Systems, Outdoor Interactive Installations, Multimodal Data Processing, Creative Computing

Abstract [English]

Multi-sensor fusion technology has emerged as a groundbreaking methodology for developing immersive outdoor visual art experiences by combining heterogeneous sensor data into adaptive artistic systems. This paper introduces a general framework for a multi-sensor fusion-based art system that uses visual (RGB-D cameras), motion (IMU), spatial (LiDAR), environmental (temperature, humidity, light), and biometric sensors to support dynamic, context-responsive artistic interactions in outdoor settings. The proposed methodology employs multimodal feature extraction to capture visual patterns, motion dynamics, and environmental changes, and then applies a hybrid fusion algorithm combining deep learning architectures with statistical models to achieve robust data synthesis and real-time responsiveness. A time-aware collection unit ensures temporal synchronization of the sensor streams, improving the system's reliability across diverse outdoor application tasks. Experimental evaluation shows that the proposed system is substantially more accurate, responsive, and engaging than single-sensor and baseline systems. The findings indicate improved spatial awareness, adaptive content creation, and interaction fidelity, contributing to a more immersive and individually centered artistic experience.
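To illustrate the kind of time-synchronized fusion the abstract describes, the sketch below aligns two sensor streams sampled at different rates by nearest-timestamp matching and combines them with fixed weights. This is a minimal illustrative sketch, not the paper's actual algorithm: the `Sample` type, the stream names, the weights, and the nearest-neighbor alignment strategy are all assumptions for demonstration; the paper's hybrid deep-learning/statistical fusion is far more elaborate.

```python
from dataclasses import dataclass
import bisect

@dataclass
class Sample:
    t: float      # timestamp in seconds
    value: float  # scalar sensor reading (placeholder for a feature vector)

def nearest(stream, t):
    """Return the sample in a time-sorted stream closest to timestamp t."""
    times = [s.t for s in stream]
    i = bisect.bisect_left(times, t)
    candidates = stream[max(i - 1, 0):i + 1]
    return min(candidates, key=lambda s: abs(s.t - t))

def fuse(streams, weights, t):
    """Weighted fusion of the nearest-in-time sample from each stream."""
    total = sum(weights.values())
    return sum(weights[name] * nearest(stream, t).value
               for name, stream in streams.items()) / total

# Two hypothetical streams with different sampling rates
imu = [Sample(0.00, 1.0), Sample(0.01, 1.2), Sample(0.02, 1.1)]
lidar = [Sample(0.00, 5.0), Sample(0.05, 5.4)]

fused = fuse({"imu": imu, "lidar": lidar},
             {"imu": 0.6, "lidar": 0.4}, t=0.014)
```

In a real installation the fixed weights would be replaced by the learned, context-dependent weighting the hybrid fusion algorithm provides, and each `value` would be a multimodal feature vector rather than a scalar.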

References

Chen, H., Tan, Z., and Sun, P. (2024). Research on Wind Environment Simulation in Five Types of "Gray Spaces" in Traditional Jiangnan Gardens, China. Sustainability, 16(7765). https://doi.org/10.3390/su16177765

Chen, K., Meng, Z., Xu, X., She, C., and Zhao, P. G. (2024). Real-Time Interactions Between Human Controllers and Remote Devices in Metaverse. arXiv preprint arXiv:2407.16591. https://doi.org/10.1109/MetroXRAINE62247.2024.10795969

Dai, T., and Zheng, X. (2021). Understanding how Multi-Sensory Spatial Experience Influences Atmosphere, Affective City Image and Behavioural Intention. Environmental Impact Assessment Review, 89, 106595. https://doi.org/10.1016/j.eiar.2021.106595

Hatami, M., Qu, Q., Chen, Y., Kholidy, H., Blasch, E., and Ardiles-Cruz, E. (2024). A Survey of the Real-Time Metaverse: Challenges and Opportunities. Future Internet, 16(379). https://doi.org/10.3390/fi16100379

Howes, D. (2019). Multisensory Anthropology. Annual Review of Anthropology, 48, 17–28. https://doi.org/10.1146/annurev-anthro-102218-011324

Kenwright, B. (2020). There’s More to Sound than Meets the Ear: Sound in Interactive Environments. IEEE Computer Graphics and Applications, 40(3), 62–70. https://doi.org/10.1109/MCG.2020.2996371

Khetani, V., Gandhi, Y., Bhattacharya, S., Ajani, S. N., and Limkar, S. (2023). Cross-Domain Analysis of ML and DL: Evaluating Their Impact in Diverse Domains. International Journal of Intelligent Systems and Applications in Engineering, 11(7s), 253–262.

Spence, C. (2020). Senses of Place: Architectural Design for the Multisensory Mind. Cognitive Research: Principles and Implications, 5, 46. https://doi.org/10.1186/s41235-020-00243-4

Sun, H., and Chen, Y. (2024). A Rapid Response System for Elderly Safety Monitoring Using Progressive Hierarchical Action Recognition. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 32, 2134–2142. https://doi.org/10.1109/TNSRE.2024.3409197

Tang, M., Cai, S., and Lau, V. K. (2021). Over-the-Air Aggregation with Multiple Shared Channels and Graph-Based State Estimation for Industrial IoT Systems. IEEE Internet of Things Journal, 8(18), 14638–14657. https://doi.org/10.1109/JIOT.2021.3071339

Tang, M., Cai, S., and Lau, V. K. (2022). Online System Identification and Optimal Control for Mission-Critical IoT Systems Over MIMO Fading Channels. IEEE Internet of Things Journal, 9(21), 21157–21173. https://doi.org/10.1109/JIOT.2022.3175965

Ullo, S. L., and Sinha, G. R. (2020). Advances in Smart Environment Monitoring Systems Using IoT and Sensors. Sensors, 20(3113). https://doi.org/10.3390/s20113113

Vidya, M. (2025). The Interplay of Psychological and Cultural Factors in Consumer Decision-Making for Branded Apparel. International Journal of Recent Developments in Management Research, 14(1), 269–272. https://doi.org/10.65521/ijrdmr.v14i1.684

Xu, R., Nikouei, S. Y., Nagothu, D., Fitwi, A., and Chen, Y. (2020). BlendSPS: A Blockchain-Enabled Decentralized Smart Public Safety System. Smart Cities, 3(3), 928–951. https://doi.org/10.3390/smartcities3030047

Published

2026-04-11

How to Cite

Prajapati, A. A., Panneerselvam, P., Sule, B., Nisha A, S. A., Ghosh, D., Gupta, M., & Rani G, A. (2026). MULTI-SENSOR FUSION TECHNOLOGY FOR CREATING IMMERSIVE OUTDOOR VISUAL ART EXPERIENCES. ShodhKosh: Journal of Visual and Performing Arts, 7(4s), 36–45. https://doi.org/10.29121/shodhkosh.v7.i4s.2026.7480