PLATFORM GOVERNANCE AND ALGORITHMIC ACCOUNTABILITY IN SHAPING PUBLIC DISCOURSE

Authors

  • Dr. Arpita Sneh, Research Scholar, Veer Bahadur Singh Purvanchal University, Jaunpur, India
  • Dr. Bhavna Upadhyaya, Assistant Professor, Jagran School of Journalism, Jagran Lakecity University, Bhopal, India
  • Dr. Prakash Mishra, Faculty, Makhanlal Chaturvedi National University of Journalism and Communication, Bhopal, India
  • Sarthak Kumar, Assistant Professor, Mody University of Science and Technology, Lakshmangarh, India
  • Mayank Jain, Assistant Professor, Mangalayatan University, Aligarh, India

DOI:

https://doi.org/10.29121/shodhkosh.v7.i7s.2026.7652

Keywords:

Platform Governance, Algorithmic Accountability, Public Discourse, Digital Public Sphere, Content Moderation, Media Literacy, Platform Society

Abstract [English]

The rise of algorithm-driven digital platforms has changed how public discussions are initiated, mediated, and contested. This study examines how platform governance structures and accountability practices affect the quality, diversity, and inclusivity of public discourse on major social media platforms. Using a mixed-methods approach that combines a structured survey (N = 412) with an analysis of platform transparency reports, the study applies Habermas's Public Sphere Theory and Van Dijck's Platform Society framework to examine the tensions between the commercial logic of platforms and democratic communication ideals. Statistical tests, including Cronbach's alpha reliability testing (α = 0.84), confirmatory factor analysis, chi-square tests, and independent-samples t-tests, reveal significant differences in how demographic groups perceive algorithmic fairness (p < 0.001). Notably, users with lower digital literacy scores report significantly less trust in platform governance mechanisms (t = 4.67, p < 0.001), and content moderation practices are perceived as disproportionately targeting marginalised communities (χ² = 24.31, df = 4, p < 0.001). These results challenge the assumption that algorithmic neutrality can exist within profit-driven platforms. The paper contributes to the debate on algorithmic accountability by proposing a Governance-Discourse Alignment Index (GDAI) as both a theoretical and practical tool for assessing how closely platforms adhere to democratic communication standards. Policy implications include mandatory algorithmic impact assessments, independent oversight, and government-supported media literacy programmes. The study also discusses its limitations and outlines directions for future cross-platform and longitudinal research.
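The reliability statistic reported above (Cronbach's α = 0.84) follows the standard internal-consistency formula for a respondents-by-items score matrix. The sketch below illustrates that computation on hypothetical Likert-scale data, not the study's actual survey responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of scale items
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' summed scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items
scores = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 2, 3, 3],
])
print(round(cronbach_alpha(scores), 2))  # → 0.93
```

Values above the conventional 0.70 threshold (Nunnally, 1978) indicate acceptable internal consistency, so the study's α = 0.84 supports treating its survey items as a coherent scale.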

References

Araujo, T., Helberger, N., Kruikemeier, S., & de Vreese, C. H. (2022). In AI we trust? Perceptions and attitudes about algorithmic news selection. Digital Journalism, 10(4), 619–639. https://doi.org/10.1080/21670811.2022.2031701

Bandy, J. (2021). Problematic machine behavior: A systematic literature review of algorithm audits. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW1), 1–26. https://doi.org/10.1145/3449148

Brown, M. A., Ludwig, C., & Munger, K. (2022). Echo chambers, rabbit holes, and algorithmic bias: How YouTube recommends content to real users. Political Communication, 39(5), 650–671. https://doi.org/10.1080/10584609.2022.2025397

Cabral, L., Geradin, D., & Kiriazis, N. (2023). The Digital Services Act: An economic and legal analysis. Journal of Competition Law & Economics, 19(2), 143–189. https://doi.org/10.1093/joclec/nhad001

Creswell, J. W., & Plano Clark, V. L. (2018). Designing and conducting mixed methods research (3rd ed.). SAGE.

Gillespie, T. (2022). Content moderation, AI, and the question of scale. Big Data & Society, 9(2), 1–6. https://doi.org/10.1177/20539517221129749

Habermas, J. (1984). The theory of communicative action: Vol. 1. Reason and the rationalization of society (T. McCarthy, Trans.). Beacon Press.

Habermas, J. (1989). The structural transformation of the public sphere (T. Burger, Trans.). MIT Press. (Original work published 1962)

Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling, 6(1), 1–55. https://doi.org/10.1080/10705519909540118

Huszár, F., Ktena, S. I., O'Brien, C., Belli, L., Schlaikjer, A., & Hardt, M. (2022). Algorithmic amplification of politics on Twitter. Proceedings of the National Academy of Sciences, 119(1), e2025334119. https://doi.org/10.1073/pnas.2025334119

Jadhav, R. K., E, S., Kurulekar, M., Goel, P., Bhat, U., & Upadhyay, M. (2026). Automated editing tools for media students: A comparative study. ShodhKosh: Journal of Visual and Performing Arts, 7(1s), 107–116. https://doi.org/10.29121/shodhkosh.v7.i1s.2026.7075

Klonick, K. (2023). The governance of online speech: Platform constitutionalism and its discontents. Yale Law Journal, 132(6), 1601–1662. https://doi.org/10.2307/yjlf.132.6.1601

Noble, S. U. (2021). Algorithms of oppression: How search engines reinforce racism (Revised ed.). NYU Press.

Nunnally, J. C. (1978). Psychometric theory (2nd ed.). McGraw-Hill.

Poell, T., Nieborg, D. B., & Duffy, B. E. (2022). Platforms and cultural production. Polity Press.

Rashmi, C. P., & Jain, L. (2024). Visual aesthetics and cinematic techniques in Indian mythological films: An in-depth exploration. International Journal of Media and Information Literacy, 9(2), 413–423.

Rashmi, C. P., Jain, M. L., Saroj, N., Bhavsar, R., & KP, Z. (2025). The impact of augmented reality (AR) on television advertising: A consumer perspective. Advances in Consumer Research, 2, 4073–4085.

Ribeiro, M. H., Ottoni, R., West, R., Almeida, V. A. F., & Meira, W. (2023). Auditing radicalization pathways on YouTube. Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency, 131–142. https://doi.org/10.1145/3593013.3593988

Roth, V., & Pickard, V. (2022). Democracy without journalism? Confronting the misinformation society (Updated ed.). Oxford University Press.

Shin, D., & Biocca, F. (2021). Explicability, causability, and algorithmic transparency: The mediating role of perceived fairness in shaping user trust. Information, Communication & Society, 24(14), 2074–2094. https://doi.org/10.1080/1369118X.2021.1986070

Suzor, N. P. (2021). Lawless: The secret rules that govern our digital lives (Paperback ed.). Cambridge University Press.

Suzor, N. P., West, S. M., & Quodling, A. (2022). What do we mean when we talk about transparency? Toward meaningful transparency in commercial content moderation. Social Media + Society, 8(3), 1–13. https://doi.org/10.1177/20563051221123086

Van Dijck, J., Poell, T., & de Waal, M. (2018). The platform society: Public values in a connective world. Oxford University Press.

Zarouali, B., Brosius, A., Helberger, N., & de Vreese, C. H. (2022). Using a 'populist news diet' to explain exposure to and the effects of populist attitudes. New Media & Society, 24(3), 599–618. https://doi.org/10.1177/1461444820946455

Published

2026-04-28

How to Cite

Sneh, A., Upadhyaya, B., Mishra, P., Kumar, S., & Jain, M. (2026). PLATFORM GOVERNANCE AND ALGORITHMIC ACCOUNTABILITY IN SHAPING PUBLIC DISCOURSE. ShodhKosh: Journal of Visual and Performing Arts, 7(7s), 40–52. https://doi.org/10.29121/shodhkosh.v7.i7s.2026.7652