Commentary
This article addresses a timely and increasingly important issue at the intersection of technology, psychology, and society: the psychological implications of personalised video recommendation systems, particularly their potential role in triggering or exacerbating addictive behaviours. By situating personalised recommendation technologies within a behavioural and psychological framework, the study contributes to ongoing debates about the unintended consequences of algorithm-driven digital environments.
A key strength of the manuscript lies in its integrative perspective. Rather than focusing solely on technical aspects of recommender systems, the author foregrounds user vulnerability, behavioural cues, and dependency mechanisms. The discussion appropriately highlights how personalised algorithms, designed to maximise engagement and time-on-platform, may inadvertently reinforce maladaptive behavioural patterns, especially among individuals with pre-existing vulnerabilities such as internet addiction, pornography addiction, or obesity. This framing is particularly relevant for social and political psychology, as it raises ethical questions about responsibility, autonomy, and power asymmetries between users and platform providers.
The use of a narrative review methodology is suitable given the exploratory nature of the topic and the limited empirical literature directly linking personalised recommendation systems to addiction. The manuscript successfully synthesises findings from computer science, behavioural science, and mental health research, thereby offering a multidisciplinary overview. However, this approach also has limitations: the absence of systematic selection criteria and quantitative synthesis restricts the ability to draw firm conclusions about causality or effect magnitude. Future research would benefit from systematic reviews, experimental designs, or longitudinal studies that directly assess how exposure to personalised video recommendations influences addictive behaviours over time.
Another important contribution of the article is its emphasis on user awareness and agency. The recommendation that platforms provide transparency and options to disable personalised suggestions aligns with broader policy discussions on digital rights, algorithmic accountability, and informed consent. From a social–political psychology perspective, this suggestion underscores the need to balance technological innovation with public health considerations and individual well-being.
In conclusion, this manuscript serves as a valuable conceptual and critical contribution to the literature on personalised digital environments and addiction. While empirical evidence remains limited, the article effectively highlights plausible psychological mechanisms and social risks associated with personalised video recommendation systems. It provides a solid foundation for future empirical research and policy-oriented discussions aimed at mitigating the potentially harmful effects of algorithm-driven content delivery.
doi: https://doi.org/10.61727/sssppj/2.2023.44
