When discussing online safety for adolescents, it is tempting to focus on technological solutions first. Artificial intelligence promises early detection of risks, gamified applications aim to raise awareness, and digital platforms claim to empower young users. However, at the beginning of the SafER-Web project, we deliberately chose a different path: before building tools, we focused on understanding needs. The project aims to strengthen digital readiness among adolescents aged 12 to 18 by addressing risks such as cyberbullying, grooming, sexting, and phishing through innovative AI-supported solutions. Yet innovation without reflection can easily become detached from the realities of schools and young people’s everyday digital lives.

At Carl von Ossietzky Universität Oldenburg, our work within SafER-Web centers on ensuring that technological development is grounded in educational, ethical, and methodological rigor. This is why Work Package 2, the Ethical, Policy and Methodological Framework, plays such a central role in the overall project design. Rather than treating ethics as an afterthought, we position it as the foundation upon which all subsequent technical components are built.
Working with adolescents in sensitive areas of online behavior requires particular care. Digital risks are rarely isolated technical problems. They are embedded in social relationships, peer dynamics, emotional development, and institutional contexts. An AI-based self-assessment tool, for example, must do more than identify potential risk patterns. It must avoid stigmatization, protect privacy, and support reflection rather than surveillance. It must respect the autonomy of young users while acknowledging their vulnerability. Designing such a system demands interdisciplinary collaboration and a deep awareness of both data protection requirements and psychosocial implications. Our background in computing sciences, machine learning, and natural language processing allows us to contribute technical expertise. However, equally important is our experience in interdisciplinary research and responsible AI development.
We analyze how ethical standards, European policy frameworks, and school-based realities intersect, and we examine questions of data minimization, transparency, explainability, and fairness in AI systems. We also consider how digital tools can be meaningfully integrated into existing school structures without increasing workload or creating unrealistic expectations.

A realistic approach means acknowledging constraints. Schools operate under time pressure, teachers manage diverse classrooms, and students navigate complex social environments. Any digital intervention must therefore be feasible, understandable, and adaptable to different educational contexts across Europe. SafER-Web does not aim to replace educators or automate responsibility. Instead, it seeks to provide supportive instruments that enhance awareness, critical thinking, and informed decision-making among adolescents. By investing significant effort in the ethical and methodological groundwork at the beginning of the project, we ensure that later developments, such as the AI-powered self-assessment tool and the gamified learning components, are not isolated technical products but carefully aligned educational instruments. This sequencing reflects our conviction that responsible innovation requires patience, reflection, and structured analysis before implementation.
Online safety cannot be addressed through reactive measures alone. At the same time, proactive solutions must remain grounded in reality. SafER-Web combines technological ambition with ethical responsibility and educational sensitivity.
At Carl von Ossietzky Universität Oldenburg, we are committed to ensuring that this balance remains at the heart of the project. When digital tools engage with young people’s lives, realism is not a limitation; it is the prerequisite for trust.