Lawyer and Supreme Court advocate Karuna Nundy recently tweeted about a similar case: "Every time you reshare a 'school girl viral video,' you are digitally assaulting a child. Stop. Report. Delete."
However, a second, more disturbing thread involves a different clip, one that cybersecurity experts describe as a product of "morphing." This video allegedly shows a minor in uniform in a vulnerable state, though fact-checking organizations such as Alt News and Boom Live have flagged most versions of the clip as either old (dating back to 2022) or digitally manipulated using deepfake overlays.
Yet the platforms struggle. Instagram Reels and WhatsApp forwards are not moderated by human eyes; they are propagated by algorithms that reward shares. The third and most depressing act is memeification. By day two, the gravity of the situation dissolves: the "Delhi school girl" becomes a template for unrelated jokes about school life, exams, or even political satire. The specific suffering of the individuals in the video is erased, replaced by a hollow keyword used for engagement farming.

The Ripple Effects: Real-World Consequences

While the internet moves on in 48 hours, the children involved do not.
Under the Protection of Children from Sexual Offences (POCSO) Act, 2012, and the Juvenile Justice (Care and Protection of Children) Act, 2015, sharing any video that identifies a minor victim (or even a minor perpetrator in a gendered context) is a non-bailable offense.
As one Reddit user poignantly wrote in a now-locked thread: "We are all asking for the video link in DMs while pretending to be outraged on timelines. We are the virus."