YouTube alters algorithm to protect teenagers from harmful fitness content – Times of India
YouTube will no longer recommend to teenagers aged 13 to 17 certain kinds of videos that promote specific fitness levels, physical features or body weights, the Guardian reported.
This change comes after many experts warned that such content could be harmful if viewed repeatedly.
Although teenagers in this age group can still watch these videos, YouTube's algorithms will stop pushing related content to them.
Explaining the company's decision, YouTube's global head of health, Dr Garth Graham, said, "As a teen is developing thoughts about who they are and their own standards for themselves, repeated consumption of content featuring idealised standards that starts to shape an unrealistic internal standard could lead some to form negative beliefs about themselves."
These new guidelines will target videos that idealise some physical features, promote certain fitness routines or encourage social aggression. YouTube’s advisory committee stressed that while a single video might be harmless, repeated viewing could be problematic.
Now implemented worldwide, including in the UK and the US, this decision aligns with the UK’s recent Online Safety Act. The act mandates tech companies to protect children from harmful content and consider how their algorithms could expose minors to damaging material.
Dr Allison Briscoe-Smith, a clinician and YouTube adviser, noted, “A higher frequency of content that idealises unhealthy standards or behaviours can emphasise potentially problematic messages – and those messages can impact how some teens see themselves.”