Children are rarely safe on the internet, but online platforms have recently been attempting to improve minors' security. Instagram moved to make all underage users' accounts private, and TikTok removed comments from underage users. As one of the most important online platforms in existence, it's time for Google to take a stand.
Google search results can be removed
In a new announcement, Google revealed that people under the age of 18 will be able to request the removal of images of themselves from Google search results. After decades of unfettered image searches, the company is finally tackling the privacy of children online.
Requesting the removal of an image won't entirely remove it from the web. However, the new child-protection move will result in the Google search engine hiding the image in question. This change will roll out worldwide, as the company wants to offer “consistent product experiences and user control”.
Ads are changing for minors
On top of the ability to remove certain images from Google search, the company is also modifying ad tracking. The company previously found itself in hot water over ad tracking of minors on YouTube. However, no drastic changes were made to its flagship search engine at the time.
The company is no longer serving ads to kids based on age, gender, or interests. Furthermore, changes to ad tracking mean that children will no longer receive adverts for things grossly outside their age range. Instead, safeguards will stop “age-sensitive ad categories” from displaying to minors.
Google SafeSearch and YouTube are changing
Google SafeSearch has always been an important part of web surfing for those who are underage. After a couple of decades, it's finally being refined for the everyday audience. For those with Google accounts, SafeSearch will be on by default for anyone under the age of 18. Previously, SafeSearch was only enabled by default for users under 13 years old.
On the YouTube front, under-18s will now automatically receive break reminders to discourage binge viewing. Additionally, autoplay is now off by default for these users.