Google to Block Under-18 Singapore Users from Inappropriate Apps by March 2026

Published on: 04 October 2025

By the end of March 2026, the Google Play store will screen and prevent users estimated to be under 18 years old from downloading inappropriate apps, such as those providing dating services or with sexual content.

On YouTube, users under 18 will also be sent reminders to take a break, and will be restricted from repetitively viewing certain content, such as videos that idealise specific fitness levels or body weights, or that promote social aggression, said Google on Oct 3.

These restrictions come with the roll-out of age assurance measures for app store users, as required by the Infocomm Media Development Authority (IMDA).

Age assurance measures refer to methods to ascertain a user's age - either by using government-issued identity documents, or by analysing facial age or online usage data.

"This isn't just about giving parents more tools. It's about our systems automatically providing an added layer of protection to ensure that every young person has age-appropriate experiences," said Google Singapore managing director Ben King at a Safer with Google event held in its Pasir Panjang office.

Google uses machine learning technologies to estimate a user's age by analysing signals such as search terms and the content viewed online.

The age assurance measures' implementation in Singapore will come after similar roll-outs in the United States, the United Kingdom and select markets in the European Economic Area.

IMDA's requirement - spelt out in the Code of Practice for Online Safety for App Distribution Services - will apply to Apple, Google, Huawei, Samsung and Microsoft as they operate stores or online portals for downloading applications.

The new code sets guard rails at the gateway to apps, which have come under fire for exposing children to all sorts of harmful content, including sexual and violent material and content linked to self-harm or cyberbullying.

The new measures are an attempt to rein in app stores in the same way that Singapore's Code of Practice for Online Safety, which took effect in 2023, requires social media platforms to provide restricted account settings and tools for parents to manage their children's safety.

Rule flouters risk being fined up to $1 million or blocked under the Broadcasting Act, which was amended in 2023 to rein in social media platforms and app stores.

By the end of March 2026, Google will also apply age assurance restrictions on users of other products, including Gemini, Google Maps and Search.

If a user is estimated to be under 18, an automatic filter that blocks explicit and potentially offensive content, including nudity, graphic violence and gore, will be turned on by default on Google Search.

On Google Maps, the Timeline feature, which creates a personal map of routes taken, will be disabled to limit data collection of those under 18.

On Gemini, users under 18 will not be able to create images with the AI assistant, and its responses will also be fact-checked to flag conflicting or inaccurate information, with sources cited.

Age assurance complements Google's existing suite of tools to protect young users. This includes Family Link, which lets parents set screen time limits on their child's devices, block app downloads and locate the child.

Age assurance checks require users to have a Google account, so its machine learning technologies can analyse usage behaviour associated with the account.

Users estimated to be younger than 18 will be notified via e-mail that their settings have changed.

If an adult is wrongly estimated to be under 18, he or she can upload a government ID or selfie to turn off the restrictions.

It is not known if age assurance can still be carried out when users turn on incognito mode or if they use virtual private network technologies to mask their locations.

When asked, Google Singapore's head of government affairs and public policy, Ms Rachel Teo, said that the firm respects the value of anonymity that comes with the signed-out state and the ability to use internet services without being identified.

She also said that other protections are in place for all signed-out users. They include Search's "Safe Search Blur" setting that automatically obscures explicit images in search results, and YouTube's "Restricted" setting that blocks age-restricted videos.

"We encourage parents, when allowing their child to use our products, to either create a profile or account for their child's devices, so they have the benefit of the full suite of protections that are available for children," said Ms Teo.

"While there are tools to support parents, helping kids develop a healthy relationship with technology is important, and we need to ensure that parents build that foundational relationship and trust with their children."

She added that Google's age assurance technology has enabled the company to infer the age of its users in Europe with a high degree of accuracy.

Google's age assurance roll-outs in the US and UK have drawn complaints from users and experts over privacy and accuracy issues, as the measure relies on machine learning technologies to estimate a user's age based on online activities and browsing history.

At the Google event, in a panel discussion on preparing the youth for evolving online threats, the guest of honour, Minister of State for Digital Development and Information Rahayu Mahzam, said that she appreciates Google's efforts to roll out the age assurance measures.

"We are always outcomes-driven and we're agnostic as to how you actually do it. We want to make sure that the outcomes are such that the young ones are protected so that it really does go some way in making sure that we build that safe ecosystem for our young ones," said Ms Rahayu.

[SRC] https://www.tnp.sg/news/google-block-under-18-spore-users-inappropriate-apps-march-2026