PLDT and its wireless arm, Smart Communications, said they have blocked over one billion user attempts to access online child sexual abuse material (CSAM) since last November alone.
The telco giant accomplished this after expanding the capabilities of its cybersecurity solutions and child protection platform.
Moreover, the duo has also blocked more than 224,000 CSAM URLs as of March.
“In support of our continuing commitment to child protection and to the global fight against online sexual abuse and exploitation of children (OSAEC), these milestone figures manifest our expanded capacity to both block illicit URLs and prevent end-user attempts to access CSAM,” said Angel Redoble, PLDT chief information security advisor.
Both PLDT and Smart have invested a hefty Php2 billion to strengthen their cybersecurity operations center and to build a dedicated child protection platform that allows both telcos to predict, detect, respond to, and prevent CSAM access, all while maintaining full compliance with privacy laws and consumer data protection.
Redoble said that their dedicated solutions leverage cybersecurity technologies and dynamic intelligence data collected through PLDT and Smart’s partnerships with entities like the Canadian Centre for Child Protection and the Internet Watch Foundation.
“Seeing such significant results, we are all the more encouraged to constantly keep abreast with global best practices and forge more alliances to further enhance our infrastructure for customer safety and solutions for child protection,” Redoble added.
PLDT and Smart also implement a child safeguarding policy grounded in child rights and safety considerations across group-wide programs, practices, and technology solutions. These are some of the ways both companies fulfill their social, environmental, and governance commitments.
To continuously bolster their cyber defenses, both companies plan to invest Php1 billion every year.
I’m too lazy to look it up, but in the case of the IWF there are legitimate criticisms of how they run things, like false blocks and blocks of sites they disagree with for reasons other than illegal material. You may remember Imgur being ‘down’ for months last year despite the site itself not being a haven for child abuse material. They have also blocked the Internet Archive and parts of Wikipedia before. Don’t get me wrong, I support the effort, but the implementation must be done carefully. This comes just months after Facebook policy was shown to err on the side of considering material legal if not totally sure. From anecdotal evidence I’ve seen, it does not take a lot of effort to find horrific stuff on Facebook, and I think TikTok is also a source of problematic material. Both sites are not held accountable for such material, and are not blocked. So, size lets you get away with breaking the rules. I hope the investment can provide effective blocking, and possibly other ways to identify illegal material on the internet and to report those with a history of attempted access to law enforcement. Stop supply and stop demand.