Marcos signs law strengthening childcare for kids ages 0-5

The Star, 14-05-2025
President Ferdinand Marcos Jr. has signed into law Republic Act (RA) No. 12199, which prioritizes early education, proper nutrition, and caring support for Filipino children ages zero to five. - Photo: PPA POOL
MANILA: President Ferdinand Marcos Jr. has signed into law Republic Act (RA) No. 12199, which prioritises early education, proper nutrition, and caring support for children ages zero to five.
Signed on May 8, RA No. 12199, or the 'Early Childhood Care and Development System Act,' was passed to 'implement the State's policy to safeguard and promote every child's right to holistic well-being, growth, and dedicated care,' according to the Presidential Communications Office (PCO).
The new law repealed RA No. 10410, otherwise known as the 'Early Years Act (EYA) of 2013.'
Under RA No. 12199, the ECCD Council is responsible for children below age five, while the Department of Education (DepEd) oversees those aged five to eight, in line with the Enhanced Basic Education Act of 2013.
The ECCD system will then be institutionalised through multi-sectoral and interagency collaboration among national and local government agencies and other stakeholders.
'The law aims to reduce child mortality, support all areas of child development, prepare young children for formal schooling, and establish early intervention systems for those with special needs,' the PCO said in a statement.
RA No. 12199 lists the following as the ECCD Council's objectives:
- Reduce infant and child mortality rates, and subsequently eliminate preventable deaths, by ensuring that adequate health and nutrition programmes are accessible to young children and their parents and parent-substitutes, from the prenatal period throughout the early childhood years;
- Enhance the physical-motor, socio-emotional, cognitive, language, psychological, and spiritual development of infants and young children;
- Facilitate a seamless transition to, and ensure that young children are adequately prepared for, the formal learning system that begins at kindergarten;
- Establish an efficient system for early identification, prevention, referral, and intervention for the wide range of children with special needs below five years of age, using the Child Find System under Republic Act No. 11650;
- Reinforce the role of parents and parent-substitutes as the primary caregivers and educators of their children, especially those below five years of age;
- Improve the quality standards of public and private ECCD programmes through, among others, recognition and accreditation; and upgrade and update the capabilities of service providers and their supervisors through their continuing education, reskilling, and upskilling;
- Ensure that special support is provided in the delivery of the ECCD programmes and services for the poor, disadvantaged, and minority communities, and that children with disabilities are accommodated through the most appropriate languages and means of communication, and in environments that maximize academic and social development; and
- Employ teachers, including teachers with disabilities, who are qualified to manage young children with developmental delays and disabilities, and train professionals and staff who work at all levels of education.
Meanwhile, local government units are mandated to play a key role in implementing ECCD programmes through their respective ECCD offices. - Philippine Daily Inquirer/ANN

Related Articles

South Korea mandates breaks for outdoor workers in deadly heat

The Star, 3 days ago

A woman uses a piece of paper to shade herself from the sun as she waits to cross a road in Seoul on July 10, 2025, as a heat wave warning has been issued by the South Korean government. More than a thousand people have been affected by heat-related illnesses in South Korea, officials said on July 9, as the country recorded its highest early July temperature since records began. - AFP

SEOUL: South Korea will require outdoor workers to receive at least 20 minutes of rest every two hours when apparent temperatures exceed 33 deg C from as early as next week, the Ministry of Employment and Labour said on July 11.

The new rule – part of a revision to the occupational safety and health standards – was passed during a review by the Regulatory Reform Committee on July 11. It had been initially rejected in April and May over concerns that it would overburden small and medium-sized enterprises.

The revision was made following mounting criticism from labour groups and a surge in heat-related deaths among outdoor workers during the relentless and intense heat. In recent days, more than 1,000 cases of heat-related illness have been reported – over twice the number recorded during the same period in 2024 – as record-high temperatures grip the country.

According to the Korea Disease Control and Prevention Agency's heat-related illness emergency room surveillance system, 1,357 patients had visited emergency rooms by July 10 due to heat-related illnesses, with nine deaths reported. The majority of cases, comprising 28.7 per cent of the total, occurred at outdoor workplaces such as construction sites.

On July 7, a Vietnamese day labourer in his 20s was found dead at an apartment complex construction site in Gumi, North Gyeongsang province. The authorities suspect the cause of his death to be a heat-related illness, as his body temperature was more than 40 deg C when found. Gumi also saw daytime temperatures reach as high as 38.3 deg C on the same day.

On July 3, a Filipino seasonal worker in his 30s was found unconscious at a field in Yeongju, North Gyeongsang province. The worker was immediately taken to hospital, with medical authorities suspecting he collapsed due to heat-related illness.

As South Korea continues to break summer heat records year after year, the government has in the past emphasised three basic principles for responding to heatwaves – water, shade and rest. The Labour Ministry has also recommended that all work outdoors be suspended during heatwaves – though such recommendations were not legally binding.

But according to workers, such guidelines were not properly implemented. According to a study conducted by the South Korean Confederation of Trade Unions in 2024, around 15 per cent of outdoor workers reported not receiving water on site and only 20 per cent of respondents indicated that they had been able to stop work during heatwaves.

'The problem behind Korea's policies on working conditions for outdoor workers in summer is that they're mere recommendations and are not legally mandated,' sociology professor Lee Byoung-hun from Chung-Ang University told The Korea Herald. 'Korea needs a legislated work stoppage system based on the Wet Bulb Globe Temperature (WBGT) index, similar to California, to effectively respond to heatwaves.'

The WBGT index mentioned by Prof Lee is a measure of heat stress in direct sunlight that considers temperature, humidity, wind speed and solar radiation, and is used to assess the risk of heat-related illnesses during outdoor activities.

'Although Korea is seeing record-breaking summer temperatures every year, its protective measures for those working outdoors in the heat lag significantly behind other countries,' Prof Lee added, mentioning Greece as an example.

Greece recently saw temperatures reaching as high as 40 deg C, prompting the Greek government to order a temporary suspension of outdoor labour and delivery services in parts of the country, according to the Associated Press on July 7.

'Mandatory rest periods, wearing cooling vests as well as the installation of cooling equipment should be mandated by the government to make sure such working guidelines are properly implemented,' Prof Lee added. - The Korea Herald/ANN

'Bosses see us as machines': Content moderators for Big Tech unite to tackle mental trauma

The Star, 04-07-2025

BRUSSELS: Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.

The people tasked with removing harmful content from tech giants like Meta Platforms or TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts.

"Before I would sleep seven hours," said one Filipino content moderator who asked to remain anonymous to avoid problems with their employer. "Now I only sleep around four hours."

Workers are gagged by non-disclosure agreements with the tech platforms or companies that do the outsourced work, meaning they cannot discuss exact details of the content they are seeing. But videos of people being burned alive by terrorists, babies dying in Gaza and gruesome pictures from the Air India crash in June were given as examples by moderators who spoke to the Thomson Reuters Foundation.

Social media companies, which often outsource content moderation to third parties, are facing increasing pressure to address the emotional toll of moderation. Meta, which owns Facebook, WhatsApp and Instagram, has already been hit with workers' rights lawsuits in Kenya and Ghana, and in 2020 the firm paid a US$52mil (RM219.54mil) settlement to American content moderators suffering long-term mental health issues.

The Global Trade Union Alliance of Content Moderators was launched in Nairobi in April to establish worker protections for what they dub "a 21st century hazardous job", similar to the work of emergency responders. Their first demand is for tech companies to adopt mental health protocols, such as exposure limits and trauma training, in their supply chains.

"They say we're the ones protecting the Internet, keeping kids safe online," the Filipino worker said. "But we are not protected enough."

Scrolling trauma

Globally, tens of thousands of content moderators spend up to 10 hours a day scrolling through social media posts to remove harmful content – and the mental toll is well-documented.

"I've had bad dreams because of the graphic content, and I'm smoking more, losing focus," said Berfin Sirin Tunc, a content moderator for TikTok in Turkey employed via Canadian-based tech company Telus, which also does work for Meta. In a video call with the Thomson Reuters Foundation, she said the first time she saw graphic content as part of her job she had to leave the room and go home.

While some employers do provide psychological support, some workers say it is just for show – with advice to count numbers or do breathing exercises. Therapy is limited to either group sessions or a recommendation to switch off for a certain number of "wellness break" minutes. But taking them is another thing.

"If you don't go back to the computer, your team leader will ask where are you and (say) that the queue of videos is growing," said Tunc. "Bosses see us just as machines."

In emailed statements to the Thomson Reuters Foundation, Telus and Meta said the well-being of their employees is a top priority and that employees should have access to 24/7 healthcare support.

Rising pressure

Moderators have seen an uptick in violent videos. A report by Meta for the first quarter of 2025 showed a rise in the sharing of violent content on Facebook, after the company changed its content moderation policies in a commitment to "free expression". However, Telus said in its emailed response that internal estimates show that distressing material represents less than 5% of the total content reviewed.

Adding to the pressure on moderators is a fear of losing jobs as companies shift towards artificial intelligence-powered moderation. Meta, which invested billions and hired thousands of content moderators globally over the years to police extreme content, scrapped its US fact-checking programme in January, following the election of Donald Trump.

In April, 2,000 Barcelona-based workers were sent home after Meta severed a contract with Telus. A Meta spokesperson said the company has moved the services that were being performed from Barcelona to other locations.

"I'm waiting for Telus to fire me," said Tunc, "because they fired my friends from our union."

Fifteen workers in Turkey are suing the company after being dismissed, they say, after organising a union and attending protests this year. A spokesperson for Telus said in an emailed response that the company "respects the rights of workers to organise". Telus said a May report by Turkey's Ministry of Labour found contract terminations were based on performance and it could not be concluded that the terminations were union-related. The Labour Ministry did not immediately respond to a request for comment.

Protection protocols

Moderators in low-income countries say that the low wages, productivity pressure and inadequate mental health support can be remedied if companies sign up to the Global Alliance's eight protocols. These include limiting exposure time, setting realistic quotas and providing 24/7 counselling, as well as living wages, mental health training and the right to join a union.

Telus said in its statement that it was already in compliance with the demands, and Meta said it conducts audits to check that companies are providing required on-site support.

New European Union rules – such as the Digital Services Act, the AI Act and supply chain regulations which demand tech companies address risks to workers – should give stronger legal grounds to protect content moderators' rights, according to labour experts.

"Bad things are happening in the world. Someone has to do this job and protect social media," said Tunc.

"With better conditions, we can do this better. If you feel like a human, you can work like a human." – Thomson Reuters Foundation
