Protecting children in the digital world
Great changes have been observed in the way consumers, and particularly minors, use media. Minors increasingly access media through mobile devices, including online video games, which generates a growing demand for on-demand services on the internet. Social networks, a relatively new phenomenon, have gained huge importance, both for individual users and in societal terms, and many more changes are still to come. These developments offer many opportunities for minors, but they also create challenges for their protection. This report summarises what has already been done to protect minors in the digital world and sets out the further steps required to reinforce that work.
Report from the Commission to the European Parliament, the Council, the European Economic and Social Committee and the Committee of the Regions of 13 September 2011 on the application of the Council Recommendation of 24 September 1998 concerning the protection of minors and human dignity and of the Recommendation of the European Parliament and of the Council of 20 December 2006 on the protection of minors and human dignity and on the right of reply in relation to the competitiveness of the European audiovisual and online information services industry – Protecting children in the digital world [COM(2011) 556 final – Not published in the Official Journal].
This report outlines the measures put in place by Member States to protect children in online activities. It follows on from the 2006 Recommendation on protection of minors in audiovisual and information services, and the 1998 Recommendation on protection of minors and human dignity.
Illegal or harmful content
The report provides an overview of the initiatives taken by Member States aimed at combating discriminatory, illegal or harmful content online. It mainly concerns commitments and codes of conduct. For example, these provide for an appropriate label to be displayed on internet sites.
However, the level of protection assured by this type of initiative still varies from one Member State to another. Existing measures should be constantly monitored in order to ensure their effectiveness.
Furthermore, illegal or harmful content often originates in other EU Member States or in third countries. A coordinated approach, first at European and then at international level, would harmonise protection against this type of content.
The Digital Agenda for Europe provides for the establishment of hotlines by 2013, enabling the reporting of offensive or harmful online content. These hotlines should benefit from co-financing under the Safer Internet programme. Furthermore, the Association of Internet Hotlines (INHOPE) is an effective cooperation tool for Member States and third countries. Notice and take-down procedures have also been put in place so that internet service providers (ISPs) remove any illegal content reported through a hotline.
However, the Commission asks Member States to monitor their hotlines more closely. The hotlines are still not sufficiently well known or accessible to internet users, in particular children.
Internet Service Providers (ISPs)
ISPs are requested to become more active in the protection of minors. The application of codes of conduct should be more widespread and closely monitored, and ISP associations are encouraged to include the protection of minors in their mandates and to ensure that their members are committed to this end. Moreover, greater involvement of consumers and public authorities in the development of codes of conduct would help to ensure that self-regulation truly responds to the rapidly evolving digital world.
Social networking sites
Social networking sites have profoundly changed the behaviour of minors in the way that they interact and communicate with each other. These networking sites present many risks such as illegal content, age-inappropriate content, inappropriate contact and inappropriate conduct.
One way of countering these risks, detailed in the report, is the development of guidelines for providers of social networking sites. The Commission intends to increase the number of reporting points and to establish a well-functioning back-office infrastructure to be deployed on social networks.
Media literacy and awareness-raising
Member States are committed to increasing media literacy. There are several initiatives in this area, such as public-private partnerships and the EU Kids Online project. However, although the integration of media literacy in schools has demonstrated positive results, universal coverage of all children and parents, and consistency across schools and Member States, remain significant challenges.
Access restrictions to content
Limiting minors’ access to content involves establishing age ratings and the classification of content. Classification systems for audiovisual content are currently in place; some Member States consider them sufficient and effective, while others consider that they should be improved.
Technical systems such as filtering, age verification systems and parental control systems can be useful, but they cannot guarantee complete restriction of access to content by minors. Subscribers are increasingly better informed about the existence of filtering and verification systems, and age verification software. However, Member States remain divided on the use, relevance (with regard to right to information and risk of censorship), technical feasibility and reliability of technical systems. Moreover, they highlight the need for transparency as regards the inclusion of certain content in a ‘black list’ and the possibility of its removal.
While most Member States see scope for improving age rating and classification systems, there is as yet no consensus on a pan-European classification system for media content. This report encourages reflection upon innovative rating and classification systems in the sector of information and communication technologies (ICT).
Audiovisual Media Services
The Commission notes that on-demand audiovisual services are lagging behind with regard to co-regulation and self-regulation aimed at protecting minors from harmful content, and with regard to technical means for offering children selective access to content on the internet. Age classifications and transmission time restrictions should be highlighted for these types of audiovisual media services.
With the exception of Germany, all Member States use the Pan European Game Information (PEGI) system to protect minors with regard to video games. The report considers it appropriate to increase the number of awareness-raising measures for prevention purposes, particularly in schools. Moreover, progress is still needed to ensure compliance with age classifications in the sale of video games and to extend the application of systems such as PEGI to online games.