TikTok app prevents children from sharing and uploading videos
Under a settlement reached between the management of the popular TikTok application and the U.S. Federal Trade Commission, users under 13 years of age can no longer participate fully in the app: they cannot upload videos, post comments, create a profile, or send private messages.
This restriction stems from the federal Children's Online Privacy Protection Act, which binds all technology companies and sets a minimum age of 13 for participation in many online services.
The new restriction applies to both current and new users, based on the date of birth entered in the app, though it is certainly not a definitive solution to the problem the application faces.
TikTok paid $5.7 million to settle the case brought against it for violating the Children's Online Privacy Protection Act (COPPA). This law requires sites and applications to obtain parental consent before children use their services, and to keep that use under parental supervision.
The application will also delete all videos previously posted by users who registered while under the age of 13.
Users under 13 will only be able to view what others publish, and only through filtered content.
Some users complained that their clips were deleted even though they were over 13; in response, the site's administration asked them to submit a copy of their identity cards to confirm their age.