TikTok has announced new safety features which will allow parents to control the content their children see on the platform.

The video-sharing social media app said its new family safety mode will link a parent's account to their child's.

The feature will enable parents to control how much screen time is allowed on TikTok, who can send direct messages to the account, and which types of content appear in their child's feed.

The announcement comes as debate continues around the impact of the internet and social media use on people, particularly younger users.

TikTok has become increasingly popular among young people over the last 18 months and is based around users sharing short videos.

It was one of the most downloaded apps of 2019.

The service first introduced screen time management tools last year, but said that with this update, prompts about screen management will now appear in users' feeds.

The platform's head of trust and safety in Europe, Cormac Keenan, said the app had worked with some of its most popular creators to introduce the prompts.

He said they would "remind our community to be aware of the time they spend on TikTok and to encourage them to consider taking some time out".

Writing in a blog post announcing the new safety features, Mr Keenan said: "When people use TikTok, we know they expect an experience that is fun, authentic, and safe.

"As part of our ongoing commitment to providing users with features and resources to have the best experience on TikTok, we are announcing family safety mode, a new feature to help parents and guardians keep their teens safe on TikTok.

"We will keep introducing ways to keep our community safe so they can stay focused on what matters to them - creating, sharing, and enjoying the creativity of TikTok's community."

On Monday, fellow social media giant Facebook published a set of proposed guidelines for regulators on suggested "new rules for the internet".

The proposals included encouraging more independent oversight on content moderation, as well as provisions to protect freedom of speech.

Former deputy prime minister Nick Clegg, now Facebook's head of global affairs and communications, said the company "wants to work with policymakers" on rules "that keeps the internet safe".