Instagram is expanding its content moderation appeals process and adding a new alert that will warn users when their account is at risk of being deleted for repeated rule violations.

The Facebook-owned app, which has faced criticism for not doing enough to stem the tide of bullying and misinformation, will display a history of the posts, comments and stories that Instagram has removed from a user's account, along with the reason each was removed.

"We will now also remove accounts with a certain number of violations within a window of time," the company wrote in a blog post on Thursday. "Similar to how policies are enforced on Facebook, this change will allow us to enforce our policies more consistently and hold people accountable for what they post on Instagram."

An appeals process will also be offered for content that was deleted for violating the tech giant's policies on nudity and pornography, bullying and harassment, hate speech and terrorist content, according to the blog post.

"If content is removed in error, we will restore the post and remove the violation from the account's record," the Mark Zuckerberg-led social network notes.

The company this week expanded a test that hides like counts to half a dozen countries, a move it says is intended to make the platform healthier and reduce pressure on users.