YouTube removed a jump scare ad for The Nun after users complained and the ad was found to violate the platform’s “shocking content” advertising policies. The takedown adds to YouTube’s long-running ad troubles as it struggles to keep offensive content off its site.

One would expect the advertising for a horror film to be frightening, but Warner Bros. went a step further at unsuspecting viewers’ expense. The short ad opened on a black screen, over which a fake computer icon appeared to lower the user’s volume. Then came a loud bang, and the demonic nun from the film burst onto the screen with a scream.

Many users did not appreciate the jump scare, and after a number of complaints, YouTube pulled the ad from its site. YouTube’s ad guidelines prohibit “promotions that are likely to shock or scare,” though one factor in judging shock value is whether the content appears realistic. That caveat could explain why the ad was approved in the first place; YouTube did not immediately respond to a request for comment.

In the case of The Nun, Warner Bros.’ unholy character may not be real, but the frustration users felt was very real indeed.

As part of Alphabet, Inc., the world’s largest seller of advertising, YouTube is under enormous pressure to appease advertisers, but the jump scare ad reveals a frightening truth about how easily consumer interest can be overlooked. In fact, the ad wasn’t removed until a tweet warning users about it amassed well over 145,000 likes; YouTube confirmed the removal in a reply to that tweet.

Last year, an ad for the game Mobile Strike was banned from YouTube after it was deemed sexist. The spot featured three plus-sized women in bikinis playing the game against one another. Though the ad drew praise for featuring diverse women, the UK’s Advertising Standards Authority (ASA) ruled that it objectified them through attire, performance and camera angles.

It’s impossible to keep everyone happy all the time, but YouTube seems to have a harder time than most, at least where advertisers are concerned. Last November, YouTube terminated over 270 accounts and removed 150,000 videos in the span of a week after brands like Adidas and Hewlett-Packard found their ads displayed next to sexually explicit comments under videos of children.

Prior to these incidents, much of YouTube’s moderation was automated. In December, YouTube CEO Susan Wojcicki announced plans to hire 10,000 employees to review and moderate policy-violating content on the site.

Thoughts? Continue the conversation at @alistdaily.