Facebook Inc. this year has made a flurry of new rules designed to improve the discourse on its platforms. When users report content that breaks those rules, a test by The Wall Street Journal found, the company often fails to enforce them.
Facebook lets all users flag content for review if they believe it doesn't belong on the platform. When the Journal reported more than 150 pieces of content that Facebook later confirmed violated its rules, the company's review system allowed the material, some of it depicting or praising grisly violence, to stand more than three-quarters of the time.
Facebook's errors blocking content in the Journal's test don't reflect the overall accuracy of its content-moderation system, said Sarah Pollack, a company spokeswoman. To moderate the more than 100 billion pieces of content posted daily to the Facebook platform, the company both reviews user reports and proactively screens content using automated tools, Ms. Pollack said.