KATIE TRINH WRITES – The Norwegian newspaper Aftenposten posted Nick Ut’s photo “Napalm Girl” on its Facebook page. Within hours, however, Facebook removed the photo because the image violated the company’s standards on nudity.
Pulitzer Prize recipient Nick Ut photographed the Vietnam War in hopes of helping end it by depicting just how horribly it affected people. “Napalm Girl” shows a girl, nude, fleeing her village, which had just been bombed. While there is obvious nudity in the photograph, the message delivered is anything but sexual. It is disturbing. It is heartbreaking. It is iconic.
After receiving backlash for removing the photo, Facebook reinstated the image with a statement. “An image of a naked child would normally be presumed to violate our community standards, and, in some countries, might even qualify as child pornography,” the company said on Friday, continuing, “In this case, we recognize the history and global importance of this image in documenting a particular moment in time.” The ease with which Facebook can control the media has made many wary. With numerous media companies relying on Facebook to distribute information, some are concerned that “Facebook may hold too much sway over how information is distributed.”
In August, Facebook laid off its Trending Topics team and has since relied solely on algorithmic decision-making. This means the “Napalm Girl” photo was flagged by algorithms and then reviewed by a human moderator before being removed from the site. Whether it was a computer or a human moderator, the company itself is still to blame for the censorship.
This incident goes beyond a one-time occurrence. Norwegian author Tom Egeland posted a piece on the history of warfare to Facebook along with seven photos, one of them “Napalm Girl.” Not only was the photo removed from the site, but Egeland was also barred from posting anything on Facebook for 24 hours, a suspension later extended to three days. Aftenposten’s editor-in-chief, Espen Hansen, took a stand on Egeland’s behalf and published the photo on the newspaper’s own Facebook page. Facebook sent a courtesy email asking Hansen to remove the photo but ultimately removed it themselves without any warning. Norway’s prime minister, Erna Solberg, and other cabinet ministers then posted “Napalm Girl” on their own Facebook pages. Their posts were also censored.
The Norwegian media and their supporters are outraged at Facebook’s power to control the press. People have also looked to Nick Ut and Phan Thi Kim Phuc, the girl pictured in “Napalm Girl,” for their comments on the situation. While Phan Thi Kim Phuc remained silent, Nick Ut joined the chorus, posting on Facebook his dissatisfaction with the social media giant for censoring his iconic photograph.
Did Facebook right its wrong by reinstating “Napalm Girl”? Should these decision-making algorithms be trusted to filter out “inappropriate” content? Today’s technology is advanced enough that humans are no longer needed for jobs machines can now do, and it is much easier to blame a computer than to file a report on a person when a mistake happens. The bigger question, however, is whether Facebook should have a filter at all. Whether inappropriate content pops up on my feed or not, who gave Facebook the power to step in and block that content on my behalf?