Facebook and Google have both responded to a firestorm over the proliferation of fake news during the US Presidential race by promising to crack down on advertising against such content.
In a statement to Reuters, Google said it will restrict advertising on pages "that misrepresent, misstate or conceal information about the publisher, the publisher's content, or the primary purpose of the web property" - a change in policy.
It comes after the Washington Post found that the top result for the search "final election result" was a link to a fake news site claiming that Donald Trump had won both the electoral vote and the popular vote - he did not.
However, Google's latest move would only cut off revenue from ads served on such sites through Google AdSense.
Meanwhile, Facebook has faced intense scrutiny over accusations that it has not done enough to stop fake news from circulating on the platform. Since Trump's win, Mark Zuckerberg has more than once been forced to deny that fake news is a problem for the company or that it affected the election result. Most recently he insisted that "99 per cent of what you see on Facebook is authentic".
However, Facebook has now clarified its stance on advertising against misleading content, spelling out that it includes fake news.
“We have updated the policy to explicitly clarify that this applies to fake news. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance," Facebook said in a statement to the New York Times.
While the social network has publicly denied the claims, insiders at the company are apparently more concerned.
According to private chat records of senior staff seen by the New York Times, executives have questioned what role Facebook played during the months of bitter campaigning.
And now, Buzzfeed reports that a group of employees is forming an "unofficial task force" to raise concerns that run contrary to the company's official stance.