So there is nothing big tech can do to earn your trust? Ok then. But I wonder why you made that huge list of changes if you wouldn’t trust them anyway.
Except cigarettes kill individuals. Widespread disinformation on social media that appeals to fear and anger gets amplified, leading to a breakdown in society and democracy, affecting far more than individual addictions. Societal cancer.
Societal cancer it may be. But it still runs into free speech issues. You can’t have Congress legislate that certain information is no longer allowed to be spread.
I understand the extreme difficulty in crafting effective laws to limit social media. But I point to the downside of doing nothing.
Myanmar is Exhibit A. Disinformation spread via Facebook stoked fear and anger toward Rohingya Muslims and then accelerated genocide against that population.
Government interference in what a company like Facebook can allow to be published can backfire, depending on who is in charge.
However, if they edit for content, they are not like the telephone company. They need to be subject to liability laws.
And they need to be examined on the basis of monopolistic practices.
This is really tricky, because then moderation would have to be extremely tight in order to avoid lawsuits. I read this on the subject and thought it was interesting…
Reforming Section 230 would be highly controversial. Even some policy organizations like the Electronic Frontier Foundation and Fight for the Future, which heavily scrutinize tech companies, have argued that stripping this law away could entrench reigning tech giants because it would make it harder for smaller social media platforms with fewer content moderation resources to operate without facing costly lawsuits.
Haugen seemed to understand some of these nuances in her discussion of 230. She proposed that regulators modify Section 230 to make companies legally liable for their algorithms promoting harmful content, rather than for specific users’ posts.
“I encourage reforming Section 230 decisions about algorithms. Modifying 230 around content — it gets very complicated because user-generated content is something companies have less control over,” said Haugen. “They have 100 percent control over algorithms.”
Many are claiming it. I didn’t say it is because I am not familiar enough with their practices or antitrust laws. However, since many are claiming it is, I believe it should be looked into further.
I’m pretty fascinated by the different takes from conservatives. But yeah, generally I don’t think any legislation will pass, because the main concern for Dems is algorithms that promote hate and divisiveness, while the main issue for conservatives is allowing more freedom of expression, which essentially allows more misinformation to be disseminated.
Perhaps there is room on both sides for increasing privacy rights.
But the point is that companies like Facebook are moderating, or editing, what is allowed to be published, and are doing it on a subjective ideological basis. That is why they are responsible for the content.
Another company that did not moderate might not be held to the same standard.