The Israel-Hamas War Is Drowning X in Disinformation: People who have turned to X for breaking news about the Israel-Hamas conflict are being hit with old videos, fake photos, and video game footage at a level researchers have never seen.
So censoring disinformation is bad because no one can know with 100% certainty whether it's actually disinformation? Aren't there a lot of caveats here?
no one can know with 100% certainty that it’s actually disinformation
[Citation needed]
If videos or pictures are being posted that weren't shot in Israel/Gaza, or have already been posted in the past, while posters imply that they are from the current conflict, then we can say with 100% certainty that this is disinformation.
I did suggest there are caveats to this premise. Your example doesn't let a consumer of information prove that something is disinformation in every case, and even the methods used to prove something is disinformation can themselves be flawed. I'm not arguing that disinformation doesn't exist, because it clearly does. I'm saying that you can very rarely prove with 100% certainty that something is misinformation. Then you end up with the question of who gets to deem what is and isn't disinformation. Those people and institutions are susceptible to corruption, and things could potentially end up worse than they already are, but that's debatable, hence these discussions.
In this case, if a video was first posted online in 2015, and is being shared as part of this conflict now, but with a Hamas logo added to the corner, we can use a search engine to say with absolute certainty that the video is misinformation.
Yes, this seems like a solid case that no one anywhere could argue with. My only point is: when the line starts to get fuzzy, how do you decide? And is it worth having a totally hands-off approach, at the cost of things like this, so that when things get emotional and susceptible to bias there's always the full range of information at hand to find the truth? I'm not really for or against it, btw, and I lean towards being at least proactive about this obvious shit.
I would prefer not to even get close to the fuzzy line. I think a little bit of misinformation is something society can handle, but we are experiencing a flood of very easily disproven misinfo.
I think Twitter's Added Context feature was a pretty elegant solution, and it fit the case of "this is wrong, even harmfully so, but I can't prove it's on purpose, so maybe deleting it isn't the right move."
These are serious issues that Lemmy is perfectly positioned to approach solving. Our community is eager to craft the perfect takes. We are so incredibly informed and debate-ready. And the instance / community / open-source nature of the platform means we can literally build our own solutions to the ongoing problems of the net.
I hope you are right. It's a problem and it needs solutions. It appears equally important to get it right, and the only way I personally see that happening is by not having a knee-jerk reaction to the problem, becoming properly informed, and respectfully understanding where the other side is coming from before moving forward too brashly. It's important to separate this side from trolls who are acting in bad faith. I would put Elon at half believing in it and half acting in bad faith, because he is obviously half child. That's not always a bad thing; it gives him the ability to be creative and adventurous, but we all know sometimes children just want to smash the building down for reasons they don't even understand.
The strongest point against censorship is that you don't know what you don't know: once the switch is flicked, we no longer see what is being censored, so once it's out of sight the mechanisms that were put in place have to be airtight against corruption and unintentional bias.
Insisting that nothing can be falsified unless there is 100% certainty all the time is ridiculous. Who would even define what 100% certainty is supposed to be?
Just because you come up with some unattainable standard that you claim needs to be met does not mean that anybody else has to agree with that standard.
You are free to believe in such nonsense, but otherwise it's just sophistry.
I'm sorry, I'm having some trouble following your thread of conversation. In my very first comment I simply stated what the argument for absolute free speech is, and then asked the open-ended question of whether there aren't some problems/warnings (caveats) with this premise. You replied "[citation needed]", which kind of means the same thing.
In my next comment I wrote about some of the problems with combatting disinformation, namely who gets to decide what is and isn't disinformation (who gets to decide what is 100% certain, or as close to truth as can be). I then mentioned that truth can often be subjective, so it's not always attainable.
In your last comment you implied that I was insisting nothing can be falsified unless there is 100% certainty. I did not say this anywhere; to be honest, I'm not even sure what that means.
I'm not sure you have read my comments correctly. I do not want disinformation spreading, and it's obvious it's extremely harmful to society. People like Elon who claim they are free speech absolutists are saying it in very bad faith. Elon wants to criticise the government while tightly controlling information so it aligns with his world view. This is a perfect example of what free speech in a capitalist society looks like, and it's riddled with disinformation and bias. My point is: if you shift this control from a private citizen to the government, will you always get a better result? I know in Elon's case it seems obvious that you would, but as we saw with Trump's purging of scientists from the government because they didn't agree with his views on Covid, I'm not sure.
TLDR: I'm not for or against anything. I hate disinformation, and I'm simply stating there are caveats both to free speech and to controlling disinformation.
Yes, you attempt to argue about hypotheticals. In practice fighting disinformation is about fact checks by various parties, including - but not limited to - OSINT researchers, scientists, independent journalists, news orgs, NGOs and government agencies.
Can some of these be corrupt? Absolutely!
Does it matter? Only if the piece of news in question cannot be corroborated by other parties.
The basis of news has to be facts that can be checked in order to be confirmed or debunked by other parties. There is no way to determine with 100% certainty whether a piece of information is correct, because there is no objective observer who could make that kind of judgement. All we have - and have ever had - is the ability to check and compare information against the underlying facts.
It is a messy process, but that does not mean that we cannot reach a consensus on which pieces of information are not supported by fact and that this is misinformation. Such consensus is part of the social contract.
Ultimately the consensus will never be supported by every involved party, but that does not mean that consenting parties aren't allowed to take appropriate action against the misinformation.