Seeing what kind of brainrot kids are watching makes me think it's a good idea. I wouldn't say all content is bad, but most kids will get hooked on trash content that is intentionally designed to grab their attention.
What would be an effective way to enforce a restriction with the fewest possible side effects? And who should be the one enforcing that restriction in your opinion?
There is no real need to regulate kids on devices ... leave that up to the parents to figure out.
What we need is to regulate every major corporately owned social media company. Regulate and control them like they do for newspapers, magazines or television. Put them under complete regulatory control across the board so that we can regain some normalcy in public perception of reality and politics everywhere.
It's a pipe dream I know ... but in the meantime, no matter what anyone says or does ... if social media companies are not regulated, everything and everyone is going to hell in a handbasket.
I can't remember which article I was reading, probably one on Lemmy, but it said that we know social media algorithms are bad for people and their mental and physical health, that they are divisive, drive extremism, and just in general are not safe for society.
Drugs are regulated to ensure they are safe, so why aren't social media algorithms regulated the same way? Politicians not understanding the technical details of algorithms is not an excuse - politicians also don't understand the technical details of drugs, so they have a process involving experts that ensures they are safe.
I think I'm on the side of that article. Social media algorithms are demonstrably unsafe in a range of ways, and it's not just for under 16s. So I think we should be regulating the algorithms, requiring companies wishing to use them to prove they are safe before they do so. You could pre-approve certain basic ones (rank by date, rank by upvotes minus downvotes with time decay like lemmy, etc). You could issue patents to them like we do with drugs. But all in all, I think I am on the side of fixing the problem rather than pretending to care in the name of saving the kids.
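To make "basic" concrete, here's a rough sketch in Python of the sort of pre-approvable ranking I mean: net votes with a simple time decay. The names and constants are made up for illustration, and this is not Lemmy's actual formula; the point is just that an approvable algorithm can be small enough to audit.

```python
# Hypothetical sketch only: net votes with exponential time decay.
# Constants and names are illustrative, not any real platform's ranking code.
from dataclasses import dataclass
from datetime import datetime, timezone


@dataclass
class Post:
    title: str
    upvotes: int
    downvotes: int
    published: datetime


def decayed_score(post: Post, now: datetime, half_life_hours: float = 12.0) -> float:
    """Net votes, halved for every half_life_hours since publication."""
    age_hours = (now - post.published).total_seconds() / 3600
    return (post.upvotes - post.downvotes) * 0.5 ** (age_hours / half_life_hours)


def rank_feed(posts: list[Post], now: datetime | None = None) -> list[Post]:
    """Return posts ordered by decayed score, highest first."""
    now = now or datetime.now(timezone.utc)
    return sorted(posts, key=lambda p: decayed_score(p, now), reverse=True)
```

Something that simple could be reviewed by a regulator in an afternoon, which is the whole point.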
I recall that some years ago Facebook was looking into their algorithm and they found that it was potentially leading to overuse, which might be what you're thinking of, but what actually happened is that they changed it so that people wouldn't be using Facebook as much. Of course people who are opposed to social media ignored the second half of the above statement.
Anyway, when you say the algorithms are demonstrably unsafe, you know you're wrong because you didn't demonstrate anything, and you didn't cite anyone demonstrating anything. You can say you think they're unsafe, but that's a matter of opinion and we all have our own opinions.
> I recall that some years ago Facebook was looking into their algorithm and they found that it was potentially leading to overuse, which might be what you’re thinking of,
No, it was recent, and it was an opinion style piece not news.
> but what actually happened is that they changed it so that people wouldn’t be using Facebook as much.
Can you back this up? Were they forced to by a court, or was this before the IPO, when Facebook was trying to gain ground and didn't answer to the share market? I can't imagine they would be allowed to take actions that reduce profits; companies are legally required to maximise value to shareholders.
> Anyway, when you say the algorithms are demonstrably unsafe, you know you’re wrong because you didn’t demonstrate anything, and you didn’t cite anyone demonstrating anything. You can say you think they’re unsafe, but that’s a matter of opinion and we all have our own opinions.
You've grabbed the part where I made a throwaway comment but missed the point of my post. Facebook is one type of social media, and they use a specific algorithm. Ibuprofen is a specific type of drug. Sometimes ibuprofen can be used in a way that is harmful, but largely it is considered safe. But the producers still had to prove it was safe.
In the future we are going to look back on seeing children use iPads that directly connect them to the most sophisticated engagement and manipulation algorithms ever as something as horrid as a child smoking a cigarette, or doing any other drug
Now obviously this is an issue, but many of the suggested solutions are lacking.
Remember: the phones in our pockets are Turing complete; any software solution can be undone by another software solution
Hardware flaws baked into chipsets will be inevitably exploited by the worst of humanity
What we need is a LEGAL framework for this issue
We need to recognize that letting a child get their hands on a full 5G, 2.5 GHz portal to the unknown without parental or educational supervision is simply harmful
I suspect it really should work like regulating a drug: allow more and more unsupervised compute and networking as the child ages
That way kids can still have dumb phones for basic safety and comms.
I suspect laws will be applied like alcohol within the home, to allow for family owned game systems and such
But lapses that lead to actual demonstrated harm, such as mental illness leading to bodily harm or violence due to radicalization, need to be treated as if a parent had just fed their child alcohol without care. Or at least enabled it, if it’s evident that they didn’t even try
Straight up it’s also a cultural shift
13-16 year olds gaming at home under parental guidance, but not being bought significant personal compute, since it would not be sold to them or for the purpose of being given to them
Usage in school, where they get education on information technology and the harm and good it can do, is all fine and good, but seeing babies with iPads at the mall should be seen as badly as letting them smoke (and the secondhand smoke from all the brainrot leads to brainrotted adults)
I really am curious if anyone could demonstrate a link between the amount of access to compute and network bandwidth as a child ages and the incidence of anxiety, social, or mood disorders.
One of the things I feel really thankful for is that the compute and network I had access to essentially grew up with me, allowing me to see the harms of full-scale manipulative social algorithms and avoid them.
I feel like my mental health has greatly benefited from staying away from such platforms.
This isn’t even just a social media thing. There are so many worse things a kid could get their eyes and ears on with the compute we just hand them willy-nilly
> In the future we are going to look back on seeing children use iPads that directly connect them to the most sophisticated engagement and manipulation algorithms ever as something as horrid as a child smoking a cigarette, or doing any other drug
Depends on the game. Some of them, absolutely. Roblox is one that comes to mind, probably Fortnite as well. And one shouldn't even start on mobile games.
I think kids using regular social media is a result of there being no more sites for kids.
When I was a terminally online unloved kid told to go away whenever I breathed in my family's presence, I wasn't browsing Facebook and Twitter. I was playing on KOL, GirlSense, Nicktropolis, and games I pirated. I was creative so I also pirated art, movie, and game making software and sunk hours into hobbies. Back on the internet, I'd also play flash games and pirate TV shows and movies.
The sites we have today are aimed at toddlers. Sites for teens are just gone now. But does it even matter? Teenagers are treated like they're 18 the minute they're 13. It's shameful to like cartoons and video games at 13. It's shameful to not have a job and live alone at 13. You're supposed to grind anti-aging care, have a job, speedrun school, have children, have a car, and be rich at that time. Teens bully other teens for being teens. Society is ruined beyond repair. Banning children from using the internet will only breed more tech-savvy minors who will still use the internet. It's way too late.
Absolutely not. Anything you put in is likely going to have privacy issues for both adults and children, and you forget how smart children are. I know we had firewalls and all kinds of shit when I was in school, and I was the person who taught everybody else how to bypass them in like five minutes. There is not a filter in the world you can put up that is going to keep children from the content that they actually want to look at.
Have you ever heard of the great firewall of China?
It's always a budget issue, not a technical issue.
We can block what we want with the right resources.
I think the better question is who has not heard of the Great Firewall of China, but it can still be bypassed. In fact, I've even spoken on a podcast with somebody from China bypassing the firewall while we were talking.
I don't think that kids should be banned from social media. Instead they should be taught how to handle it in an individually and socially healthy way. Namely:
how to spot misinformation
how to spot manipulation
how to protect yourself online
how to engage constructively with other people
etc.
This could be taught by parents, school, or even their own peers. But I think that all three should play a role.
That's something most children can't understand. What you're proposing basically adds up to an entire multi-year school course, and with the way the education system is going in many countries, I'd say good luck. It's not as easy as saying, "Oh little Charlie, that's fake info, because you should read XYZ scientific papers on climate change." Kids are fucking stupid even while going to school. People are constantly coming up with new ways to trick people, and kids are above all the easiest to trick.
We're talking about children and teens. A 6yo eating bullshit is natural; a 13yo doing it is not. Please don't be disingenuous; oversimplifying just distorts things.
> What you're proposing basically adds up to an entire multi-year school course
Full stop here. That is not even remotely close to what I said, stop lying.
Me: there should be an agency like the FDA that brands news and other media with veracity labels according to guidelines we as voters agree on to prevent fake news and misinformation
Them: YOU CAN'T BECAUSE OF FREE SPEECH DIE HEATHEN DIE
All prohibitions do is create a space where kids are doing it, but without any discussion about the risks. It's the abstinence-only education model, or the "war on drugs" model.
It doesn't work, especially when the "authorities" are doing it anyway, and they're not even quiet about it.
The biggest reason why not is that it requires the implementation of centralized tracking systems for everyone to confirm ID for accessing these services, which is a privacy nightmare and takes way too much agency away from individuals. If Reddit or something bans me for a stupid reason or because their broad brush modbots malfunctioned, I should be able to evade that ban with enough care and effort, and the government shouldn't help them make sure I can't. I should also have the ability to use social media pseudonymously without being subject to corporate tracking.
The other reason, of course, is that banning children from social media cuts them off from participating in society or having any sort of a public voice. That's fucked up too.
I don't think a ban is coming at the issue from the right angle. Social media misuse is fundamentally a problem of addiction, and we have a checkered past of causing harm when banning things. For a historical analogue, look at the Prohibition era of the United States.
Ultimately, bans for these things don't work because people will get around it anyway. And that's exactly when dangerous things happen. Using the Prohibition example again, people poisoned themselves trying to make illegal hooch because they were determined to drink anyway.
I think education is the answer. And I mean, honestly, isn't education always the answer? But you've got to educate your kids about the content they're using. We've got to educate the parents about the dangers of unlimited access to screens. If people don't understand the danger, then they don't recognize the danger, and suddenly they've stumbled into it.
I'm sure everyone has heard a story about a straight-laced kid who grew up with strict parents, and then at the first opportunity to party in college goes on a bender to destroy their life. Those kids' parents really did them a disservice by not preparing them for reality. If their only education on drugs and alcohol is "don't do them," then the child isn't really aware of the risks. They just see that everyone else is doing them and having fun, and then they go off the deep end before they realize how bad things are getting.
Social media's the same thing. The day your kid turns seventeen they'll have every chance to succumb to brainrot of their own volition. Without being informed of how or why that happens, there's nothing stopping someone from falling into any internet rabbit hole.
I agree that the behavior of these companies to hook their users using the darkest psychological patterns is disgusting. It doesn't become any less disgusting once the user turns 17 though, and no framework is in place to prevent those teens from falling prey once they gain access.
Even if we all agree that a ban is warranted, my stance is that a ban alone isn't enough. It needs to be accompanied by education and harm reduction.
And it needs to be honest, scientific, and good-faith education. We don't need another DARE program demonizing something, because misinformation can be more harmful than a failure to educate.
Education works for rationally thinking people. Kids these days are given access to digital devices and social media before they reach such a stage, and habits are formed from too young an age.
How does someone advance to "rationally thinking" without receiving education?
I don't disagree that habits picked up in childhood are more difficult to break. But I don't think it's a problem exclusively for children either.
Many of us are growing up with parents and grandparents with brainrot. And sure, maybe they would have been more susceptible as kids if the technology existed then. But we all would be better off with decent digital and media literacy.
Plus, who's giving the kids these devices anyway? It's usually the parents - who were raised not to talk to strangers - giving their kids unfettered access to all the strangers of the world
I don't think kids should be barred from social media, since at its core, social media is just people talking and sharing things with each other.
The problem is not with the medium or generally who is using it, it's with the rate of consumption, poor parenting and poor moderation.
I also think it is an even larger problem to enforce in the first place, since it will destroy one of the good things about the Internet: anonymity. The only way to truly enforce an age restriction is to require ID to verify a user's identity. I'm not as hardcore about my privacy as some parts of Lemmy are, but this is one thing I absolutely do not want to see happen.
Here, schools OBLIGATE parents to give their kids a smartphone. My teenage sister has one (she's special needs, but even if she wasn't, kids need a phone with internet to check school stuff like meetings and school fees)
Wow. These comments are really interesting. And they make me feel old.
A lot of them revolve around some version of the old trope of “everyone else is doing it” regarding access, which I refuse to believe even the younger people on here think is a legit argument. Seriously. There’s also no shortage of comments that reinforce the idea that SM is kind of an addiction, given the lengths some suggest going to for access, or the idea that access is some kind of right or necessity.
Generations past got by fine without all the tech and modern versions of SM, so we can dispose of all the drama surrounding any negative effects of limiting access. That’s strictly a social problem of haves/have nots. Being of an age to have kids myself and friends with kids I see all levels of social media access for both kids and adults. For some, it’s pretty much an addiction. They can’t keep their faces out of their phones watching tiktoks, shorts, snaps, or whatever. Their day revolves around that content.
A couple friends’ kids went down the rabbit hole and have had unfettered access to the internet for years. They all want their own channels, to be YT creators, buy shit like Stanley mugs, and do all the stuff they saw online, and all while being under 18. There have already been problems with inappropriate images and texts, adults online, and law enforcement. It’s a mess. Probably the more extreme end of things.
Others just stare at their phone all the time. Put it down for a minute, pick it right back up.
Our kids, who do have some social media access, complain that hanging out with their friends with more access to SM can often devolve into people just staring at their phones. Nobody wants to do anything else except watch “brain rot.” Their words, not mine. Seems to be a growing awareness among zoomers and later about SM and some of the problems with it. We have conversations (not us talking AT them) about all the stuff online, SM, etc. and the good/bad about it all.
Anyway, overall SM is a net negative. It seems to be a contradiction in terms: “social” media prevents actual social interaction with real people. Regulations limiting access are going to be incredibly difficult to apply effectively. Anyone who has tried to take an iPad away from a 4 year old knows that trying to do the same to kids and teens thanks to new laws is going to be a pain in the ass. I think most will attempt to work around it, which is disappointing in a lot of ways, because parents don’t want to do the hard things - and it’s gonna be really hard seeing as the cat’s been out of the bag for a long time.
No, generations past did not get by fine without social media. They just had no solution to various problems. Two classic ones are eating disorders and marginalized minorities.
Yes, it's a cancerous plague on society. It creates a false sense of identity and want. Young people's minds are constantly bombarded by people telling them how they should think, feel, and look
Australia did it somewhat right by banning minors until they are 16. However, I feel this may create a stronger desire for some to join, much like teens drinking and smoking bud because it's forbidden. At 16 you'll only create a stronger desire to join. As such, I feel it should be pushed back to 20: you're no longer a teen, and as a young adult the temptation to join and let your mind be swayed by bullshit may be less prevalent
We can't regulate half a dozen corporations by prohibiting algorithmic feeds and targeted ads, so instead we'll ban millions of people from using the apps that have these features.
I don't think the government should be banning kids from using certain parts of the internet because of perceived harm. Kids need to explore the internet for themselves in order to learn first-hand about the dangers. I mean, the same argument for banning social media can be applied to everything on the internet. Search engine results are full of scams, so are we banning search engines too? News sites are full of misleading information, so let's ban news sites? Then the only source of information is schools, which can also be biased and in some places are just regurgitating government propaganda. In red states, schools are constantly telling LGBT+ kids that they are committing "sins" and that they are "mentally ill", and those kids might have very conservative parents with no sympathy. Are we really gonna stop them from going online to seek support? If kids can't even be allowed to explore the digital world, why are we allowing them to explore the physical world, where there are physical dangers?
In an ideal scenario, kids should be allowed to freely explore the internet, but should have parents that they trust to talk to in case they face any danger or harassment, so the parents can help them deal with it.
Eventually, kids are gonna grow up, and a kid with zero online experience their entire life suddenly gaining free access to all of the internet is a recipe for disaster.
It's like not teaching kids sex ed; then when they get old enough, they'll end up having unprotected sex.
Edit: Not to mention, a social media ban is not very enforceable. Even China, an authoritarian regime, is unable to stop kids from gaming; they just steal their grandparents' IDs and play anyway. Do y'all really want a democratic country to suspend civil rights and start privacy intrusions?
It doesn't matter what you think. Kids will do what they want to do and that's that, so everything else is a question of how much time and money and posturing people want to do.
Yes. I hope the rest of the world will begin addressing the issue.
There's a wealth of information linking negative mental health to social media use (hell, read stories about QAnon), and I look at regulating social media among kids the same way we regulate cigarette smoking. Will it be perfect? No. But that doesn't mean it isn't worth doing.
IMO, even if it's "banned"/"prohibited" or whatever, kids will do whatever they have to in order to do what they want. If they want to use social media, then they'll lie, cheat, or otherwise manipulate the system into giving them access.
With all that being said, maybe regulate social media for kids so that adults don't have enough access to prey on them. Beyond that, as long as they're not posting gore, or nudes or something equally inappropriate, let them do what they want.
Parents would be so familiar with this conundrum. Eventually you need to let your kids learn their own lessons and do as they wish. Your choice is whether you want to support your kid in what they want to do, or whether you're going to try to impose rules on them, which has a nontrivial chance of alienating them, so they tell you nothing about their decisions and won't come to you for help or guidance when things get rough.
As a son who was repeatedly alienated and is now estranged, I lived on the other side of such a situation. My story is my own, and I won't assume anyone else's situation.
If you're facing this decision as a parent, please be understanding and accepting of what your child decides, then stand by in case things go sideways. If you're the child in that scenario, I'm so very sorry for what's happening, and for how these things inevitably end. Take care of yourself.
Your story seems to have far more going on than social media, and I question how it conflates something as extreme as estrangement with rules being imposed specifically regarding the subject at hand. Because there would be some serious personal issues on your end if you were abandoning family due to their limiting your social media access.
Yeah, kids can find their way around a lot of things if they really want. A parent’s job is to limit everyday harm within reason, and prevent the directly harmful ones like drinking and driving or whatever. Yeah, kids need rules too, you can’t not have them. Within reason.
As a parent there are a lot of ways you can prevent social media use, and with modern tech it’s pretty easy - along with multiple honest discussions about why social media use can have negative effects on people, and the fact that yes, they will indeed be given access to social media at some legit point that isn’t stupid like “When you are 18…”
Unfortunately there is no way to sufficiently regulate social media to prevent the access you describe. The corpos don’t want it because it would place the cost of designing, monitoring, and responsibility for such a system on them. People don’t want government interference in free speech and always bring up slippery slope arguments. And there would always be those, kids and content makers, trying to find their way around the rules. So it’s up to parents to, y’know, make rules and be parents.
I agree with the under-16 social media ban, but figuring out how it gets implemented and by whom is definitely going to be the hard part. Ideally it would be parents first, but that's been the status quo up to now and it hasn't worked. And as someone who has been "18+" online since I was 10... raising the age limit on the services themselves is only going to work to a certain extent. I'm very curious to see how this plays out.
Why only kids? We all need to be protected from social media.
I don't know how to suggest good national policy, but I think social media should have these:
controls on how far you can doom scroll
being able to opt out of algorithms by seeing things in time order and from (optionally) only whitelisted sources, and allowing block lists in a variety of ways (a rough sketch of what I mean follows this list)
heavy moderation of blatantly illegal content.
heavily curated advertising (or none at all, users can pay)
separation of political content (maybe a system of tagging so topics that are not of interest can be hidden... maybe this could be crowdsourced)
Strict control of data collection
The ability to delete/be forgotten
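To make that opt-out item concrete, here's a rough sketch in Python of what I have in mind: a strictly time-ordered feed with an optional whitelist and a blocklist, and nothing else. All the names and structure are hypothetical, not any real platform's API.

```python
# Hypothetical sketch only: a strictly chronological feed with an optional
# whitelist and a blocklist, and no engagement ranking of any kind.
from dataclasses import dataclass, field
from datetime import datetime


@dataclass
class Post:
    source: str
    title: str
    published: datetime


@dataclass
class FeedSettings:
    whitelist: set[str] | None = None          # None means "show every source"
    blocklist: set[str] = field(default_factory=set)


def build_feed(posts: list[Post], settings: FeedSettings) -> list[Post]:
    """Filter by the user's own lists, then sort newest first. Nothing else."""
    visible = [
        p for p in posts
        if p.source not in settings.blocklist
        and (settings.whitelist is None or p.source in settings.whitelist)
    ]
    return sorted(visible, key=lambda p: p.published, reverse=True)
```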
I don't know how propaganda or corporate interests can be excluded, but that would be ideal.
Controlling children's access is so insurmountably difficult, I don't even know where to start.
In real terms, I have no idea if this is a good move or a bad one. We'll know more in five years once the Aussie nerds can publish on the effects. I can't think of a compelling reason not to try it though.
Social media use is bad for everyone. Tech companies have spent billions of dollars refining and optimizing their platforms to maximize engagement and usage at the expense of all other considerations.
I've been researching the mental health effects of social media for an unrelated project I am working on. From an incomplete read of the research, social media use has a strong correlation with mental health issues. I haven't encountered anything peer reviewed that proposes a specific relationship between the two, but my personal (somewhat well informed) guess is that someone will find a link eventually. That's just where the research I've read seems to be headed.
I'd guess they probably have a symbiotic relationship. (Certain kinds of) mentally ill folks use social media more than others; why, or whether that is anything more than a red herring, is still to be determined. But I have read coverage of other research that suggests social media might be destroying attention spans (though I haven't read that research myself yet).
Getting the political system involved in this effort is probably undesirable simply because elected officials seem to have entirely abandoned any pretense of using science to inform policy and are basically puppets for the oligarchy. Voting against the interests of their donors is unlikely.
I don't understand. Are you honestly claiming that you don't see any possible value in social media for teenagers? We could talk about people with eating disorders. We could talk about marginalized minorities who can find support in distant communities because where they live is ultra-conservative. We don't have to work hard to think of those examples, they're well documented, they're very real.
Of course you could argue that the harms outweigh the benefits, but to pretend that there are no benefits is just mind boggling.
What I'm saying is that we don't know the full scope of how social media affects developing minds. The harm might outweigh the benefits or not, we just don't know yet. I will be very interested to see the academic research on the effects the ban in Australia has on Australian children.
Social media has benefits for adults and children, but the ways in which these platforms influence thought and behavior create significant problems. As an example, consider Elon Musk's purchase of Twitter and the subsequent effects it had on the American election and culture. On the one hand, that is the reality we all live in, and learning to adapt and compensate is a critical skill to teach our children; on the other, there is no reason that things must be the way they are now.
If I could speak to a policy maker I would encourage them not to ban social media use for kids, for no other reason than bans (usually) don't work to address the problem they set out to solve and are easily circumvented online by motivated individuals. If lawmakers were interested in addressing the safety of children online, regulating social platforms would be a better starting point. Unfortunately though, tech companies have a lot more money to lobby against those kinds of initiatives than teenagers and the adults interested in protecting them.
Platforms could address the issues that lead to harm and create a beneficial tool for their users; however, there is little incentive for them to do so, because the current system exists as the result of their efforts to maximize profit and further other agendas. (I don't mean that in a cynical anti-capitalist way, just that it is the nature of the way social media companies are structured and funded.) The research suggests that we might need to reevaluate how we integrate social media into our lives and how we build these platforms.
If nothing else, barring children from using social media will present us with an opportunity to get a better understanding of how social media affects them.
Email is social media. Zoom is social media. Suicide hotlines are social media. Kids absolutely need social media. Communication is a human right. Taking away the right to communicate means isolating children from support. How's this supposed to go for depressed, neurodivergent, or queer kids? This law gives abusive parents more control over their victims.
I don't believe it's something for the government to enforce. Any law that requires a non-government agency to collect identification means that identification is at risk of being stolen and means it will be used to track the person. If every person using the internet has to prove their age everywhere, it's going to be a mess.
Whatever company has the worst security will have all the IDs stolen and used everywhere else. And I'm sure at first it will be used so that criminals can frame others for their online crimes really easily.
I mean, how do you prove over the internet that the person using it is the one in the ID? It's easy enough to use the picture on the ID and some "AI" to produce a fake image if they're going to require taking a picture of whoever is using it, or something like that. This won't stop any minors from accessing information they shouldn't. The only way to do that is through education, to make them realize they don't want to access that information, and then give them the tools to avoid it. Not try to keep it from them. That just makes them want it more and forces them to become criminals to do it. And further, if they're committing that minor crime just to do something normal, it desensitizes them to more serious crimes because they don't understand the reasoning for them. Which is why making minor stuff that doesn't affect anyone but the offender a crime is always a bad idea.
I don't see it getting better before it gets much worse. Exploitation is the closest legal term. This is something we have literally never seen before as a species. It's like naming a whole new animal or landmark - nothing to be done in haste. So much of the discussion is useless because it's been framed in the favor of our abusers. In almost every facet, from the server to your phone - it's just trying to suck every bit of labor it can to feed the cycle.
But there could be a law that any phone tied to a number a minor possesses is locked down so it can't install the apps. It wouldn't stop web based, but apps seem to be a worse problem for various reasons.
It's not even so much the content that's the problem, it's the delivery mechanism, how it affects dopamine release, and how damaging those changes can be to a developing brain.
It's similar to the lootbox systems that were regulated in various countries. Human brains will keep trying the next item in their feed because there's a chance something good shows up. If every post were good, it would actually cause less addiction.
But a child has shit-tier impulse control. They're going to keep pulling the proverbial lever forever, wading through shit for the slightest dopamine hit, all the while still being influenced by what they scroll past.
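If it helps to picture the mechanism, here's a toy simulation in Python of a variable-ratio feed, where each scroll has only a small, unpredictable chance of surfacing something good; it's the same reward schedule slot machines use. The hit rate and counts are entirely made-up numbers, just for illustration.

```python
# Toy simulation only: a variable-ratio feed where each scroll has a small,
# unpredictable chance of a "good" post. The hit rate is a made-up number.
import random


def simulate_scrolling(hit_rate: float = 0.1, scrolls: int = 200, seed: int = 0) -> list[int]:
    """Return the scroll positions at which a 'good' post happened to appear."""
    rng = random.Random(seed)
    return [i for i in range(1, scrolls + 1) if rng.random() < hit_rate]


if __name__ == "__main__":
    hits = simulate_scrolling()
    gaps = [b - a for a, b in zip([0] + hits[:-1], hits)]
    print(f"{len(hits)} good posts in 200 scrolls; gaps between hits: {gaps}")
```

The gaps between hits are irregular, and that unpredictability is exactly what keeps the lever-pulling going.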