Idk man. I just used it the other day for recalling some regex syntax and it was a bit helpful. That said, if you ask it to generate a regex for you, it often won't do that successfully. It can, however, break down a regex and explain it to you.
Ofc you all can say "just read the damn manual", and sure, I could do that too, but asking a generative AI to explain a script can be just as effective.
As I was learning regex I was wondering why * doesn't act like a wildcard and why I had to use .* instead. That doesn't make me lose my critical thinking skills; I was just wondering what was wrong with the way I was using that character.
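For anyone else tripped up by the same thing: in shell globs `*` means "anything", but in regex `*` is a quantifier meaning "zero or more of the preceding element", so you need `.` (any single character) in front of it to get glob-like behavior. A quick Python sketch:

```python
import re

# "ab*c" means: "a", then zero or more "b", then "c".
# The "*" quantifies the "b" before it -- it is not a wildcard.
assert re.fullmatch(r"ab*c", "ac")        # zero b's: matches
assert re.fullmatch(r"ab*c", "abbbc")     # three b's: matches
assert re.fullmatch(r"ab*c", "axc") is None  # "x" is not "b": no match

# "a.*c" means: "a", then zero or more of ANY character, then "c".
# This is the glob-style "a*c" behavior you might have expected.
assert re.fullmatch(r"a.*c", "axc")
assert re.fullmatch(r"a.*c", "a whole lot of stuff c")
```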
You are being unnecessarily pedantic. "A person can be wrong, therefore I will get my information from a random word generator" is exactly the attitude we need to avoid.
A teacher can be mistaken, yes. But when they start lying on purpose, they stop being a teacher. When they don't know the difference between the truth and a lie, they never were.
No, obviously not. You don't actually learn if you get misinformation; it's the opposite of learning.
But thankfully you don't have to choose between those two options.
researchers at Microsoft and Carnegie Mellon University found that the more humans lean on AI tools to complete their tasks, the less critical thinking they do, making it more difficult to call upon the skills when they are needed.
It's one thing to try first and then ask for help (as you did); it's another to just ask it to "do x" without thought or effort, which is what the study is about.
So the study just shows how many people haven't yet learned how to properly use GenAI.
I think there's a curve from not trusting, to overtrusting, then back to not blindly trusting outputs (because you suffered consequences from blindly trusting).
And there will always be people blindly trusting bullshit; we've had that for longer than genAI. We have enough populists proving that you can tell many people just about anything and they'll believe it.