"waves of technological innovation" have gotten faster over time, "students might now find themselves learning skills in college that are obsolete by the time they graduate"
I was taught obsolete things in college in the early 90s. But FORTRAN wasn't the useful part of the class--problem-solving and broader language exposure was.
People fixate on whichever particular technologies are being used in class being obsolete, but that's not the point of college. You can learn technologies on your own, and if you have trouble with that, maybe practicing it in college is a good idea.
Basically we're going to drill on technology-agnostic fundamentals for 4 years, and use a wide variety of technologies and languages as vehicles for that so you get a good breadth of experience.
People want more “real world usage” in college and school overall. Teach kids how to do taxes, teach engineers how to use X and Y software.
Well, in 10 years there’s new software that does your taxes in another way, plenty of laws have changed, and there’s new stuff to consider. And the software those engineers were taught is obsolete.
That’s why focus should be on getting people to a place where they themselves can acquire the skills needed to do those things by themselves.
@agressivelyPassive @beejjorgensen ehm, not really. Comprehensions [for a long time I didn't even know they were called that] are light years ahead of any abstraction provided by Fortran, and unfortunately also C (maybe not so much C++, which is a dangerous and versatile beast).
The core concepts of Python are two or three generations newer than that of Fortran.
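To illustrate the point about comprehensions (a minimal sketch I'm adding for context, not something from the original comment): the same filtered computation written as a Fortran/C-style explicit loop versus a Python list comprehension.

```python
# Fortran/C style: declare an accumulator, iterate, index, mutate.
squares_loop = []
for i in range(10):
    if i % 2 == 0:
        squares_loop.append(i * i)

# Comprehension: the same computation as one declarative expression --
# no manual accumulator or mutation in sight.
squares_comp = [i * i for i in range(10) if i % 2 == 0]

assert squares_loop == squares_comp == [0, 4, 16, 36, 64]
```

The loop version is what you'd write line-for-line in an older imperative language; the comprehension expresses the intent (map and filter over a range) directly, which is the kind of abstraction gap the comment is describing.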
I remember my microcontroller course professor telling us that if we just wanted to learn how to program assembly for microcontrollers, we could just pick up a book and skip the class.
Instead, he intended to teach us problem solving with microcontrollers.
The class was based around the Intel 8085 architecture, and this was in 2010. When I left the class, I started trying to make things using 8085s and assembly. These chips were so old, they needed external memory and flash storage to operate.
Anyway, I eventually learned about the larger microcontroller world: writing C, 32-bit processors, real-time debugging, etc.
Understanding the fundamental goings on of assembly has been helpful, but it was only ever a building block.
If "learning 8085 assembly" only prepared you to program 8085 assembly and do exclusively that, you missed the entire point of higher education. Being able to generalize knowledge and apply it to other fields and specialisations is what is being taught. Not just following a tutorial.
Funny enough, I retired a dozen Netware servers in the past year, the last one just a month ago. To say they were old and outdated would be an understatement.
I thought my data structures class was useful. A few others were interesting. But other than that, no, Java development was not useful to anyone's daily life.
no, Java development was not useful to anyone’s daily life.
You've never worked with the US Federal Government. For every software problem the Government has, there is a Java application written to make your life a living hell trying to solve that problem. It's also even odds on said application requiring a version of Java which is about a decade old and it just mysteriously breaks with anything newer.
Funny story time, intentionally vague to shield identities:
A while ago, I had a friend who was hired to teach a course at a local University for their new CS degree with a focus on video games. He was a bit of an expert in a particular portion of the material that they needed, and when they started putting out feelers to find someone to teach the subject matter, everyone locally in the industry gave him the highest praise and said he was the man for the job. The University met with him and eventually selected him to teach, which he did for 3 semesters. After that, they dropped him because he didn't himself have a college degree in what he was teaching (something he had made very clear in the hiring process).
He went into making games straight out of high school, he was basically there at the ground floor, self taught, acknowledged by everyone in the industry locally as a foremost expert in the field where they had him teaching, and they couldn't keep him because they couldn't have him teach when he didn't have a degree in the field. Without his having a degree their program couldn't be accredited. So... They wanted him to have a degree in a subject he was an originator of and without that degree they had to drop him.
He makes financial software now because the games industry was/is brutal and he wanted to see his family now and then. I've always found it hilarious that a University had to let him go because otherwise the snake wasn't eating its own tail and the ouroboros apparently can't have that.
Applies to many fields. I studied translation at university and, kudos to the head teacher, he kept saying we worked on current software for illustration, but the point was to learn transferable skills to apply to whatever tools are trendy once on the market.
Turns out I now work at a firm running software more outdated than anything my uni used. But I always agreed with the dude: we'll have to adapt or die as businesses.
I think the article should focus on how everyone, or at least most people, keeps up with the times at work. When I was studying, at least, my teachers understood this issue and focused on providing a good theoretical foundation you can build on. The particular technologies are just examples of what's available at the time you're being educated; they're not the actual focus of the education.
In an essay, Hyams shared his top concerns around AI — one of which is how technologies like OpenAI's ChatGPT will affect the job market.
"With AI, it's conceivable that students might now find themselves learning skills in college that are obsolete by the time they graduate," Hyams wrote in the essay.
"The higher the likelihood that a job can be done remotely, the greater its potential exposure is to GenAI-driven change," the researchers wrote, referring to generative artificial intelligence.
The CEO's thoughts on AI come as labor experts and white-collar workers alike become increasingly worried that powerful tools like ChatGPT may one day replace jobs.
After all, employees across industries have been using ChatGPT to develop code, write real estate listings, and generate lesson plans.
For instance, Hyams said that Indeed's AI technology, which recommends opportunities to its site visitors, helps people get hired "every three seconds."