Both of these are somewhat less bad than they were when I first noticed them, but they're still pretty bad. I am puzzled at how the latter even exists. I had thought that there were rules against just making a whole page about a neologism, but either I'm wrong about that or the "rules" aren't enforced very strongly.
The Universal AI University has implemented a novel admissions process, leveraging the Metaverse and Artificial Intelligence (AI) technologies. This system integrates optimization algorithms, crowd-generating tools, and visual enhancement technologies within the Metaverse, offering a unique and technologically advanced admissions experience for students.
Reflection (artificial intelligence) is dreck of a high order. It cites one arXiv post after another, along with marketing materials directly from OpenAI and Google themselves... How do the people who write this shit dress themselves in the morning without pissing into their own socks?
GPT-3 is a large language model that was released in 2020 by OpenAI and is capable of generating high-quality human-like text. [...] An upgraded version called GPT-3.5 was used in ChatGPT, which later garnered attention for its detailed responses and articulate answers across many domains of knowledge.
and of course, not a single citation for the intro paragraph, which has some real bangers like:
This process involves self-assessment and internal deliberation, aiming to enhance reasoning accuracy, minimize errors (like hallucinations), and increase interpretability. Reflection is a form of "test-time compute," where additional computational resources are used during inference.
because LLMs don’t do self-assessment or internal deliberation, nothing can stop these fucking things from hallucinating, and the only articles I can find for “test-time compute” are blog posts from all the usual suspects that read like ads and some arXiv post apparently too shitty to use as a citation
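for what it's worth, stripped of the marketing language, the "reflection" the article describes boils down to re-prompting the model with its own previous output. a minimal sketch, assuming a hypothetical call_llm() stand-in rather than any vendor's actual API:

```python
# Hypothetical sketch of a "reflection" loop as described by the article:
# the model is simply called again on its own output. No internal
# deliberation happens; it's just extra forward passes at inference time.

def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for whatever hosted text-generation API you use."""
    raise NotImplementedError

def reflect(question: str, rounds: int = 2) -> str:
    answer = call_llm(question)
    for _ in range(rounds):
        critique = call_llm(
            f"Question: {question}\nAnswer: {answer}\n"
            "List any mistakes in the answer."
        )
        answer = call_llm(
            f"Question: {question}\nAnswer: {answer}\n"
            f"Critique: {critique}\nRewrite the answer using the critique."
        )
    return answer
```

that's the whole trick: more calls to the same model, which is apparently all "test-time compute" means here.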
on the one hand, I want to try to find which vendor marketing material "research paper" that paragraph was copied from, but on the other... after yesterday's adventures trying to get data out of PDFs and c.o.n.s.t.a.n.t.l.y getting "hey how about this LLM? it's so good![0]" search results, I'm fucking exhausted
[0]: also most of these are paired with pages of claims of competence and feature boasts, and then a quiet "psssst: also it's a service and you send us your private data and we'll do with it whatever we want" as hidden as they can manage
None of my acquaintances who have Wikipedian insider experience have much familiarity with the "Did you know" box. It seems like a niche within a niche that operates without serious input from people who care about the rest of the project.
"In The News" is apparently also an editor clique with its own weird dynamics, but it doesn't elevate as many weird tiny articles to the Main Page because the topics there have to be, you know, in the news.
'cause I love the kayfabe linguistic drift for a term that's not even a month old and that's probably seen more use in posts making fun of the original tweet than in any of the shit the Wikipedia article says
The number of sources isn't really the issue; many of them are industry advertisements, such as blog posts on product pages. Of the few that are papers, almost all are written exclusively by industry research teams. That doesn't on its own invalidate their results, but it does mean there's a strong financial interest in the non-consensus view (in particular, that LLMs can be "programmed"). The few papers that have been peer-reviewed have serious methodological flaws, so there's essentially no support for the article's bombastic, non-consensus claims.
For posterity: English Wikipedia is deletionist, so your burden of proof is entirely backwards. I know this because I quit English WP over it; the sibling replies are from current editors who have fully internalized it. English WP's notability bar is very high and not moved by quantity of sources; it also has suffered from many cranks over the years, and we should not legitimize cranks merely because they publish on arXiv.
i'm more frustrated that NPOV has been forced into a secondary position behind reliable sourcing. just because a reliable source has said something does not justify including it in an article where doing so would disturb the NPOV.
It's fine if you don't want to do the 'homework,' but OP doesn't get to complain about the rules not being enforced on the notoriously democratic, editable-by-anyone Wikipedia while refusing to take up the trivial 'homework' of starting the rule-violation procedure. The website is inherently a 'be the change you want to see in the world' platform.