farcical_continuation @programming.dev
MDN can now automatically lie to people seeking technical information · Issue #9208 · mdn/yari | MDN now providing LLM generated explainer text for code samples
Summary: MDN's new "AI Explain" button on code blocks generates human-like text that may be correct by happenstance, or may contain convincing falsehoods. This is a strange decision for a technical ...
Seems pretty bad?