
Neither the devil you know nor the devil you don’t

Our path to better working conditions lies through organizing and striking, not through helping our bosses sue other giant multinational corporations for the right to bleed us out.

34 comments
  • I've been disappointed to see Doctorow's reaction to the AI industry, to say the least. He's spent so much time relentlessly campaigning against intellectual property that he apparently cannot imagine anything worse than intellectual property winning anything, ever. I don't think he's a big-picture guy; I think the internet just really likes him because at the end of the day he was popular on Slashdot and he tells people that piracy is awesome.

    • Doctorow's had some pretty bad takes, honestly, for all I agree with him on some things. His review of Naomi Klein's Doppelganger -- a book which explores conspiracy theories and how they capture people -- reduces the entire book, incorrectly, to his own political soapbox:

      Fundamentally: Klein is a leftist, Wolf was a liberal.

      This is, frankly, juvenile. Not every bad thing in the world can be mapped on to one's particular soapboxes. Treating GenAI as Good because Copyright is Bad is exactly in character for him.

      (Also Cory gets smug about releasing his novels CC, as if all working writers can do that, but I'm sure it helps the family finances that his wife is an executive at Disney. It's great to use money from one of copyright's biggest monsters to act self-righteous about other people trying to make a living.)

      • his wife is an executive at Disney

        Wait. What? Holy shit, every good goddamn thing he's ever released regarding copyright and intellectual property needs a big bold disclaimer about that in front then.

        Fucker's talking out both sides of his mouth.

        https://en.wikipedia.org/wiki/Alice_Taylor_(businesswoman)

        In 2017, Disney acquired Makie Labs technology and personnel for an undisclosed figure.

        In keeping with the strategic acquisition, Ms Taylor is now the Director, StudioLab at The Walt Disney Studios. In that role she is responsible for ensuring that Disney continues to invest in the intersection between online tech and content distribution.

        I'm sorry, in non-executive speak, doesn't that heavily imply that she at least oversees some work on DRM? Any content distribution method Disney touches with a 40-foot pole is going to have DRM methodology.

        Motherfucker.

    • Yeah, I think his ideological commitment to "all intellectual property rights are bad forever and always amen" kind of blinds him to the actual issue here, and his proposed solution is kind of nonsensical in terms of its ability to get off the ground.

      More broadly, (ie not just in relation to Cory Doctorow), I've seen the take floating around that's like "hey, what the heck, artists who were opposed to ridiculous IP rights restrictions when it was the music industry doing it are now in favor of those restrictions when it's AI, what gives with this hypocrisy?" which I think kind of... misses the point?

      A lot of artists generally are in favor of using their work for interesting collaborative stuff and aren't going to get mad if you use their stuff for your own creative endeavors. This is why we have things like Creative Commons. The actual things artists tend not to like are things like having their work used for commercial purposes without permission and/or having their work taken without credit. (This is why CC licenses often restrict these usages!) With that in mind, a lot of the artist outrage over AI feels much more in line with artists getting mad about, say, watermark-removal tools, or people reposting art without credit, than it does with the copyright battles of the 00s. (You may remember one of the big things artists were affronted by about AI art was the way it would imitate an artist's signature, because of what that represented.)

      In this case, artists are leaning on copyright not out of any particular ideological commitment but just because it's the blunt instrument that they already have at their disposal. But I think Cory Doctorow's previous experience in "getting mad at the MPAA" or whatever kind of forces him to analyze this using the same framing as that issue, which doesn't really make sense in this case. And ironically saying "copyright shouldn't count for AI" aligns him with the position of the MPAA so it really does feel like a "live long enough to see yourself become the villain" scenario. :/

      • More broadly, (ie not just in relation to Cory Doctorow), I’ve seen the take floating around that’s like “hey, what the heck, artists who were opposed to ridiculous IP rights restrictions when it was the music industry doing it are now in favor of those restrictions when it’s AI, what gives with this hypocrisy?” which I think kind of… misses the point?

        I've noticed that too, on occasion. I think the "hey whoa, artists are copyright maximalists now?!" takes tend to miss how artists are coming from concerns about what is morally right and how they can make a living, not copyright as a principle. The latter is, at most, a tool to achieve the former.

        With that in mind, a lot of the artist outrage over AI feels much more in line with artists getting mad about, say, watermark-removal tools, or people reposting art without credit, than it does with the copyright battles of the 00s.

        This says it better than I was going to.

    • I'm no lawyer, I don't even play one on TV, so upfront apologies if I'm hanging my ass out.

      That said, it sounds to me like Doctorow might have a point here. Suppose Universal et al. gets a precedent-setting ruling and slays OpenAI. LOL, LMAO even, but then what? What's to keep the current entertainment cartels from making deals with Microsoft or the husks of the AI companies to rev up their own (now) fully legal and licensed bullshit engines? The only winning legal play is Giant Asteroid.

      • He says some pretty ignorant stuff in this post that undercuts his argument, though:

        Here's the problem: establishing that AI training requires a copyright license will not stop AI from being used to erode the wages and working conditions of creative workers. The companies suing over AI training are also notorious exploiters of creative workers, union-busters and wage-stealers. They don't want to get rid of generative AI, they just want to get paid for the content used to create it. Their use-case for gen AI is the same as Openai's CTO's use-case: get rid of creative jobs and pay less for creative labor.

        This isn't hypothetical. Remember last summer's actor strike? The sticking point was that the studios wanted to pay actors a single fee to scan their bodies and faces, and then use those scans instead of hiring those actors, forever, without ever paying them again. Does it matter to an actor whether the AI that replaces you at Warner, Sony, Universal, Disney or Paramount (yes, three of the Big Five studios are also the Big Three labels!) was made by Openai without paying the studios for the training material, or whether Openai paid a license fee that the studios kept?

        The writers' and actors' strikes, in an overwhelmingly unionized workforce, did not say "hey, we as a labor force want a cut of the dirty GPT lucre". Instead, they said "not today, satan" to studios working with GenAI at all. And won. Those writers and actors, who are overwhelmingly huge supporters of copyright and moral rights, defeated the rich assholes at the Big Five not by throwing up their hands and giving all their creative output to the glurge machine, but by unionizing and painful, hard-won solidarity.

        Whether SAG-AFTRA and the AFM (or non US equivalents) can organize as effectively for musicians and lyricists is unclear. But Cory, who claims to be a leftist, is defaulting to "you as a musician should work for free" and not "you as a musician should organize to counter the power of capital", and that's about as leftist as Grimes posing with The Communist Manifesto.

      • Suppose Universal et al. gets a precedent-setting ruling and slays OpenAI. LOL, LMAO even, but then what? What’s to keep the current entertainment cartels from making deals with Microsoft or the husks of the AI companies to rev up their own (now) fully legal and licensed bullshit engines?

        I think it remains to be seen whether you can train a base model without something as big as Common Crawl. A precedent that Universal has to give you permission could also be a precedent that everyone has to give you permission before you scrape them.
