NextElephant9 @awful.systems
Posts 0
Comments 4
OpenAI Furious DeepSeek Might Have Stolen All the Data OpenAI Stole From Us
  • Knowledge distillation is training a smaller model to mimic the outputs of a larger model. You don't need the original training set used for the larger model (the whole internet, or whatever they used for ChatGPT); a separate transfer set works (a minimal sketch follows the reference below).

    Here's a reference: Hinton, Geoffrey, Oriol Vinyals, and Jeff Dean. "Distilling the Knowledge in a Neural Network." arXiv preprint arXiv:1503.02531 (2015). https://arxiv.org/pdf/1503.02531
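
    To make this concrete, here's a minimal sketch of the distillation objective, assuming PyTorch. The tiny teacher/student networks and the random batch standing in for a transfer set are placeholders for illustration, not anything from the paper.

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    # Placeholder "large" teacher and "small" student classifiers.
    teacher = nn.Sequential(nn.Linear(784, 1200), nn.ReLU(), nn.Linear(1200, 10))
    student = nn.Sequential(nn.Linear(784, 30), nn.ReLU(), nn.Linear(30, 10))

    def distillation_loss(student_logits, teacher_logits, temperature=2.0):
        # KL divergence between the softened teacher and student
        # distributions; the temperature**2 factor keeps gradients on a
        # comparable scale across temperatures (Hinton et al., 2015).
        soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
        log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
        return F.kl_div(log_soft_student, soft_teacher,
                        reduction="batchmean") * temperature**2

    optimizer = torch.optim.SGD(student.parameters(), lr=0.1)
    teacher.eval()

    # The transfer set needs only inputs: the "labels" are the teacher's
    # own outputs, which is why it needn't be the teacher's training set.
    transfer_batch = torch.randn(64, 784)  # random stand-in data

    for step in range(100):
        with torch.no_grad():
            teacher_logits = teacher(transfer_batch)
        loss = distillation_loss(student(transfer_batch), teacher_logits)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
    ```

    In practice the distillation loss is often mixed with an ordinary cross-entropy term when true labels exist for the transfer set, but matching the teacher's soft outputs alone is enough to train the student.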

  • Stubsack: weekly thread for sneers not worth an entire post, week ending 8th December 2024
  • Hi, I'm new here. I mean, I've been reading, but I haven't commented before.

    I'm sure you all know how cheap labour is used to label training data for "AI" systems, but I just came across this video and wanted to share it. Apologies if it has already been posted: Training AI takes heavy toll on Kenyans working for $2 an hour.