Hackworth @lemmy.world
Posts 2
Comments 190
House Democrat is proposing a constitutional amendment to reverse Supreme Court's immunity decision
  • I appreciate one of the most concise explanations of that perspective I've ever read! This is actually the one I'd like to believe, but not the one I do. I disagree with the idea that "both sides are the same," but I won't go so far as to imagine Democrats are truly concerned with integrity to the degree that they'd sacrifice strategy for it. I'm afraid they're just people, and people are all fucking stupid in their own way. It's just that some are fucking stupid and malicious.

  • No taxation without representation either!
  • The tax situation for U.S. territories is complex and varies between them. Here's a brief overview:

    Puerto Rico and U.S. Virgin Islands:

    - Residents generally do not pay federal income tax on local income.
    - They do pay federal payroll taxes (Social Security and Medicare).
    - They have their own tax systems.

    Guam and Northern Mariana Islands:

    - Have a "mirror" tax system that parallels the U.S. federal tax system.
    - Residents pay taxes to the territorial government instead of to the U.S. federal government.

    American Samoa:

    - Has its own tax system.
    - Residents generally do not pay federal income taxes.

    However, there are exceptions:

    - Federal employees in these territories generally pay federal income tax.
    - Residents with income sources from the U.S. mainland may need to pay federal taxes on that income.
    - Some residents may need to file U.S. tax returns under certain circumstances.

    It's important to note that while residents of these territories may not pay federal income taxes in many cases, they also don't have full representation in Congress and can't vote in U.S. presidential elections (though they may participate in primary elections).

  • Mildred
  • Mildreds out here, tryin ta take 'Mildred' back, and y'all throwin shade like people ain't have to live with dumbass fuckin names like Mildred. Leave Mildreds alone! They've got enough to deal with. Wait, they chose Mildred, like as an adult? In the present? Pfft.

  • Arctic Circle (OC)
  • Claude, re-write my post.

    <Claude> Sure thing boss!

    Generative AI fascinates me, and I use it frequently for both work and leisure. This photo, however, is the first to make me feel a twinge of sadness, as it reminds me that I now have to question the authenticity of everything I see.

    Then I recall that even traditional photographs often undergo extensive post-processing, which somewhat alleviates my concern. To distract myself, I create an AI-generated video of the river flowing, momentarily forgetting my worries.

    However, I can't help but notice that many of these AI generations are of poor quality. Generative AI is simultaneously wonderful and troubling - a double-edged sword of technology.

    Despite these mixed feelings, I must admit this is a beautiful photo.

  • Arctic Circle (OC)
  • I love generative A.I. I use it all the time, professionally and recreationally. This photo is the first to make me sad that I have to question the veracity of everything now. Then I remember the amount of post-processing a photo like this usually goes through and feel less sad. Then I make an AI video of the river flowing and forget what I was worried about. But all these generations are kinda trash. Such wonderful, terrible things. Beautiful photo, though.

  • All Hail King Biden the First
  • The DNC is not a government entity. It's a private organization that represents the interests of party members and has no legal obligation to serve the public. We can be irritated with the DNC, but we can't blame them for this.

  • Fuck up a book for me please
  • I asked Claude to re-write the beginning of Ulysses at a 5th grade level:

    Buck Mulligan, a big man, came up the stairs. He was carrying a bowl with shaving cream, a mirror, and a razor. He wore a yellow robe that wasn't tied. He held up the bowl and said some words in Latin.

    He stopped and called down the stairs: "Come up, Kinch! Come up, you scared priest!"

    Buck went to the edge of the tower. He turned around and made blessing motions three times towards the tower, the land, and the mountains. Then he saw Stephen Dedalus. Buck made quick crosses in the air and shook his head at Stephen. Stephen didn't like this. He was tired and put his arms on the stairs, looking at Buck's face and hair.

    Buck looked under the mirror quickly and then covered the bowl. "Back to work!" he said seriously. He added in a funny voice: "This is the real thing, friends. Body and soul and blood. Slow music, please. Close your eyes, guys. One moment. A little problem with those white blood cells. Quiet, everyone."

    He whistled long and slow, then waited. His white teeth had some gold in them. Two loud whistles answered back. "Thanks, buddy," he said quickly. "That's good. Turn off the power, will you?" He jumped down and looked at Stephen. His face looked like an important church person from long ago. He smiled. "What a joke!" he said happily. "Your funny name is from ancient Greek!" He pointed at Stephen in a friendly way and went to the edge, laughing. Stephen followed him halfway and sat down, watching as Buck set up his mirror and started to shave.

  • The Future of Large Language Model Pre-training is Federated

    Also See: Worldwide Federated Training Of Language Models

    Claude's Summary:

    The two papers, "Worldwide Federated Training of Language Models" by Iacob et al. and "The Future of Large Language Model Pre-training is Federated" by Sani et al., both propose using federated learning (FL) as a new paradigm for pre-training large language models (LLMs). The main ideas are:

    1. FL allows leveraging more data and compute resources from multiple organizations around the world, while keeping the data decentralized and private. This can enable training larger LLMs on more diverse data compared to centralized training.

    2. FL relaxes synchronization requirements and reduces communication overheads compared to data-parallel distributed training, making it feasible for geographically distributed participants with varying hardware and connectivity.

    3. The papers present systems and algorithms for enabling efficient federated pre-training of LLMs at billion-parameter scales. Key techniques include allowing participants to modulate their amount of local training based on resource constraints, and partially personalizing models to clusters of participants with related data.

    4. Experimental results show federated LLM pre-training can match or exceed centralized training performance, with the performance gap narrowing as model size increases to billions of parameters. Larger federated models also converge faster and are more robust.

    5. Challenges include data and hardware heterogeneity across participants. The papers propose techniques like adaptive aggregation and load balancing to mitigate these issues.

    In summary, the papers argue federated learning is a promising new direction for democratizing LLM pre-training by allowing many more organizations to collaboratively train large models on their combined data and compute resources. Let me know if you would like me to expand on any part of the summary or papers in more detail.
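
    For anyone curious about the mechanics behind point 3, here is a minimal federated-averaging sketch in Python. It's a toy illustration of the general pattern, not either paper's actual system; the synthetic least-squares data, client sizes, and per-client step budgets are all made up.

    import numpy as np

    rng = np.random.default_rng(0)

    def local_train(weights, data, steps, lr=0.1):
        # Plain SGD on a local least-squares objective. Each participant
        # picks its own `steps` budget to match its hardware (point 3).
        X, y = data
        w = weights.copy()
        for _ in range(steps):
            grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
            w -= lr * grad
        return w

    def make_client(n):
        # Hypothetical private dataset; only weights ever leave the client.
        X = rng.normal(size=(n, 4))
        y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.1 * rng.normal(size=n)
        return X, y

    clients = [make_client(n) for n in (200, 50, 500)]  # made-up sizes
    budgets = [5, 1, 10]                                # made-up step budgets

    global_w = np.zeros(4)
    for _ in range(20):  # communication rounds
        # Clients train locally and upload weights, never raw data.
        local_ws = [local_train(global_w, d, s) for d, s in zip(clients, budgets)]
        # Server aggregates, weighting each client by its dataset size.
        sizes = np.array([len(d[1]) for d in clients], dtype=float)
        global_w = np.average(local_ws, axis=0, weights=sizes / sizes.sum())

    print(np.round(global_w, 2))  # should land near [1., -2., 0.5, 3.]

    The size-weighted average here is just vanilla federated averaging; as I understand the papers, the adaptive-aggregation and load-balancing techniques from point 5 replace that single averaging step with smarter schemes.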
