i_am_not_a_robot @discuss.tchncs.de
Posts 0
Comments 129
  • Cloudflare's recent blog regarding polyfill shows that Cloudflare never authorized Polyfill to use their name in their product
  • Built bundles are not affected. The service is supposed to figure out which polyfills a particular browser requires and serve different scripts to different browsers. Because it serves different scripts, the scripts cannot be bundled or pinned with Subresource Integrity (SRI); that would defeat the purpose of the service.
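    To illustrate, SRI commits the browser to one exact file by hash, which is incompatible with a service that varies its response per browser. A hypothetical SRI-pinned script tag (URL and hash are placeholders) looks like this:

```html
<!-- The integrity hash commits the browser to one exact script body.
     If the server sends different bytes (e.g. per-browser polyfills),
     the browser refuses to execute the script. -->
<script src="https://cdn.example.com/polyfill.min.js"
        integrity="sha384-PlaceholderHashValueNotARealDigest"
        crossorigin="anonymous"></script>
```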

  • Cloudflare's recent blog regarding polyfill shows that Cloudflare never authorized Polyfill to use their name in their product
  • Code pulled from GitHub or NPM can be audited, and it behaves consistently after it has been copied. If the code has a high reputation and gets incorporated into bundles, the code in those bundles doesn't change; if the project later becomes malicious, only recently created bundles are affected. Code from polyfill.io, by contrast, is pulled fresh every time somebody visits the page, and recently polyfill.io was hijacked to sometimes serve malicious code instead. Websites that have been up for years can be affected by this.

  • Should I stick with Docker Swarm for self-hosting?
  • Docker Swarm encryption doesn't work for your use case. The documentation says that the secret is stored encrypted but can be decrypted by the swarm manager nodes and by nodes running services that use the secret, and both of those apply to your single node. If you don't have to unlock the swarm on startup, the encrypted values and the decryption key live next to each other on the same computer, and anyone who has access to the encrypted secrets can also decrypt them.
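    As a sketch of what this looks like in practice (service, image, and secret names are made up): the secret is mounted into the container at runtime, but on a single node the Raft store holding it and the key that decrypts it sit on the same machine.

```yaml
# Compose-style stack file for `docker stack deploy` (illustrative only).
services:
  app:
    image: example/app:latest        # hypothetical image
    secrets:
      - db_password                  # mounted at /run/secrets/db_password

secrets:
  db_password:
    external: true                   # created beforehand with `docker secret create`
```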

  • Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
  • I looked it up before posting. It's illegal in 48 states, including California where most of these companies are headquartered, and every state where major cloud data centers are located. This makes it effectively illegal by state laws, which is the worst kind of illegal in the United States when operating a service at a national level because every state will have slightly different laws. No company is going to establish a system that allows users in the two remaining states to exchange revenge porn with each other except maybe a website established solely for that purpose. Certainly Snapchat would not.

    I've noticed recently there are many reactionary laws that specifically outlaw things which are already illegal, or should already be illegal, under a more general law. We'd be much better off with a federal standardization of revenge porn laws than with a federal law that outlaws essentially the same thing but only when a specific technology is involved.

  • Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
  • Web services and AI in general are completely different things. Web services that generate AI content want to avoid scandals so they're constantly blocking things that may be in some situations inappropriate, to the point where those services are incapable of performing a great many legitimate tasks.

    Somebody running their own image generator on their own computer using the same technology is limited only by their own morals. They can train the generator on content that public services would not, and they are not constrained by prompt or output filters.

  • Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
  • Modern AI is not capable of this. Its accuracy at detecting NSFW content is not good, and it is completely incapable of detecting when NSFW content is allowable, because it has no morals and understands nothing about people or situations beyond appearance.

  • Girl, 15, speaks out after classmate made deepfake nudes of her and posted online
  • “This kid who is not getting any kind of real consequence other than a little bit of probation, and then when he’s 18, his record will be expunged, and he’ll go on with life, and no one will ever really know what happened,” McAdams told CNN.

    “If [this law] had been in place at that point, those pictures would have been taken down within 48 hours, and he could be looking at three years in jail...so he would get a punishment for what he actually did,” McAdams told CNN.

    There's a reason kids are tried as kids and their records are expunged when they become adults. Undoing that will just ruin lives without lessening occurrences.

    “It’s still so scary as these images are off Snapchat, but that does not mean that they are not on students’ phones, and every day I’ve had to live with the fear of these photos getting brought up resurfacing,” Berry said. “By this bill getting passed, I will no longer have to live in fear knowing that whoever does bring these images up will be punished.”

    This week, Republican Senator Ted Cruz, Democratic Senator Amy Klobuchar and several colleagues co-sponsored a bill that would require social media companies to take down deep-fake pornography within two days of getting a report.

    “[The bill] puts a legal obligation on the big tech companies to take it down, to remove the images when the victim or the victim's family asks for it,” Cruz said. “Elliston's Mom went to Snapchat over and over and over again, and Snapchat just said, ‘Go jump in a lake.’ They just ignored them for eight months.”

    BS

    It's been possible for decades for people to share embarrassing pictures of you, real or fake, on the internet. Deep fake technology is only really necessary for video.

    Real or fake pornography involving unwilling participants (revenge porn) is already illegal and already taken down, and because the girl is underage it's extra illegal.

    Besides the legal aspect, the content described in the article, which may be an exaggeration of the actual content, is clearly in violation of Snapchat's rules and would have been taken down:

    • We prohibit any activity that involves sexual exploitation or abuse of a minor, including sharing child sexual exploitation or abuse imagery, grooming, or sexual extortion (sextortion), or the sexualization of children. We report all identified instances of child sexual exploitation to authorities, including attempts to engage in such conduct. Never post, save, send, forward, distribute, or ask for nude or sexually explicit content involving anyone under the age of 18 (this includes sending or saving such images of yourself).
    • We prohibit promoting, distributing, or sharing pornographic content, as well as commercial activities that relate to pornography or sexual interactions (whether online or offline).
    • We prohibit bullying or harassment of any kind. This extends to all forms of sexual harassment, including sending unwanted sexually explicit, suggestive, or nude images to other users. If someone blocks you, you may not contact them from another Snapchat account.
  • Doorbell kamera solution
  • Aqara sells one that works with HomeKit and should work offline. They say it will get Matter support later, but Home Assistant can use it through HomeKit without having to buy any Apple devices.

  • Spacetop G1 is a $1900 laptop that uses a pair of Augmented Reality glasses as a display - Liliputing
  • ChromeOS and ChromiumOS are Linux.

    The problem with ChromeOS (and Android) devices is that hardware support usually exists only in a fork of Linux, which gets as little maintenance as possible for the five-year support window. You end up with the choice of running an old kernel that supports the hardware but not some new software, running a new kernel that supports new software but where the hardware doesn't work right, or taking over maintenance of the fork yourself. The same problem occurs with uncommon hardware on non-ChromeOS devices.

  • Make my IPv6 selfhosted service available on IPv4 network.
  • Be careful with doing this. X-Real-IP and X-Forwarded-For are fine when the client is a trusted proxy, but they can easily be faked if you don't whitelist who's allowed to set those headers. Somebody with IPv6 access could send "X-Real-IP: 127.0.0.1" or similar, and if the server believes it, you'll see 127.0.0.1 in your logs and, depending on what you're running, the user may gain special permissions.

    Also be careful with the opposite problem. If your server doesn't trust the proxy, it will show the VPS IP in logs, and if you're running something like fail2ban you'll end up blocking your VPS and then nobody will be able to connect over IPv4.
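    A minimal sketch of the trust check, assuming a hypothetical VPS/proxy address of 203.0.113.10: the forwarded header is honored only when the direct TCP peer is the trusted proxy; otherwise the peer address itself is used.

```python
# Resolve the real client IP behind an optional trusted proxy (sketch).
TRUSTED_PROXIES = {"203.0.113.10"}  # hypothetical VPS address

def client_ip(remote_addr, x_forwarded_for=None):
    """Trust X-Forwarded-For only when the TCP peer is a known proxy."""
    if remote_addr in TRUSTED_PROXIES and x_forwarded_for:
        # By convention the leftmost entry is the original client.
        return x_forwarded_for.split(",")[0].strip()
    # Untrusted peer: ignore the header entirely so it can't be spoofed.
    return remote_addr
```

    With a check like this, a spoofed header from a random IPv6 client is ignored, and log-based tooling such as fail2ban sees the forwarded address only for genuine traffic relayed through the VPS.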

  • VLAN question
  • If all you want is to break out the VLANs onto physical NICs using a Linux PC instead of a managed switch, create six bridge interfaces and put into each bridge one VLAN sub-interface and its corresponding NIC.
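    A sketch of one such bridge using `ip` commands, assuming the tagged trunk arrives on eth0 and VLAN 10 should be broken out untagged on eth1 (interface names and the VLAN ID are made up; repeat for each of the six VLANs):

```shell
# Create a VLAN sub-interface on the trunk port for VLAN 10.
ip link add link eth0 name eth0.10 type vlan id 10
# Create a bridge and enslave both the VLAN sub-interface and the breakout NIC.
ip link add br10 type bridge
ip link set eth0.10 master br10
ip link set eth1 master br10
# Bring everything up; untagged traffic on eth1 now maps to VLAN 10 on the trunk.
ip link set eth0.10 up
ip link set eth1 up
ip link set br10 up
```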

  • Is it safe to open a forgejo git ssh port in my router?
  • There's a lot of wrong advice about this subject on this post. Forgejo, like any other Git forge server, has a completely different security model than regular SSH. All authenticated users run as the same system user and are restricted to Git commands. It uses the secure shell protocol, but it is not a shell, so the threat model is different. Anybody can sign up for a GitHub or Codeberg account and will be granted SSH access, but that access only allows them to push and pull Git data according to their account permissions.
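    This restriction is typically implemented with a forced command in the git user's authorized_keys file, so every connection runs the forge's serve handler and never a login shell. A hypothetical Forgejo-style entry (binary path, config path, key ID, and key material are all placeholders) looks roughly like:

```
# Any connection authenticating with this key runs the forge handler, not a shell.
command="/usr/local/bin/forgejo --config /etc/forgejo/app.ini serv key-3",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-ed25519 AAAA... user@example.com
```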

  • Traefik conditional certificate for same URL
  • That sounds like Cloudflare is giving you certificates intended only to be used for talking to Cloudflare.

    You might be able to do it if Cloudflare sends a different SNI. It's probably better to get real certificates from Let's Encrypt and just use those.

  • Stack Overflow bans users en masse for rebelling against OpenAI partnership — users banned for deleting answers to prevent them being used to train ChatGPT
  • That's not what I mean. When you contribute content to Stack Exchange, it is licensed CC BY-SA. There are websites that scrape this content and rehost it, or at least there used to be. I've had a problem before where all the search results were unanswered Stack Overflow posts or copies of those posts on different sites. Maybe, similar to Reddit, they restricted access to the data so they could sell it to AI companies.