yggdar @lemmy.world
Posts 0
Comments 60
Las Vegas' dystopia-sphere, powered by 150 Nvidia GPUs and drawing up to 28,000,000 watts, is both a testament to the hubris of humanity and an admittedly impressive technical feat | PC Gamer
  • They say there are 16 screens inside, each with a 16k resolution. Such a screen would have 16x as many pixels as a 4k screen. The GPUs power those as well.

    For the number of GPUs it appears to make sense. 150 GPUs for the equivalent of about 256 4k screens means each GPU handles roughly two 4k screens. That doesn't sound like a lot, but it could make sense.

    The power draw of 28 MW still seems ridiculous to me though. They claim about 45 kW for the GPUs, which leaves 27955 kW for everything else. Even if we assume the screens are stupid and use 1 kW per 4k segment, that only accounts for 256 kW, leaving 27699 kW. Where the fuck does all that energy go?! Am I missing something?
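    The arithmetic above can be checked in a few lines. All figures come from the comment and the article; nothing here is independently measured.

    ```python
    # Back-of-the-envelope check of the Sphere numbers quoted above.

    SCREENS = 16              # reported number of 16k screens inside
    K4_PER_16K = 16           # one 16k screen has 16x the pixels of a 4k screen
    GPUS = 150

    k4_equivalents = SCREENS * K4_PER_16K      # 256 4k-screen equivalents
    screens_per_gpu = k4_equivalents / GPUS    # roughly 1.7 per GPU

    TOTAL_KW = 28_000         # 28 MW total power draw
    GPU_KW = 45               # claimed GPU power
    SCREEN_KW = 1 * k4_equivalents             # pessimistic 1 kW per 4k segment

    unaccounted_kw = TOTAL_KW - GPU_KW - SCREEN_KW
    print(k4_equivalents, round(screens_per_gpu, 1), unaccounted_kw)
    ```

    Even with a deliberately pessimistic 1 kW per 4k segment, about 27.7 MW remains unexplained by GPUs and screens alone.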

  • Tolkien wasn't always the most creative with names.
  • He also called them mûmakil in elvish. In my mind, when the Hobbits call them oliphaunts it is because a long time ago someone talked about elephants, and over the years the correct pronunciation was lost.

  • Is it truly unspeakable horrors or is it lazy writing that just lets the mind of the reader do the imagining of unspeakable horrors for you?
  • Yeah that's what I thought too. The horrors are described well, they just typically don't get described through their physical form. As you say, because the human mind cannot comprehend it. There is a lot more focus on impressions, comparisons, and effects, rather than on a real physical description. Personally I thought it was quite neat!

  • Rule
  • AI is a field of research in computer science, and LLMs are definitely part of that field. In that sense, LLMs are AI. On the other hand, you're right that there is definitely no real intelligence in an LLM.

  • A Windows XP machine's life expectancy in 2024 seems to be about 10 minutes before even just an idle net connection renders it a trojan-riddled zombie PC
  • Yeah a zero-day would be very unlikely, but a months-old, publicly known and patched vulnerability could always be attempted. One of the reasons why the hypervisor should definitely be kept up-to-date. There is always someone who forgets to patch their software, so why not give it a try? We're talking about a Windows XP scenario after all!

  • Woman Stuck in Tesla For 40 Minutes With 115 Degrees Temperature During Vehicle Update
  • According to the article she did update the car before. She just used to have it done at night, and this is the first time she was in the car during an update.

    40 minutes is a hell of a long time for a software update though.

  • A Windows XP machine's life expectancy in 2024 seems to be about 10 minutes before even just an idle net connection renders it a trojan-riddled zombie PC
  • It's pure speculation, but I assume you'll need

    1. Enough access to the guest OS so that you can interact directly with the virtual hardware. That would probably require root access, so you'll probably need to exploit some bug in the guest OS to get there.
    2. To break out of the VM, you'll then need to exploit a bug in the virtual hardware, with the goal of getting the hypervisor to execute arbitrary code.
    3. If you want to infect the host OS, then you'll need sufficient access on the host. If the hypervisor doesn't run with sufficient privileges, you'll have to exploit a bug in the host as well to perform a privilege escalation. But I'm guessing the hypervisor will usually have sufficient privileges, so exploiting the host is probably not necessary.

    Sounds like quite a bit of work, but I don't see why malware couldn't automate it. An up-to-date hypervisor should help reduce the risk though.

  • Why aren't more people creating new operating systems, considering that macOS, Windows, and Linux were developed by individuals with computer science and programming skills?
  • It would also be very hard to compete with products that are this mature. Linux, Windows, and macOS have been under development for a long time, with a lot of people. If you create a new OS, people will inevitably compare your new immature product with those mature products. If you had the same resources and time, then maybe your new OS would beat them, but you don't. So at launch you will have fewer optimizations, features, security audits, compatibility, etc., and few people would actually consider using your OS.

  • 1% of users are responsible for 88% of data loss events
  • I find it strange that they specifically report on that one statistic. Of course most data loss events will be caused by very few people, because data loss events themselves are quite uncommon. They don't say it in the article, but I suspect most of these people only caused one data loss event. If the same people caused many data loss events each, that would be worth publishing.
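    A toy simulation shows why the headline statistic is almost automatic. It assumes (my assumption, not the article's data) that data loss events are rare, independent, and happen at most once per user.

    ```python
    import random

    # Toy simulation: rare, independent, at-most-once-per-user events.
    # The "top 1% cause almost all events" headline falls out for free,
    # because the vast majority of users cause zero events.
    random.seed(42)

    USERS = 100_000
    P_EVENT = 0.001  # assumed per-user chance of causing a data loss event

    events = [1 if random.random() < P_EVENT else 0 for _ in range(USERS)]
    total = sum(events)

    # Share of all events attributable to the top 1% of users.
    top_1_percent = sorted(events, reverse=True)[: USERS // 100]
    share = sum(top_1_percent) / total
    print(f"top 1% of users account for {share:.0%} of events")
    ```

    With roughly 0.1% of users causing a single event each, the top 1% of users trivially account for well over 88% of all events, without any user being a repeat offender.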

  • What if I'm not very convincing?
  • The function should be cubic, so you should be able to write it in the form "f(x) = ax^3 + bx^2 + cx + d". You could work out the entire thing to put it in that form, but you don't need to.

    Since there are no weird operations, roots, divisions by x, or anything like that, you can just count how many times x might get multiplied with itself. In the numerator of each fraction there are 3 factors containing x, so you can quite easily see that the highest power will be x^3.

    It's useful to know what the values x_i and y_i are though. They describe the 3 points through which the function should go: (x_1, y_1) to (x_3, y_3).

    That also makes the second part of the statement easy to check. Take (x_1, y_1) for example. You want to be sure that f(x_1) = y_1. If you replace all of the "x" in the formula by x_1, you'll see that everything starts cancelling out. Eventually you'll get "1 * y_1 + 0 * y_2 + 0 * y_3", thus f(x_1) is indeed y_1.

    They could have explained this a bit better in the book, it also took me a little while to figure it out.
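    The cancellation can also be verified numerically. This is a sketch of the general Lagrange interpolation form, which I assume is the formula the book uses (the book's exact notation may differ); the example points are made up.

    ```python
    # Lagrange interpolation sketch: each basis term equals 1 at its own
    # x_i and 0 at every other x_j, which is exactly the cancellation
    # described above.

    def lagrange(points):
        """Return f such that f(x_i) == y_i for every (x_i, y_i) in points."""
        def f(x):
            total = 0.0
            for i, (xi, yi) in enumerate(points):
                term = yi
                for j, (xj, _) in enumerate(points):
                    if i != j:
                        term *= (x - xj) / (xi - xj)  # 1 at x_i, 0 at x_j
                total += term
            return total
        return f

    pts = [(0.0, 1.0), (1.0, 3.0), (2.0, 2.0)]  # hypothetical example points
    f = lagrange(pts)
    for xi, yi in pts:
        assert abs(f(xi) - yi) < 1e-9           # f(x_i) is indeed y_i
    ```

    Plugging any x_i into f makes every basis term vanish except its own, which reproduces the "1 * y_1 + 0 * y_2 + 0 * y_3" pattern from the comment.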

  • Me after I got fired
  • That is true, but from a human perspective it can still seem non-deterministic! The behaviour of the program as a whole will be deterministic if all inputs are always the same, arrive in the same order, and there is no multithreading. On the other hand, a specific function call that is executed multiple times with the same input may occasionally give a different result.

    Most programs also have input that changes between executions. Hence the same input record may arrive at a different point in the execution, and so you can get a different result for the same record as well.
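    A minimal illustration of this, assuming shared mutable state (here a hypothetical running total) as the source of the apparent non-determinism: the same call with the same argument returns different results depending on where it falls in the overall execution.

    ```python
    # Same input, different output: the result depends on all previous calls.

    _running = 0

    def add_to_total(x):
        """Adds x to a shared running total and returns the new total."""
        global _running
        _running += x
        return _running

    first = add_to_total(5)   # -> 5
    second = add_to_total(5)  # -> 10: same input, different result
    print(first, second)
    ```

    The program as a whole is still deterministic, but viewed one call at a time, `add_to_total(5)` does not behave like a pure function.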