Pichu0102 @kbin.social
Posts 1
Comments 4
German patient vaccinated against Covid 217 times (www.bbc.com)

Researchers have written up the unusual case in a medical journal.

Texas "physically barred" Border Patrol agents from trying to rescue migrants who drowned, federal officials say
  • A dystopian story writer would get lambasted for including something so ridiculously over-the-top bleak in their story. Yet here we are in reality where...

  • Computer - LOG OUT! (by raph_comic and dogmo comics)
  • I mean more that things people in a simulation can't yet observe in great detail won't be simulated in great detail until they can see that kind of detail, and by then I assume technological advancements in the host world would have produced hardware that allows that level of detail to be simulated in a reasonable amount of time. There's also the host world's ability to edit the simulation, so that things that weren't simulated in great detail when inhabitants first observed them can be retroactively changed: from the inside, people always were able to see that detail in their memories, history, and other forms of knowledge, while from the outside only minimal changes are made to resolve any conflicts the retroactive refinement creates. Not in a dystopian way, mind you, just in ways like "this very star was actually always a few light-years away from its current position in the sky", small technical details that are smoothed over in the internal history as seen by the simulation's inhabitants to match up with other parts of the simulation. (A rough sketch of this lazy-detail idea is below.)
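To make that less hand-wavy, here is a minimal sketch of the lazy-detail idea, with every name hypothetical and a hash prefix standing in for real procedural generation: each region derives all of its detail deterministically from one seed, so refining it later can never contradict an earlier coarse observation.

```python
import hashlib

class Region:
    """A patch of a simulated world, kept at the lowest detail level
    until an inhabitant actually observes it more closely."""

    def __init__(self, coords, world_seed):
        self.coords = coords
        self.world_seed = world_seed
        self.detail_level = 0  # coarse by default

    def _full_state(self):
        # All detail derives deterministically from one seed, so
        # refining a region later cannot contradict anything a
        # coarser observation already revealed.
        key = f"{self.world_seed}:{self.coords}".encode()
        return hashlib.sha256(key).hexdigest()

    def observe(self, requested_level):
        # Lazily raise the detail level only when someone can see it.
        self.detail_level = max(self.detail_level, requested_level)
        # More detail is just a longer prefix of the same hidden
        # state, so the "retroactive edit" is free: past coarse views
        # are prefixes of today's refined view.
        return self._full_state()[: 8 * (self.detail_level + 1)]

star = Region(coords=(4, 2), world_seed="universe-1")
coarse = star.observe(0)         # a naked-eye observation
fine = star.observe(3)           # a telescope, years later
assert fine.startswith(coarse)   # internal history stays consistent
```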

  • Computer - LOG OUT! (by raph_comic and dogmo comics)
  • There are a couple of tricks one could use, like having some parts of the simulation skip steps in less important areas, or simulating different parts at different times in the host world and only syncing them back together when necessary, which would be invisible to those inside. The simulation also needn't run in real time: it might run slower or inconsistently in the host world while the people inside experience it as stable and not slow.
    Not that I'm claiming it's true; it's simply an interesting thing to think about, along with ways around processing-speed issues. If humanity ever makes a simulation of even a small universe, I imagine some of these tricks would be used and smoothed over inside that universe, since they can look messy from the outside but look normal on the inside. (A toy version of the step-skipping trick is sketched below.)
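A toy illustration of that step-skipping idea, assuming hypothetical Region objects with an importance flag: unimportant regions are fast-forwarded in bulk and synced at the end, which from the inside is indistinguishable from having been simulated every tick.

```python
class Region:
    """A chunk of the simulated world with its own local clock."""

    def __init__(self, name, important):
        self.name = name
        self.important = important
        self.local_tick = 0  # how far this region has been simulated

    def step(self, ticks=1):
        # A cheap closed-form fast-forward stands in for real,
        # tick-by-tick physics here.
        self.local_tick += ticks

def run(regions, total_ticks, coarse_interval=10):
    """Step important regions every tick; fast-forward the rest in
    bulk on a coarse interval, then sync everyone at the end."""
    for t in range(1, total_ticks + 1):
        for r in regions:
            if r.important:
                r.step()                 # full-rate simulation
            elif t % coarse_interval == 0:
                r.step(coarse_interval)  # catch up in one jump
    for r in regions:
        r.step(total_ticks - r.local_tick)  # final sync

regions = [Region("city", important=True),
           Region("empty space", important=False)]
run(regions, total_ticks=100)
# From the inside, every region experienced all 100 ticks.
assert all(r.local_tick == 100 for r in regions)
```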

  • A.I.’s un-learning problem: Researchers say it’s virtually impossible to make an A.I. model ‘forget’ the things it learns from private user data
  • I feel like one way to do this would be to break up models and their training data into mini-models and mini-batches of training data instead of one big model, and also to restrict training data to sources used with permission and public-domain sources. Whenever a company is required to take down information whose permission was revoked or expired, it can identify the relevant training data in the mini-batches, remove it, then retrain the corresponding mini-model, which is quicker and more efficient than retraining the entire massive model.

    A major problem with this, though, would be figuring out how to efficiently query multiple mini-models and come up with a single response. I'm not sure how you could do that, at least not very well... (One possible shape for this is sketched below.)
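For what it's worth, this is close to the sharded "SISA" approach studied in the machine-unlearning literature. Here is a minimal sketch, assuming scikit-learn-style classifiers produced by a model_factory and simple majority voting to merge the mini-models' answers; every name is illustrative.

```python
import zlib
from collections import Counter

class ShardedEnsemble:
    """Train one mini-model per data shard, so removing someone's data
    only forces a retrain of the shard that contained it."""

    def __init__(self, model_factory, num_shards):
        self.model_factory = model_factory
        self.shards = [[] for _ in range(num_shards)]  # lists of (x, y)
        self.models = [None] * num_shards

    def add(self, x, y):
        # Deterministic assignment, so the data can be found again later.
        i = zlib.crc32(repr(x).encode()) % len(self.shards)
        self.shards[i].append((x, y))

    def _train(self, data):
        if not data:
            return None
        xs, ys = zip(*data)
        model = self.model_factory()
        model.fit(list(xs), list(ys))  # assumes a scikit-learn-style API
        return model

    def fit(self):
        self.models = [self._train(d) for d in self.shards]

    def forget(self, should_remove):
        # Drop matching examples, then retrain only the dirty shards;
        # that is the whole point of splitting the model up.
        for i, data in enumerate(self.shards):
            kept = [(x, y) for x, y in data if not should_remove(x, y)]
            if len(kept) != len(data):
                self.shards[i] = kept
                self.models[i] = self._train(kept)

    def predict(self, x):
        # The hard part the comment above worries about: merging many
        # mini-model answers into one. Majority vote is the simplest.
        votes = [m.predict([x])[0] for m in self.models if m is not None]
        return Counter(votes).most_common(1)[0][0]
```

Majority voting (or averaging predicted probabilities) is the usual answer to the single-response problem, but it does give up the quality of one jointly trained model, which is exactly the trade-off the comment flags.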