Maybe it's time we invent JPUs (JSON processing units) to level the playing field.
Well, do you have dedicated JSON hardware?
Everybody gangsta till we invent hardware-accelerated JSON parsing
Render the JSON as polygons?
That is sometimes the issue when your code editor is a disguised web browser 😅
There are SIMD-accelerated JSON decoders.
Would you rather have 100,000 kg of tasty supreme pizza, or 200 kg of steaming manure? Choose wisely.
CPU vs. GPU tasks, I suppose.
I have the same problem with XML too. Notepad++ has a plugin that can format a 50 MB XML file in a few seconds, but my current client won't allow plugins to be installed. So I have to use VS Code, which chokes on anything bigger than what I could format by hand if I were determined.
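For JSON at least, you can sidestep the editor entirely and format the file with a plain script. A minimal sketch using only Python's standard library; the file paths are hypothetical:

```python
import json

def pretty_print(src: str, dst: str) -> None:
    """Reformat a JSON file without opening it in an editor."""
    with open(src) as f:
        obj = json.load(f)           # parse the whole file in memory
    with open(dst, "w") as f:
        json.dump(obj, f, indent=2)  # write an indented copy

# Small self-contained demo instead of a real 50 MB file:
obj = {"items": [{"id": i, "ok": True} for i in range(3)]}
formatted = json.dumps(obj, indent=2)
print(formatted.splitlines()[0])  # first line of the indented output: "{"
```

The same one-liner is available from the shell as `python -m json.tool in.json out.json`, which needs no plugin approval at all.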
Someone just needs to make a GPU-accelerated JSON decoder.
Works fine in vim
Reject MB, embrace MiB.
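The distinction matters at the file sizes being thrown around in this thread; a quick check of how far apart the two units drift:

```python
MB = 10**6    # megabyte: SI prefix, base 10
MiB = 2**20   # mebibyte: binary prefix, base 2

size_mb = 512 * MB    # 512_000_000 bytes
size_mib = 512 * MiB  # 536_870_912 bytes

# The binary unit is ~4.9% larger, so "512 MB" vs "512 MiB"
# differs by almost 25 million bytes.
print(size_mib - size_mb)  # 24870912
```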
Rockstar making GTA Online be like: "Computer, here is a 512 MB JSON file. Please download it from the server and then do nothing with it."
You jest, but I asked for a similar (but much simpler) vector/polygon model, and it generated it.
Let it be known that heat death is not the last event in the universe.
The obvious solution is parsing JSON with GPUs? Maybe not...
C++ vs. JavaScript
I literally edited 600 MB of JSON today in VS Code on a Mac.
Given that the CPU is what limits the parsing of the file, I wonder how a GPU-based editor like Zed would handle it. I've been wanting to test out the editor ever since it was partially open-sourced, but I'm too lazy to get around to doing it.