I'm curious if there's a JSON format that lets you slice an object into smaller pieces, distribute those pieces across different nodes, and then reassemble them into the original object. Is that even possible?
4 Answers
There's no standard format that does exactly what you're asking, but you might want to look into alternative storage systems like DynamoDB. It stores JSON-like items addressed by a partition key and an optional sort key, so you can keep related chunks of a document as separate items and fetch them together. Just something to consider!
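A minimal in-memory sketch of that keyed-chunk layout, assuming each chunk of a document is stored as an item under a partition key (the document id) and a sort key (the chunk number). All names here are illustrative, and a plain dict stands in for the actual table:

```python
# In-memory stand-in for a key-value table: keys are
# (partition_key, sort_key) pairs, values are JSON-like chunks.
table = {}

def put_chunk(doc_id, chunk_no, payload):
    """Store one chunk of a document under (doc_id, chunk_no)."""
    table[(doc_id, chunk_no)] = payload

def get_document(doc_id):
    """Fetch all chunks for one partition key, in sort-key order, and merge."""
    chunks = sorted((k[1], v) for k, v in table.items() if k[0] == doc_id)
    merged = {}
    for _, part in chunks:
        merged.update(part)
    return merged

put_chunk("order-42", 0, {"customer": "ada"})
put_chunk("order-42", 1, {"items": ["widget"]})
print(get_document("order-42"))  # both chunks merged back into one object
```

With a real store, `put_chunk` and `get_document` would map to a put-item call and a query over the partition key; the reassembly logic stays the same.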
Have you thought about using GraphQL for this? It allows you to define how data is retrieved and could fit your needs for a more relational approach in JSON.
Consider using JSON Lines (JSONL). It's more of a handling technique than a format, but it's efficient for large datasets since you can stream one record at a time. It may struggle if individual entries get very large, though; you could think about chunking those bigger entries separately.
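To make the JSONL idea concrete, here's a small sketch: one JSON object per line on write, and a line-by-line parse on read, so you never need the whole dataset in memory at once (an in-memory buffer stands in for a file here):

```python
import json
import io

records = [
    {"id": 1, "name": "alpha"},
    {"id": 2, "name": "beta"},
]

# Write: one json.dumps() per record, newline-delimited.
buf = io.StringIO()
for rec in records:
    buf.write(json.dumps(rec) + "\n")

# Read: parse each line independently -- this is what makes JSONL
# stream-friendly for large datasets.
buf.seek(0)
restored = [json.loads(line) for line in buf if line.strip()]
print(restored == records)  # True
```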
Right, JSONL is a good start, but I still need a way to keep individual entries from getting too massive.
I think the concept you're after is somewhat similar to the map-reduce technique. If each JSON piece has a unique ID, you could combine them during a reduction step. But I’m not sure if there’s an existing distributed JSON standard out there. Maybe it’s a niche use case?
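The split-and-recombine idea above can be sketched in a few lines: the "map" step tags each piece with a shared document id, and the "reduce" step groups pieces by that id and merges them back. The function names and piece layout here are illustrative, not part of any standard:

```python
from collections import defaultdict

def split_json(doc_id, obj):
    """Map step: emit one piece per top-level key, tagged with a shared id."""
    return [{"doc_id": doc_id, "key": k, "value": v} for k, v in obj.items()]

def reassemble(pieces):
    """Reduce step: group pieces by doc_id and merge them back into objects."""
    docs = defaultdict(dict)
    for p in pieces:
        docs[p["doc_id"]][p["key"]] = p["value"]
    return dict(docs)

original = {"user": {"name": "ada"}, "tags": ["a", "b"], "active": True}
pieces = split_json("doc-1", original)
# The pieces could be sent to different nodes, then gathered again.
result = reassemble(pieces)
print(result["doc-1"] == original)  # True
```

Splitting on top-level keys is just one choice; the same pattern works for size-based chunks as long as each piece carries the shared id and enough information to order or place it.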
Yeah, I get that! I don't want to reinvent the wheel. I've noticed that large datasets often ship in JSONL format; I was just hoping for something more streamlined.
That's an interesting angle! I'll look into it. Thanks!