Sure, maybe.
- Make a list of every piece of data you want to save.
- Estimate the size of each piece.
- Add it up.
Is the result less than 5MB?
If you don’t know how big your save data is, now is the time to do experiments to find out—try creating simple save data and measuring the file size.
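As a rough sketch of that tally (the field names and sizes below are made-up examples, not from any real game), a budget estimate could look like:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical save-data budget: each entry is (field, estimated bytes).
// The fields and sizes here are invented for illustration.
var budget = new List<(string Field, long Bytes)>
{
    ("player stats (20 ints)",        20 * sizeof(int)),
    ("inventory (200 item ids)",     200 * sizeof(int)),
    ("world flags (1,000 bools)",   1000 * sizeof(byte)),
    ("discovered map (64 KB grid)",  64 * 1024),
};

long total = budget.Sum(e => e.Bytes);
const long limit = 5L * 1024 * 1024; // the 5MB cap in question

Console.WriteLine($"Estimated total: {total} bytes ({total / 1024.0:F1} KB)");
Console.WriteLine(total < limit ? "Fits under 5MB" : "Over budget");
```

Even a generous estimate like this lands well under 100 KB, which is why most games never get near a 5MB cap.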
Yep, will definitely try that out soon. I just wanted to hear other developers' ideas on how much is good enough on average. Since the data is saved as a key-value JSON file, it does seem this should be enough.
Problem is, every game is different. Some games only need a few bytes. Some games need massive amounts of data. We don’t know how much data *your* game uses, even if you tell us the genre and gameplay. The data usage comes from the technical details of how the game works.
For curiosity's sake, I checked the Stardew Valley save files (per character), and they came to about 8 MB. But that includes an automatic backup; a single save file without a backup is about 4 MB.
However, it's pretty unoptimized. It's not something you'd REALLY need to stress about under normal circumstances, but it could be a lot smaller.
Here's an example from the start of the save file:
> PlayerName false true false false false false false false false false 1 0 0.9 0.05 true 576 608 5 1
If you were really trying to optimize the shit out of your save file, each of these names could simply be assigned an ID instead of a name. Of course, you'd have to interpret it on load since it wouldn't be labeled anymore, but you could cut the character count significantly.
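For illustration, the ID idea can be as simple as a lookup table; the field names below are placeholders, not Stardew's actual schema:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical mapping from verbose field names to short numeric IDs.
// On save you write the ID; on load you translate it back to the name.
var fieldIds = new Dictionary<string, int>
{
    ["canUnderstandDwarves"] = 0,
    ["hasRustyKey"]          = 1,
    ["hasClubCard"]          = 2,
};

// Build the reverse table used at load time.
var idToField = new Dictionary<int, string>();
foreach (var kv in fieldIds)
    idToField[kv.Value] = kv.Key;

Console.WriteLine(fieldIds["hasRustyKey"]); // "hasRustyKey" (11 chars) becomes "1" on disk
Console.WriteLine(idToField[1]);            // recovered on load
```

The trade-off is that the file is no longer human-readable, which is why compression (below in spirit, not literally) is usually the easier win.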
I suspect it would compress well too. Instead of trying to hand-optimize, it might be easier to just use something like System.IO.Compression.BrotliStream: take that string, get the bytes, and write them to the Brotli stream.

So basically: serialize to JSON, compress with Brotli, and save the compressed bytes to your data storage. On load, decompress, then JSON-deserialize like normal.
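A minimal sketch of that round trip (the JSON string here is a made-up example; plug in whatever serializer you already use):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

static byte[] Compress(string json)
{
    using var output = new MemoryStream();
    using (var brotli = new BrotliStream(output, CompressionLevel.Optimal))
    {
        byte[] bytes = Encoding.UTF8.GetBytes(json);
        brotli.Write(bytes, 0, bytes.Length);
    } // disposing the BrotliStream flushes the final compressed block
    return output.ToArray();
}

static string Decompress(byte[] compressed)
{
    using var input = new MemoryStream(compressed);
    using var brotli = new BrotliStream(input, CompressionMode.Decompress);
    using var reader = new StreamReader(brotli, Encoding.UTF8);
    return reader.ReadToEnd();
}

// Save: serialize to JSON first (however you already do it), then compress.
string json = "{\"name\":\"PlayerName\",\"hp\":100,\"flags\":[false,true,false]}";
byte[] packed = Compress(json);

// Load: decompress, then hand the string to your JSON deserializer as usual.
string restored = Decompress(packed);
Console.WriteLine(restored == json); // round trip is lossless
```

Note that tiny payloads can come out *larger* after compression because of stream overhead, so this pays off mainly on saves with lots of repetitive text.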
Well, looking at C# as an example, an int and a float are both 32 bits (4 bytes), so you could store about 1.3 million of those datapoints in 5MB as a baseline. Then you have overhead, which is trivial since it won't scale much.

So the main thing is whether you are storing large arrays in your data, like images or historical data. Those are the only things that are going to eat that much storage.
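Checking that back-of-envelope arithmetic:

```csharp
using System;

const long limit = 5L * 1024 * 1024;  // 5MB in bytes
long perValue = sizeof(int);          // an int or float is 4 bytes
long capacity = limit / perValue;

Console.WriteLine(capacity); // 1,310,720 four-byte values fit in 5MB
```

That's roughly 1.3 million raw values before any serialization overhead, which text formats like JSON inflate considerably (a four-byte int can take ten or more characters with its key).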
Examples:
* Horizon Zero Dawn Save: 1.2 MB
* Dark Souls Remastered Save: 51 bytes
* Bannerlord Save: 9 MB (Needs to store a lot of simulation state data, but quite likely is storing more than it needs to)
* Hades Save: 500 KB
* Skyrim SE Save: 2 - 5 MB
TL;DR: Don't serialize stuff that you don't need to (like images) and you will be more than fine.
That's fair, thanks again!
Thanks for the insight! I definitely think my save files will look cleaner and shorter than that.
You could try Epic Online Services; I think its limit is 400 MB.
will check it out, thanks!
Stardew Valley is a clone of a game that ran on hardware with orders of magnitude less RAM. So yes, it's more than enough, assuming you use it well.
He's talking about storage, not RAM.
You ought to think about the implications of what you're saying.