Hey everyone! I'm relatively new to programming, so I'd appreciate your patience. I have two integers: *minuteOfDay* and *dayCount*. *minuteOfDay* increments every real second and resets to zero once it reaches 1440, which in turn increments *dayCount*. From these two I derive other values, such as *hourOfDayCount* and *yearCount*.
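Roughly, my update looks like this (a simplified sketch; the hour and year formulas below are just illustrative, assuming 60-minute hours and 365-day years):

```java
// Simplified sketch of my current setup: tick() runs once per real second.
class GameClock {
    int minuteOfDay = 0;
    int dayCount = 0;

    void tick() {
        minuteOfDay++;
        if (minuteOfDay >= 1440) { // 1440 game minutes per day
            minuteOfDay = 0;       // reset to the start of a new day
            dayCount++;
        }
    }

    // Derived values (illustrative formulas only).
    int hourOfDayCount() { return minuteOfDay / 60; }
    int yearCount()      { return dayCount / 365; }
}
```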
Over time, *dayCount* will grow enormously, since players might keep a world running for a very long time, potentially thousands of in-game years. My question is: is there anything to be concerned about with a continuously increasing integer, especially regarding accuracy? Do integers lose precision the way floats do when they get very large? Should I restructure my logic to head off any potential issues? Thanks for any insights!
1 Answer
Integers won't lose precision as they grow; unlike floats, they stay exact right up until they overflow their maximum value. Just double-check your day-counting logic for off-by-one errors: a day has 1440 minutes, so *minuteOfDay* should cover exactly 1440 distinct values (0 through 1439, or 1 through 1440), and if the reset skips or repeats a value, your clock drifts by a minute every day. You might also consider keeping a single counter of total minutes, if that's feasible, and deriving everything else from it; that means less state to keep in sync and fewer places for bugs to hide. For regular game minutes, an unsigned 32-bit counter covers roughly 8,000 years of real time if it ticks once per real minute, but only about 136 years if it ticks once per real second (halve both figures for a signed 32-bit int). If you want peace of mind, a 64-bit integer gives you an effectively inexhaustible range.
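For example, a single running counter could look something like this (a sketch rather than drop-in code; it assumes 1440 minutes per day and 365-day years, and uses a 64-bit `long` for the peace-of-mind option):

```java
// Single-counter sketch: one ever-increasing total, everything else derived on demand.
class GameClock {
    long totalMinutes = 0; // 64-bit: will not overflow at any realistic tick rate

    void tick() {          // call once per game minute (one real second in your setup)
        totalMinutes++;
    }

    int  minuteOfDay() { return (int) (totalMinutes % 1440); }
    long dayCount()    { return totalMinutes / 1440; }
    int  hourOfDay()   { return (int) ((totalMinutes % 1440) / 60); }
    long yearCount()   { return totalMinutes / (1440L * 365); }
}
```

The nice side effect is that the wrap-around logic disappears entirely: nothing ever resets, so there is no reset for an off-by-one error to hide in.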
So, are you suggesting I ditch *dayCount* entirely and just let the minute counter keep climbing? That does seem like it would cut down on potential failure points. My one concern is that it might complicate my updates, since I currently run some of them just once per day.
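I suppose I could keep the daily update by deriving the day from the total and only running it when that value changes? Something like this, if I'm picturing your suggestion right (`runDailyUpdate` just stands in for my existing once-a-day logic):

```java
// Rough idea: fire the once-a-day work only when the derived day changes.
class DailyUpdater {
    long lastProcessedDay = 0;

    void onTick(long totalMinutes) {
        long currentDay = totalMinutes / 1440; // derived day, no stored dayCount needed
        if (currentDay != lastProcessedDay) {
            lastProcessedDay = currentDay;
            runDailyUpdate();
        }
    }

    void runDailyUpdate() {
        // existing once-per-day logic goes here
    }
}
```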