Hello everyone,
I came across some strange behaviour and I'm not sure whether it is a bug or whether I'm misunderstanding something. If this topic has already been discussed somewhere, please point me to the link.
I created a very simple simulation: two Earths at rest, placed really close to each other, with their surfaces only a few km apart. They fall onto each other, melt and form a final overheated body. However, the final mass (and even the temperature and radius) of that body varies significantly with the time step used, and I think the variation goes beyond ordinary accuracy divergence. For example:
- 10 min/sec: final mass is around 1.65 x Earth.
- 1 min/sec: final mass is around 1.14 x Earth.
- 1 sec/sec: final mass is around 1.03 x Earth (?!)
Automatic tolerance was used in all cases.
This is strange; I believe the final mass should be around 2 x Earth, or probably a little less. I would also expect 1 sec/sec to give the best possible results, but it looks like the opposite is true. Collisions of other planets behave the same way.
Also, the collision at a higher time step is much more violent, with many debris fragments flying away, while at 1 sec/sec it is very calm and every single piece falls back onto the final body. It looks as if a higher time step leads to higher body velocities being used in the simulation computations. I understand that different time steps should give more or less the same (or similar) results, allowing for different accuracy errors of course, but smaller steps should still be more accurate.
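To show what I mean by "higher velocity values", here is a toy 1-D sketch in plain Python. It is not taken from Universe Sandbox's actual engine; the semi-implicit Euler integrator, the 5 km starting gap and the simple "surfaces touch" contact test are just my assumptions. It only illustrates how a large step can let the two bodies tunnel deep into overlap before contact is even detected, with a much bigger closing speed than a small step would give:

```python
# Toy 1-D model of the two-Earth drop. NOT Universe Sandbox's real
# integrator -- just a sketch of how the step size changes the state
# that a collision handler would see. All numbers are assumptions.
G = 6.674e-11          # gravitational constant [m^3 kg^-1 s^-2]
M = 5.972e24           # mass of each body [kg]
R = 6.371e6            # radius of each body [m]
GAP = 5_000.0          # assumed initial surface-to-surface gap [m]

def drop(dt):
    """Integrate the relative motion with semi-implicit Euler until the
    surfaces touch; return (closing speed, overlap depth) at the step
    where contact is first detected."""
    r = 2 * R + GAP            # centre-to-centre separation
    v = 0.0                    # relative closing speed
    while r > 2 * R:
        a = G * 2 * M / r**2   # relative acceleration at current separation
        v += a * dt
        r -= v * dt
    return v, 2 * R - r

for dt in (600.0, 60.0, 1.0):  # 10 min, 1 min, 1 s steps
    v, overlap = drop(dt)
    print(f"dt = {dt:6.0f} s: contact speed = {v:7.1f} m/s, "
          f"overlap at detection = {overlap / 1000:8.1f} km")
```

In this toy model, the 600 s step detects contact with a closing speed of roughly 2900 m/s and the bodies already overlapping by well over a thousand km, while the 1 s step gives only about 220 m/s and a negligible overlap. That obviously isn't the real physics engine, but it would at least be consistent with the higher-time-step collisions looking so much more energetic in my simulation.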
Please explain, thank you!