It’s no secret that systems generally get slower as more data is placed inside them. Windows and Android, for example, are notorious for slowing down after a few years - although a factory reset usually resolves the performance problems. Filesystems are also known for slowing down significantly when they hold large amounts of data; just look at the disaster that is the Android FUSE filesystem.
My question is: how will Anytype’s performance be affected as the amount of data (both total size and number of objects) increases? Have any internal tests been conducted to see how much data Anytype can hold before significant speed deterioration sets in?
Specifically, I’m very concerned about how the speed at which Sets open will be affected as tens of thousands of objects accumulate inside Anytype. With just ~1,000 objects, I already experience a two- to three-second wait for a Set to load the first time after startup. (Subsequent loads are much faster.) How much will this deteriorate once many more objects have accumulated, especially considering that every individual object must be filtered and sorted when a Set is opened?
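To make the concern concrete, here’s a minimal sketch of why a filter-and-sort on every Set open would scale with the total object count. This is a generic illustration, not Anytype’s actual code - `open_set` and `make_objects` are hypothetical names, and the assumption is simply a linear scan plus a sort of the matches:

```python
import random
import time

def open_set(objects, predicate, sort_key):
    """Simulate opening a Set: filter all objects, then sort the matches.

    Cost is roughly O(n) for the filter over n objects,
    plus O(k log k) to sort the k matches.
    """
    matches = [o for o in objects if predicate(o)]
    matches.sort(key=sort_key)
    return matches

def make_objects(n, seed=0):
    """Build n fake objects with a type tag and a modification timestamp."""
    rng = random.Random(seed)
    return [{"type": rng.choice(["note", "task", "page"]),
             "modified": rng.random()} for _ in range(n)]

for n in (1_000, 10_000, 100_000):
    objs = make_objects(n)
    start = time.perf_counter()
    result = open_set(objs, lambda o: o["type"] == "note",
                      lambda o: o["modified"])
    elapsed = time.perf_counter() - start
    print(f"n={n:>7}: {len(result)} matches in {elapsed * 1000:.1f} ms")
```

If this guess at the mechanism is right, an index over common filter/sort fields would flatten the curve considerably - which is exactly the kind of internal detail I’m hoping someone can speak to.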
I’m also very concerned about how Sync will scale with lots of objects - after all, Anytype’s sync isn’t exactly known for being fast.
Another area of concern is how cruft and bloat might pile up in the internal Anytype database. The app is somewhat known for using large amounts of disk space, likely because it keeps each and every change ever made to an object. Object deletion also isn’t a total guarantee of complete data removal, as seen with deleted Types still showing up in the Type selection menu.
I’d really appreciate any insight into this topic.