The ones with the fastest query performance tend to have slow performance when updating the index.
I haven’t investigated whether or not any of these would work well on SAFE.
The constraints on the SAFE Network are a bit different from the typical scenario. If the index is large, it has to be split into many parts, and the relevant parts downloaded to the client before the query is executed on the client side. Normally, joining parts of a large distributed index is done on servers with high-bandwidth, low-latency connections between them in a single data center.
With an index for up to maybe a few thousand pages there shouldn’t be much of an issue, but once you get over that the index has to be split up in some way.
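One simple way to split an index up is by term prefix, so each part only holds the terms starting with a given prefix. This is just a rough sketch of the idea, not any actual SAFE API; the function and prefix length are made up for illustration:

```python
from collections import defaultdict

def build_shards(inverted_index, prefix_len=2):
    """Split an inverted index into parts keyed by term prefix.

    Each part could then be stored at its own address on the
    network; a client only needs to fetch the parts whose
    prefixes match its query terms."""
    shards = defaultdict(dict)
    for term, postings in inverted_index.items():
        shards[term[:prefix_len]][term] = postings
    return dict(shards)

# A tiny toy index: term -> list of page ids containing it.
index = {
    "safe": [1, 5],
    "search": [2, 5],
    "network": [1, 2, 3],
}
shards = build_shards(index)
# Three parts here: "sa", "se" and "ne".
```

A query for "safe network" would then only need the "sa" and "ne" parts, not the whole index.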
A good way to experiment with creating a large index would be to build an index for Wikipedia.
Each time you fetch a part of the index, you have to make the network look up an NRS or XOR address and fetch the data there. This introduces latency, so you don't want to do it too many times. The part of the index that is to be queried also has to be downloaded to the client, so you don't want it to be too big either.
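The query flow could look something like this. The network is simulated with a plain dict standing in for NRS/XOR lookups, and all the names are invented for the sketch; the point is that you deduplicate lookups per prefix (each fetch costs a round trip) and intersect postings on the client:

```python
# Simulated network: address -> index part. A real client would
# resolve an NRS or XOR address here instead.
NETWORK = {
    "myindex/sa": {"safe": [1, 5]},
    "myindex/ne": {"network": [1, 2, 3]},
}

def fetch(address):
    # One simulated network round trip (this is where the
    # latency cost would be on the real network).
    return NETWORK.get(address, {})

def query(terms, prefix_len=2):
    """Fetch only the index parts needed for the query terms,
    then intersect the postings lists on the client side."""
    needed = {t[:prefix_len] for t in terms}  # dedupe lookups
    shard_data = {}
    for prefix in sorted(needed):
        shard_data.update(fetch(f"myindex/{prefix}"))
    results = None
    for t in terms:
        postings = set(shard_data.get(t, []))
        results = postings if results is None else results & postings
    return sorted(results or [])

print(query(["safe", "network"]))  # pages containing both terms
```

With a two-character prefix a two-term query costs at most two fetches, however big the full index is; tuning the prefix length trades the number of fetches against the size of each part.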
There have also been proposals for search using semantic hashing; search the forum for semantic hashing and you'll find some threads on that.