For the moment they have gone away from sacrificial chunks as a method, but I am sure they will either have to implement some form of something similar or come back to it.
I seem to remember reading that a new vault will be told to fill all of its offered space with specific data, then answer a crypto challenge where it has to retrieve some or all of that data and hash it in a certain way to prove it really has that amount of space. This at least shows the vault can initially store that amount and isn't overstating things, which would be important for determining available space. Then the vault is penalised as normal if it cannot store that amount. Remember, a vault is not paid according to its reported size but only when it retrieves chunks.
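Purely to illustrate the shape of what I mean, here is a minimal sketch of the vault (prover) side in Python. Everything in it is my own assumption — the names, the 1 MiB block size, the SHA3 construction, the file layout — none of it is the actual implementation:

```python
import hashlib
import os

BLOCK_SIZE = 1024 * 1024  # 1 MiB per filler block (assumed, not from any spec)

def derive_block(seed: bytes, index: int) -> bytes:
    """Deterministically expand (seed, index) into one block of filler data,
    so the network can recompute any block without storing it."""
    block = b""
    counter = 0
    while len(block) < BLOCK_SIZE:
        block += hashlib.sha3_256(
            seed + index.to_bytes(8, "big") + counter.to_bytes(8, "big")
        ).digest()
        counter += 1
    return block[:BLOCK_SIZE]

class Vault:
    def __init__(self, claimed_blocks: int, storage_dir: str):
        self.claimed_blocks = claimed_blocks
        self.storage_dir = storage_dir
        os.makedirs(storage_dir, exist_ok=True)

    def fill(self, seed: bytes) -> None:
        """Fill the claimed space with data derived from the network's seed."""
        for i in range(self.claimed_blocks):
            path = os.path.join(self.storage_dir, f"block_{i}")
            with open(path, "wb") as f:
                f.write(derive_block(seed, i))

    def respond(self, nonce: bytes, indices: list[int]) -> bytes:
        """Hash the requested blocks together with a fresh nonce.
        The nonce stops the vault precomputing the answer and then
        deleting the data."""
        h = hashlib.sha3_256(nonce)
        for i in indices:
            path = os.path.join(self.storage_dir, f"block_{i}")
            with open(path, "rb") as f:
                h.update(f.read())
        return h.digest()
```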
Of course you can adjust the size, but it probably requires a vault reset and losing some of its node "age". In effect this gives what you suggest (maybe over a longer period). This works very much against quick changes.
Then again, you could simply spin up a second vault when you want to increase your space significantly, instead of growing the first one. Potentially you could earn more that way too.
But if your vault is nowhere near full, then increasing its space is not going to benefit you until the original space is used up.
I agree that sacrificial chunks were an easy way to do it. I guess they are trying to get away from the extra network traffic/work of storing those extra chunks, and replace it with a vault challenge that determines the vault's maximum space. That way a simple command has the vault fill itself with data calculated from the command, run a crypto algo over that data and return the result, proving the total usable space of the vault. If the claim is later found to be false, that vault is deemed bad and not used.
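And the matching verifier side, under the same assumptions (it reuses `derive_block` and `Vault` from the sketch above): because every block is derivable from the seed, the network can recompute any block on demand and spot-check a few random indices rather than keeping its own copy of the fill:

```python
import hashlib
import secrets

def spot_check(vault: Vault, seed: bytes, claimed_blocks: int,
               samples: int = 8) -> bool:
    """Challenge the vault on a few random blocks; a mismatch means the
    vault lied about its space and should be marked bad."""
    nonce = secrets.token_bytes(32)
    indices = [secrets.randbelow(claimed_blocks) for _ in range(samples)]
    expected = hashlib.sha3_256(nonce)
    for i in indices:
        expected.update(derive_block(seed, i))  # recompute, don't store
    return vault.respond(nonce, indices) == expected.digest()

# Hypothetical usage: a tiny 4-block vault passing its initial check.
vault = Vault(claimed_blocks=4, storage_dir="/tmp/vault_demo")
seed = secrets.token_bytes(32)
vault.fill(seed)
assert spot_check(vault, seed, claimed_blocks=4)
```

One obvious weakness of this naive version is that a cheating vault could just recompute the blocks on demand instead of storing them, since the derivation here is cheap; real proof-of-space schemes make the fill expensive or interdependent to regenerate. The sketch only illustrates the shape of the challenge, not a secure design.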
NOW the question is "What will be used in the end?" I'd say sacrificial chunks have a very good chance of "winning" out.
EDIT 2: I cannot find anything about what I said above, so feel free to take it as fantasy. The RFC still says sacrificial chunks, and I did find an old comment by David from last year about using data chains for this, but I cannot see how that would actually solve the free-space determination.