I have never seen anything in the documentation that explicitly says what this option does. Nevertheless, I am fairly sure it is the latter, i.e. it builds each tree on a 1/9 subsample of the negatives. Although both approaches should have roughly the same effect if the data is good, subsampling the negatives is a convenient modeling choice, because it makes cross-validation easier: you now have 9 training sets that you can test against each other.
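As a rough illustration (not tied to any particular library), here is a minimal sketch of how such subsampling could work: with a 90/10 negative/positive split, the negatives are divided into 9 disjoint folds, and each fold is paired with all of the positives to form one balanced training set. The data shapes and fold count here are assumptions matching the 90/10 example.

```python
import numpy as np

# Hypothetical 90/10 imbalanced dataset: 100 positives, 900 negatives.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
y = np.concatenate([np.ones(100), np.zeros(900)]).astype(int)

pos_idx = np.where(y == 1)[0]
neg_idx = rng.permutation(np.where(y == 0)[0])

# Split the shuffled negatives into 9 disjoint folds of 100 each,
# and pair each fold with all the positives: 9 balanced training sets.
training_sets = [
    np.concatenate([pos_idx, fold])
    for fold in np.array_split(neg_idx, 9)
]
```

Each of the resulting index sets selects a 50/50 training set, and because the negative folds are disjoint, the sets can be cross-validated against one another as described above.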
As a side note, I would not consider a 90/10 split all that unbalanced. It is much milder than many real-world situations, and there is some debate about whether rebalancing always helps.