You'd be complicating a lot of things: statistics collection, crawler management, HTML5 local storage, XSS restrictions, cross-frame communication, virtual host configuration, third-party advertising, and integration with remote APIs such as Google Maps.
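To give one concrete example of why: pages on different subdomains are different origins, so frames can't touch each other's DOM or localStorage directly and have to exchange messages instead. A minimal sketch of what that looks like (the host names here are hypothetical):

```ts
// On a page at https://ferrari.site.com that embeds an iframe
// from https://porsche.site.com: you can't reach into the frame,
// you can only post a message to it.
const frame = document.querySelector("iframe")!;
frame.contentWindow?.postMessage({ type: "ping" }, "https://porsche.site.com");

// Inside the framed page on https://porsche.site.com:
window.addEventListener("message", (event: MessageEvent) => {
  // Always verify the sender's origin before trusting the message.
  if (event.origin !== "https://ferrari.site.com") return;
  console.log("received:", event.data);
});
```

Every one of those cross-subdomain channels is extra plumbing you don't need on a single host.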
Not that these problems can't be solved; it's just that the added complexity means more work, and the benefits may not be enough to compensate.
I should add that I went down this road once, for a classified-ads site, adding domains like porsche.site.com and ferrari.site.com in the hope of ranking better for those keywords. In the end I saw no noticeable improvement, and worse, Google crawled the entire site through each subdomain, which meant a search for Ferraris could return porsche.site.com/ferraris instead of ferrari.site.com/ferraris. In short, Google treated every subdomain as a duplicate of the site, yet it still crawled every one on every visit.
Again, there were workarounds (one is sketched below), but I opted for simplicity, and I don't regret it.
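If you do keep the subdomains, one workaround for the duplicate-content problem is to collapse every subdomain back onto a single canonical host with a 301 redirect, so crawlers only ever see one copy of each page. A minimal sketch, assuming an Express-style Node server; the host names are hypothetical:

```ts
import express, { Request, Response, NextFunction } from "express";

const CANONICAL_HOST = "site.com"; // hypothetical canonical domain

const app = express();

app.use((req: Request, res: Response, next: NextFunction) => {
  const host = req.headers.host ?? "";
  // e.g. a request to ferrari.site.com/ferraris gets a permanent
  // redirect to site.com/ferraris, preserving the path and query.
  if (host !== CANONICAL_HOST && host.endsWith("." + CANONICAL_HOST)) {
    return res.redirect(301, `https://${CANONICAL_HOST}${req.originalUrl}`);
  }
  next();
});

app.get("/ferraris", (_req: Request, res: Response) => {
  res.send("Ferrari listings");
});

app.listen(3000);
```

Alternatively, a rel="canonical" link in each page's head tells search engines which URL is authoritative without redirecting users, if you want the subdomains to stay browsable.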