YMMV is the perennial problem. My parents' house has fairly good coverage at 3 stories and around 4,000 sq ft total from a single AirPort Extreme. There is an AirPort Express in the basement to cover the one small area that is more or less a dead spot on both bands. That is it. 5GHz of course doesn't cover the whole thing, but 2.4GHz covers almost the whole house really well. New house, large rooms, fairly open floor plan.
That said, my 2,500 sq ft single-story + basement house has at best mediocre coverage with a single router, so I run a router and an AP to cover the whole thing really well on both 2.4 and 5GHz. Old house, not an open floor plan, plus things like a big masonry chimney "blocking" things up in the microwave spectrum.
Having done some pretty extensive testing with routers/APs from the best to the worst, the difference in actual coverage is relatively minor. The difference in SPEED at various locations can be pretty large from one to another, though, even with the same number of streams and the same standard (i.e. 11n), so in a sense that does change how decently things are covered. I did recently move my Archer C8 to the rough center of my house to check, and it does actually cover the entire house pretty well on 2.4GHz, which my WDR3600 wasn't able to do, with roughly 50% better throughput at the edges; that pushes some spots from marginal to decent performance. The C8 + WDR3600 combo still covers the entire house much better, though.

Some numbers, all with a single-stream 11n client running at 40MHz on 2.4GHz: with the WDR3600 semi-centrally located (which is a CRAP spot to try to locate it for a number of reasons), the worst spots were around 2.5MB/sec. With the C8 in the same spot, the worst spots are around 4MB/sec. With the two-basestation setup spread across the house, the worst spot is 9MB/sec. My laptop went from about 5MB/sec at its worst spot with the WDR3600 to 8.5MB/sec with the C8, and roughly 16.5MB/sec with the two-basestation setup.
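For context, here's a quick back-of-the-envelope sketch (Python) converting those worst-spot figures to Mbit/sec and comparing them against the nominal link rate. The 150Mbit ceiling is my assumption of the PHY rate for a single-stream 11n client at 40MHz with a short guard interval; real goodput tops out well below PHY rate due to protocol overhead, so the percentages are only indicative.

```python
# Convert the measured worst-spot throughput figures (MB/sec, from the
# post above) to Mbit/sec and show them as a fraction of the assumed
# 150 Mbit/sec PHY ceiling (MCS7, 1 stream, 40MHz, short GI).

PHY_CEILING_MBPS = 150.0

worst_spot_mb_per_sec = {
    "WDR3600, semi-central": 2.5,
    "Archer C8, semi-central": 4.0,
    "C8 + WDR3600 (two basestations)": 9.0,
}

for setup, mb_s in worst_spot_mb_per_sec.items():
    mbps = mb_s * 8  # 1 MB/sec = 8 Mbit/sec
    print(f"{setup}: {mb_s} MB/sec = {mbps:.0f} Mbit/sec "
          f"(~{mbps / PHY_CEILING_MBPS:.0%} of the 150 Mbit/sec PHY rate)")
```

Even the best setup's worst spot lands around half the nominal link rate, which is roughly what you'd expect at the edge of 2.4GHz coverage.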