I hooked up my 2011-vintage Apple AirPort Extreme in a room that was seeing an almost 50% drop in WiFi speed, despite also having Cox's silver-bullet Panoramic "whole house" gateway. I get that distance is a for-sure signal killer, and in this scenario the deficient room is the farthest from the gateway. The Extreme is fed via Ethernet into its Internet (WAN) port (we had the whole house centrally re-wired with new coax and Cat5e after last year's flood). A Cat5e cable out of one of the Extreme's LAN ports feeds my Roku Ultra, and the Extreme also provides a reliable wireless access point in that room.
While using a couple of WiFi scanner apps I couldn't help but notice the differences in signal strength when switching from the AirPort Extreme's channel to the 2.4 GHz band and then to the 5 GHz band. Again, I'm aware of the relative strengths and weaknesses of 2.4 and 5 GHz, but what really puzzled me was the huge difference in download speeds on online speed-test sites (all Flash- and Java-free: Ookla, Cox's own, and fast.com).
A cursory Google search didn't prove all that enlightening, though on one site there was a rather technical discussion of "how can a stronger wireless signal have a lower max rate?", which noted that WiFi signal strength is tricky to express. The most accurate unit is milliwatts (mW), but because of WiFi's super-low transmit power you end up with tons of decimal places, making it difficult to read. For example, -40 dBm is 0.0001 mW, and the zeros only multiply as the signal strength drops. Way more than this feeble brain could comprehend.
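(For anyone else trying to follow that discussion: dBm is just milliwatts on a logarithmic scale, mW = 10^(dBm/10). A quick sketch in plain Python, nothing router-specific assumed, shows why the raw mW numbers get unreadable:)

```python
def dbm_to_mw(dbm):
    """Convert a dBm reading to milliwatts: mW = 10^(dBm/10)."""
    return 10 ** (dbm / 10)

# Typical WiFi received-signal strengths
for dbm in (-40, -60, -80):
    print(f"{dbm} dBm = {dbm_to_mw(dbm):.8f} mW")
# -40 dBm = 0.00010000 mW
# -60 dBm = 0.00000100 mW
# -80 dBm = 0.00000001 mW
```

Every 10 dB drop divides the power by 10, which is why a "strong" -40 dBm and a "weak" -80 dBm differ by a factor of 10,000 even though the dBm numbers look close.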
I also noticed that when performing a speed test on the Roku (using one of their free speed-test channels), the highest download speeds came not from the wired connection but from WiFi.
Would someone care to 'splain any of this to me like I'm 5?