What's the speed? Do you have a shitty 10 Mbps connection like my parents? Then WiFi, because you're easily saturating that line either way.
Do you have gigabit? Then Ethernet, but then again, getting like 600 Mbps wirelessly is good enough (quick math in the sketch below).
Biggest thing is having GOOD coverage. My house has multiple access points so that my connection is great everywhere. Leaving the shitty ISP router shoved in a basement cupboard makes no sense lol.
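Quick back-of-envelope on that (file size and link speeds are made up):

```python
# Your real throughput is capped by the slowest hop in the chain.
# All numbers here are assumptions for illustration, not measurements.

def transfer_time_s(file_mb: float, wan_mbps: float, lan_mbps: float) -> float:
    """Seconds to move a file when the WAN link and the LAN hop are in series."""
    return (file_mb * 8) / min(wan_mbps, lan_mbps)

file_mb = 1000  # a 1 GB download

# 10 Mbps WAN: WiFi vs Ethernet makes zero difference, the line is the cap.
print(transfer_time_s(file_mb, wan_mbps=10, lan_mbps=600))    # 800.0 s
print(transfer_time_s(file_mb, wan_mbps=10, lan_mbps=1000))   # 800.0 s

# Gigabit WAN: Ethernet wins, but 600 Mbps WiFi is in the same ballpark.
print(transfer_time_s(file_mb, wan_mbps=1000, lan_mbps=600))  # ~13.3 s
print(transfer_time_s(file_mb, wan_mbps=1000, lan_mbps=1000)) # 8.0 s
```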
> Do you have a shitty 10 Mbps connection like my parents? Then WiFi, because you're easily saturating that line either way.
Only if latency doesn't matter. WiFi has a lot more jitter, whether your WAN connection is 10 or 1000 Mbps.
Packet loss, really, and the latency and jitter that loss contributes to.
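A rough way to put numbers on that: a minimal sketch summarizing loss and jitter from a list of RTT samples (the values below are invented; in practice you'd collect them with ping):

```python
# Minimal sketch: loss and jitter from RTT samples. None marks a lost probe.
# The sample data below is made up for illustration.

def summarize(rtts_ms: list) -> dict:
    received = [r for r in rtts_ms if r is not None]
    loss_pct = (1 - len(received) / len(rtts_ms)) * 100
    mean_ms = sum(received) / len(received)
    # Jitter as the mean absolute difference between consecutive RTTs
    # (same spirit as the RFC 3550 estimator, minus the smoothing).
    diffs = [abs(b - a) for a, b in zip(received, received[1:])]
    jitter_ms = sum(diffs) / len(diffs)
    return {"loss_pct": loss_pct, "mean_ms": mean_ms, "jitter_ms": jitter_ms}

wired = [1.1, 1.0, 1.2, 1.1, 1.0, 1.1, 1.2, 1.0]
wifi  = [2.5, 3.1, 2.4, None, 9.8, 2.6, None, 31.0]  # spikes = retransmits

print("ethernet:", summarize(wired))
print("wifi:    ", summarize(wifi))
```

The wired line sits at ~0.1 ms jitter and zero loss; the WiFi samples show 25% loss and jitter nearly two orders of magnitude higher, which is exactly the pattern a lossy radio link produces.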
Radio waves through air travel at essentially the speed of light, faster than a signal propagates through a medium like copper. Not that it matters at such a small scale, but it's helpful to have a good picture of the elements at work here. The further you are from the receiving point, the more obstacles (matter) there are to obstruct the signal. But in ideal conditions WiFi is better than most people think. Replicating those ideal conditions, though…
> Radio waves through air travel at essentially the speed of light, faster than a signal propagates through a medium like copper.
Except that copper Ethernet is baseband, so it's not radio waves. WiFi is still faster than copper AFAIK, at least for signalling (there was a huge debate about this between YouTubers not that long ago), but the difference is smaller than you think. Light (which is EM, the same kind of wave as radio/WiFi) through glass travels at about 2/3 of c (aka the speed of light), so fiber is actually slower than Ethernet or WiFi in propagation delay.

However, WiFi must use CSMA/CA as well as other tricks to ensure it doesn't step on itself, and that it doesn't step on other sources of radio interference (microwave ovens, wireless controllers like the Xbox's, Bluetooth, Zigbee, etc. on 2.4 GHz, and things like radar on 5 GHz). It's half-duplex, so only one station can transmit at a time, hence CSMA/CA being required, whereas Ethernet doesn't need any collision avoidance or detection except in the rare case of 10/100 half duplex; all gigabit is full duplex. Half duplex on wireline networks is basically eliminated at this point, so it's little more than a footnote.
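For a sense of scale, here's a quick sketch of pure propagation delay over a typical in-home run. The velocity factors are ballpark figures, not exact specs:

```python
# Propagation delay only: how long the signal itself takes to cross the link.
# Velocity factors below are typical ballpark values, not exact cable specs.

C = 299_792_458  # speed of light in a vacuum, m/s

media = {
    "WiFi (air, ~1.0c)":            1.00,
    "Copper twisted pair (~0.65c)": 0.65,
    "Fiber (glass, ~0.67c)":        0.67,
}

distance_m = 30  # assumed in-home cable/air run

for name, vf in media.items():
    delay_ns = distance_m / (C * vf) * 1e9
    print(f"{name:31s} {delay_ns:6.1f} ns over {distance_m} m")
```

All three land within ~60 ns of each other, which is why the propagation difference is a curiosity rather than something you can feel.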
Factoring all this in, when it comes to actually getting data down the line, WiFi loses in almost every case because of all the overhead it has to take on.
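Where the real cost shows up is medium access. Here's a rough simulation of the DIFS-plus-random-backoff a DCF-style station pays before it can even start transmitting (timings are approximate 5 GHz OFDM values; real behavior varies by standard and config):

```python
import random

# Rough model of the per-frame delay 802.11 DCF-style access adds before a
# station may even start transmitting. Ballpark timings, for illustration only.

SLOT_US = 9     # slot time
DIFS_US = 34    # idle wait before contending
CW_MIN  = 15    # initial contention window
CW_MAX  = 1023  # window cap after repeated retries

def access_delay_us(retries: int = 0) -> float:
    """DIFS plus a random backoff; the window doubles on each retry."""
    cw = min((CW_MIN + 1) * (2 ** retries) - 1, CW_MAX)
    return DIFS_US + random.randint(0, cw) * SLOT_US

samples = [access_delay_us() for _ in range(100_000)]
print(f"mean first-try access delay: {sum(samples) / len(samples):.0f} us")
# ~100 us on average -- roughly 1000x the ~0.1 us propagation time from the
# previous sketch, and that's before any collision or retransmission.
```

Ethernet on a full-duplex switch port pays none of this; it just transmits.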
What crap are you doing that's so intensive that WiFi latency matters? It's an essentially negligible difference unless you're saturating the signal. We're talking less than 3 ms for a reliable round trip.
There are lots of factors that can cause jitter on WiFi, and it's mostly outside of your control if you live somewhere more densely populated. My apartment randomly gets a lot of noise, and as a result my WiFi starts to see unacceptable amounts of packet loss and jitter. It doesn't happen often enough to motivate the effort of going around with a signal analyzer, but still…
WiFi has constant retransmissions, and they add perceptible latency: the checksum check, the turnaround, and re-sending the frame all take a lot of time compared to the speed of light through air across 3 meters.
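Scale check, with assumed (not measured) timings for one recovery cycle:

```python
# Propagation across 3 m is nanoseconds; recovering one lost frame is
# hundreds of microseconds. All per-step timings below are assumptions.

C = 299_792_458  # speed of light, m/s

prop_ns = 3 / C * 1e9  # ~10 ns across the room

# One retry, roughly: wait out the ACK timeout, back off, resend the frame.
ack_timeout_us = 50    # assumed ACK timeout
backoff_us     = 170   # assumed backoff with a doubled contention window
resend_us      = 250   # assumed airtime for a ~1500-byte frame plus its ACK

retry_us = ack_timeout_us + backoff_us + resend_us
print(f"propagation over 3 m: {prop_ns:.0f} ns")
print(f"one retransmission:   {retry_us} us (~{retry_us * 1000 / prop_ns:.0f}x longer)")
```

So a single retransmission costs tens of thousands of times more than the flight time of the signal itself.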