The triangle of compromise:
Speed/Power
Bandwidth
Range
You can't have all 3. Just like manufacturing.
To be fair, most WiFi is used within homes or businesses these days, so I would simply sacrifice range, as long as the minimum range is reasonable.
5G mmWave can be blocked by paper, ffs; range doesn't matter if a leaf can block the line of sight. Idk why we can't use the low-bandwidth, long-range 900-1200 MHz bands and just use an array of antennas sending out multiple channels to increase bandwidth. I'd prefer range over bandwidth I won't utilize.
Tried to fact-check this but I can't find evidence that 5G can be blocked by paper. Looks like it's in the 24-28 GHz range, and while it can be blocked by materials, the density matters. So maybe a stack of paper a few books thick, but not one sheet?
Note that this demo takes things over 100 GHz, so the challenges associated with mmWave (and WiGig) are even greater.
Good point thx
Was being hyperbolic, m8. The human body will block mmWave.
yeah but this wifi you can only use in one room …
I would use this for streaming games from a wired PC to a device that’s wireless. Not having to run a wire is magical.
I mean, no kidding. There are any number of use cases for getting rid of wires. Hell, I'd use it to connect my PC to the monitors, if I could, and clean up the cable mess. But streaming from the home media server to a TV? No-brainer. Also, even if the single-room comment is accurate, daisy-chain. The only real showstopper would be if it were line-of-sight.
Paper: https://par.nsf.gov/servlets/purl/10584545
40GHz bandwidth LOL
I genuinely want to understand why that's funny. Is it unachievable for consumer electronics or…?
Well it’s a couple of things.
First off, a wireless transmission speed of 120 Gbps sounds really impressive, but remember from the Shannon-Hartley theorem that the maximum channel capacity is just a function of bandwidth and SNR. This means you can get an arbitrarily high transmission speed by increasing the bandwidth to an obscene amount and/or by increasing the SNR (by transmitting at an obscenely high power).
In the paper they say that the transmit power was 15 dBm which is a normal transmit power for WiFi, so it’s the 40GHz bandwidth that’s doing the heavy lifting in allowing that data rate.
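That claim is easy to sanity-check with the Shannon-Hartley formula; here's a minimal sketch (the 40 GHz bandwidth figure is from the paper, the SNR is just solved for to illustrate):

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley: maximum channel capacity in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# 40 GHz of bandwidth: what SNR gives 120 Gbps?
# 120e9 / 40e9 = 3 bits/s/Hz, so SNR = 2**3 - 1 = 7, i.e. only ~8.5 dB
print(shannon_capacity_bps(40e9, 7) / 1e9)  # 120.0 Gbps
```

In other words, 120 Gbps over 40 GHz only needs a spectral efficiency of 3 bits/s/Hz, which is modest; the bandwidth is doing the work, not any cleverness in the modulation.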
The second thing is that WiFi 6 (for example) uses 1.2 GHz of bandwidth in the 6GHz range, divided into seven non-overlapping 160MHz channels. WiFi 5 uses about nine 80MHz channels in the 5GHz range, and so on. So if you want to use the technology demonstrated in the paper for WiFi (as the headline of the article is suggesting) then you’d need a bunch of 40GHz channels in the higher ~200-300 GHz range which would be in the very high microwave range, bordering on far infra-red!
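For scale, a rough sketch of how many non-overlapping channels fit in a slice of spectrum (using the WiFi 6 numbers above; the function name is just for illustration):

```python
def nonoverlapping_channels(spectrum_mhz: float, channel_mhz: float) -> int:
    """How many non-overlapping channels of a given width fit in a band."""
    return int(spectrum_mhz // channel_mhz)

print(nonoverlapping_channels(1200, 160))  # 7, as in the 6 GHz band
# A single 40 GHz channel would need 40,000 MHz of contiguous spectrum --
# more than 30x the entire 6 GHz WiFi allocation.
```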
If you want to imagine how useful that would be, just think about how useful your infra-red TV remote is. You would only be able to do line-of-sight point-to-point links at that frequency.
IR point-to-point links already exist, and the silicon they invented for this paper is impressive, but the hype around it being a possible future WiFi standard doesn’t really hold up to basic inspection.
And what are we downloading? Is the cloud dead? Why do I need 15 Gbps on my phone? Is it made for consoles and their relentless 120 GB patches?
One example I've read was to remotely drive autonomous vehicles and feed back all the data collected from cameras and sensors. I'm not a fan of it being used this way, but it would mostly serve that kind of purpose.
1.5 Gb/s is way more than enough for the average person. Hell, 200 Mb/s is more than enough. Even one of those 120 GB patches would only take about 10 min at 1.5 Gb/s.
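The back-of-the-envelope math here is just size-in-bits over rate; a quick sketch:

```python
def download_minutes(size_gb: float, rate_gbps: float) -> float:
    """Minutes to download size_gb gigabytes at rate_gbps gigabits per second."""
    return size_gb * 8 / rate_gbps / 60

print(round(download_minutes(120, 1.5), 1))  # ~10.7 min for a 120 GB patch
print(round(download_minutes(120, 0.2), 1))  # ~80.0 min at 200 Mb/s
```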
Laptops have all but taken over from desktops for everything but AAA gaming. New houses are still built with zero Ethernet because "the internet is WiFi, right?"
People are using their laptops to edit video off of a NAS, MacBooks can run 100 GB LLMs. Heck even non-AAA games are many gigabytes.
VR headset streaming video from PC without cables.
Everything, no, to move data quicker, no
The distribution of all human knowledge, untampered.
For phones / portables, assuming it doesn’t draw more power, it would mean shorter download times, which means less battery usage.
"Assuming it doesn't draw more power" has got to be the problem here, right? I don't know much about wireless technology, but from a purely physical standpoint, faster signals mean higher frequencies, which mean higher energies, which mean more draw from the battery. Yes, shorter active time means less draw, but it's like that Swiss cheese joke:
Swiss cheese has holes.
More cheese = more holes
More holes = less cheese
Therefore,
More cheese = less cheese.
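The tradeoff the comment describes boils down to power × active time; here's a toy comparison with entirely made-up numbers (the power figures are hypothetical, not measurements of any real radio):

```python
def radio_energy_joules(size_gb: float, rate_gbps: float, power_w: float) -> float:
    """Energy = power x active time, assuming the radio is only on while transferring."""
    active_seconds = size_gb * 8 / rate_gbps
    return power_w * active_seconds

# Hypothetical: even if the faster radio draws 4x the power,
# being 10x faster still wins on total energy for the same 10 GB transfer.
slow = radio_energy_joules(10, 1.5, 2.0)   # ~107 J
fast = radio_energy_joules(10, 15.0, 8.0)  # ~43 J
print(slow > fast)  # True
```

Whether the cheese joke applies in practice depends on exactly how transmit power scales with data rate, which is the part nobody in this thread actually knows.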
Until I can get internet options faster than 50 Mbps in my area, I don't understand why we're trying to push the upper limits on speed higher and higher.
WiFi is getting so good, but I kinda don't want it to. I like wiring up the computers in my house, but now WiFi is good enough that wiring doesn't provide any advantages.