Australia's communications regulator, ACMA, has already permitted Wi-Fi 6E devices to operate in the lower 6 GHz band (5925–6425 MHz) under the Low Interference Potential Devices (LIPD) Class Licence. This includes low-power indoor (LPI) and very low power (VLP) devices.
As for the upper 6 GHz band (6425–7125 MHz), ACMA is still evaluating its use. In June 2024, it sought public input on possible applications, including RLANs and wide-area wireless broadband services.
So, while unlicensed device operations are allowed in the lower 6 GHz band, the upper band is still under consideration.
"but" ???
This change opens up 1200 MHz of bandwidth between 5.925 and 7.125 GHz.
> quite a lot of spectrum real estate
Amateur radio is scattered all over the place, but excluding amateur satellite the allocations are mostly below 300 MHz. Even ignoring the fact that they are tiny slices, the upper limit of bandwidth you could hope to gain under that frequency is 300 MHz (for all of it combined), and considering that most of that is not amateur radio, you are going to gain a negligible amount of bandwidth that cannot practically be used for a single application because it is not contiguous.
The higher the frequency, the more bandwidth is available. For high-throughput applications, reclaiming these relatively low-frequency bands is not useful.
The 70cm band (420-450MHz US) is heavily used. I'm sure cellular services would love it. On the other hand, it is a secondary allocation with other users (e.g. military radars) having priority.
The 23cm band is another secondary allocation, from 1240 MHz to 1300 MHz — wide enough for 3 wifi channels. On the other hand, you'd have to kick out the radiolocation service, and it's not contiguous with a big block of channels that would make it worthwhile.
Then above that, amateur radio shares frequencies with some of wifi, and then with microwave frequencies so high that they are undesirable.
Wifi isn't necessarily a minimum of 20MHz. Wifi HaLow goes as low as 1 MHz.
The chart is logarithmically stacked, i.e. each row would fit into the one below 10 times, 100 times two rows down etc.
It is only valuable in the sense that it is a very limited resource.
That's exactly the point.
The small amateur radio spectrum allocations cover longer-wavelength (HF) emissions that can propagate around the planet and shorter-wavelength (VHF/UHF) emissions that engage local repeater networks.
Think of it as an insurance policy - communications backup when comm is a life & death matter. Doesn't happen often, but really important when it's needed.
[0]: https://www.arrl.org/amateur-radio-emergency-communication
Let us have some so far neutral zone. Wi-Fi etc does not need those. I am not a radio guy. But let them have it.
Amateur radio is by its nature more decentralized. Even if you're using a higher frequency and dependent on repeaters, those tend to be run by individuals and independent groups, so you can probably find some way to get your signal through.
Besides that, those bands are already heavily used — it would make no sense to open them up to WiFi.
Disasters warrant keeping the band for basic news and reporting if nothing else.
FM already has improved audio. Perhaps the same looser regs would bring more people in.
I have often demonstrated that with a thought experiment where there are two stations, one high quality but boring, the other crappy quality, but compelling.
Which one do most people listen to?
But, there is more to it than that. Higher fidelity opens the door for better ad rates and a broader array of appropriate programs.
A smaller number of stations = more per station, very generally speaking.
Channels 70 to 83 to 1G cellular in 1983.
Channels 52 to 69 to 4G cellular in 2008.
Channels 38 to 51 to 4G/5G cellular in 2017.
The current allocation is channels 2 through 36. Channel 37 is not used.
Digital television stations state which "channel" they are in their signal's metadata. That allows them to change frequencies but keep their channel identity. Since TV went digital, many stations have changed frequencies, some several times. You may find the "repacking" of the broadcast TV frequencies an interesting read:
https://en.wikipedia.org/wiki/2016_United_States_wireless_sp...
Home shopping is usually used to monetize excess bandwidth.
[1] - https://www.ecfr.gov/current/title-47/chapter-I/subchapter-C...
Are enough people really willing to pay for the convenience of, I guess, not having to switch between antenna and cable input, or are living outside of broadcast coverage of the stations they care about?
Weirdly, it's exactly the opposite in Germany: Supposedly the public broadcasters have to pay the cable companies to get them to carry their programs.
That's what I've long suspected. No wonder it's a great opportunity to save/waste money :)
Supposedly in some social classes and age groups, broadcast TV is literally unheard of, with Best Buy promoting TV antennas accordingly ("free cable!") and people suspecting it's a scam or illegal.
broadcast is kinda uniquely positioned.
Broadcast TV still goes over the air, though it's digital now.
My other guess is the major uses of this will turn out to be UWB related: https://en.m.wikipedia.org/wiki/Ultra-wideband Which in practice is largely about short range location finding.
I also believe you are correct in that the bulk of the use of the 6 GHz band will be UWB related, and folks will exploit the multi-GSPS ADCs and DACs on Xilinx's RFSoC parts and the ones Analog Devices is shipping. I read a pitch for a UWB "HD video extender" which was basically connecting a 4K display over UWB to a source rather than via a cable. That idea became a lot more viable with the current FCC order.
Whether or not people are paranoid, if the FCC moves all unlicensed frequencies into the GHz range, they limit the public's ability to communicate over long ranges with unlicensed equipment.
Today's WiFi, that you all know and love, started out on unlicensed RF (radio frequency) bands. We need to continue to expand the ability to talk on RF to allow innovation, like what happened with WiFi.
Personally, I don't like there being more licensed spectrum. I think more spectrum should be unlicensed and therefore free for all who play by the rules.
You've got Zigbee and LoRaWAN for that already. IKEA has water leakage detectors (which I highly recommend; they saved my ass already), and for temp/humidity sensors, go for Sonoff's lineup.
But in the case of Wi-Fi specifically, part of the success story of 5 GHz (besides having much more spectrum available than 2.4 GHz and having less noisy legacy applications cluttering it) is the lower maximum EIRP in most parts of it.
This forces everybody to have smaller (and if required more) cells – which is a big win in densely populated areas such as apartment buildings, for example.
Because the cell network is designed around the towers managing resource allocation, instead of phones trying and hoping nobody else was trying at the same time. Doing it this way increases the total capacity of the network by a lot.
So to create a phone mesh network, you would effectively need to create an entire new protocol stack, probably some enhancements to the frontend/PHY for the initial connection establishment (two phones realizing they're in range of each other) and congestion handling. And depending on how you implemented it, it would be a power hog too, since listening for a tower broadcast requires much less juice than announcing your presence to the world and hoping someone is in range.
(I do actually think there is phone-to-phone communications buried somewhere in the standards, but it still requires the tower for coordination)
Apple’s AWDL is hacky and ugly in lots of ways, but has been in market for a decade or more and enables phone to phone. If WiFi forum ever gets WiFi direct 2 off the ground it could be amazing.
But phone to phone is chicken and egg; users aren’t demanding it because there aren’t any killer apps, and there aren’t any killer apps because problems like identity, privacy, resiliency haven’t been solved, and those problems haven’t been solved because users aren’t demanding these apps.
Yet the only recent movement in that area was them cutting down on AirPlay to unknown contacts, reportedly due to governmental pressure.
So unfortunately I believe that there is just no interest of Apple to make any move there, despite being in an excellent position: iMessage would solve most problems of spam, discoverability etc. (they could make it so that you can only message preexisting contacts when offline).
You ever try and have a conversation over APRS text messages without even digipeaters in a crowded urban area? Good luck getting through.
I'm not trying to save on data fees; I'd just love to do low-bandwidth peer to peer messaging with people nearby without any network around.
What if, for example, low-frequency 5G bands were available to such P2P applications as a secondary user, similarly to 5 GHz WiFi and weather radars? If there's a network there, use that; if there isn't, do P2P!
It is however great for a non-contact point to point where you connect the xmitter and receiver by attaching adjoining faces together. Imagine a PCIe card where the edge connector had no electrical contacts it just sits in the slot and the connection is a 60GHz link between the card and the base board. With inductive power transfer you don't need any conductive contacts at all.
It's very cool and sciencey, but the transceivers are stupid expensive and the use case is really pretty limited.
Software defined transceivers exist. Adjustable antennas exist. Poorly shielded electronics that can cause further noise propagation to broadcast out of the transmit side also exist.
You can also change the intended broadcast frequency of some cheap handheld radios using a USB cable and an off-the-shelf antenna.
There is very little in the way of the general public to do something illegal, wittingly or otherwise, in RF.
I'd argue that 'capability' is a naive limiter here as they'd be more likely to do this by accident than on purpose (or ignorance vs malice).
There are tons of illegal/unpermitted/unlicensed broadcasts happening all of the time. They only become an issue when regulators need to enforce rules, usually due to noticed interference.
Bad bonding/grounding is probably the most common cause. RF exists other places too. RF that was meant to be contained in a wire can use these same allocated OTA frequencies because they were never meant to escape that closed system... But do, mostly through poor bonding/grounding.
As you can probably see by now, there is little actually stopping anyone from broadcasting on any particular frequency. Regulators will catch them if they're causing destructive interference, eventually.
But you could potentially use 'illegal' RF for years and never be noticed. Your transmit power/range and your local environment (who else is using the spectrum locally) will dictate that for you more than any allocation rules alone.
The amateur radio scene is a special thing. They share knowledge, experience and more than anything, a culture of informed operation of RF.
I'd encourage anyone interested in operating any RF systems to acquire or at least study enough to acquire an amateur Technician license (US).
I agree, but I think it's important to note that "long range" is the context we're talking about. If you just want to get a quick message out then I fully agree, there's not a ton stopping you, but if you want it be a reliable and/or medium to long-term solution, then the barriers are quite non-trivial. There's also the risk of prosecution, which the regulators are not above if they smell intentionality. If they think you're innocently transmitting they'll just ask you to stop, but if they think you're intentionally and/or openly violating the rules, they can bring some serious legal pain down.
I am so paranoid that I think if the FCC is doing this, they are right to, or at least have a point.
We need to accept the radio spectrum is part of the cyber security profile of the area of the country.
To drop a giant hint, something that listens to one thing in remote areas, transmits the results via another channel, over a series of hops and then out of the country, would be of great interest to some people.
The restrictions are annoying, but my belief is the FCC (and international equivalents) should promote amateur radio licensing by committing to protecting licensed usage in the existing amateur bands, and get more HAMs inside the tent pissing in the outward direction.
1. https://www.cnbc.com/amp/2024/08/06/china-launches-its-rival...
Is preventing this even possible in today's world?
The oldest operating satellite (older than the Voyagers) appeared to fail due to a battery issue, but somehow woke up in a mostly-working state soon after. It was noticed by activists in Poland and was used to bounce radio messages out of the country. Since satellite communication is pretty directional, and aimed into the sky, it was difficult-to-impossible to triangulate the source of the signals.
A current proposal regarding the 900MHz band, primarily put forward by NextNav, suggests a significant reorganization of the spectrum to allocate a portion for their terrestrial 3D positioning network. It would potentially create dedicated uplink and downlink bands within the lower 900MHz range, which could impact existing users like toll systems and RFID devices due to interference concerns. The proposal faces strong opposition from various industries currently using the band.
From what I heard from knowledgeable people, not just that but also unspoofable range — desirable when verifiable proximity/direction matters, e.g. for security applications. Say you want a car that opens only when your phone is within ~2m range, and not from 1km away with some MITM/amplification device.
Not sure this is used already but it was one of the benefits mentioned to me.
It's an affront to reason. Yeah, let's nuke a massive chunk of amateur spectrum, AND LoRaWAN AND Z-Wave AND E-ZPass so that domestic orgs can have a pre-enshittified PNT implementation. Never mind that demand for PNT is driven primarily by orgs operating on foreign soil, where nobody gives a damn what the FCC thinks.
AFAIK wifi can use more power than that, at least in the US: 100mW, possibly 200mW, not sure what the hard limit is (or how much that must be spread).
https://www.ecfr.gov/current/title-47/chapter-I/subchapter-A...
So beamforming under these rules lets you use less input power for the same directional power. But it doesn't let you bundle all your power into a single direction.
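To make the trade concrete, here's a quick sketch of the arithmetic (the 36 dBm cap below is an assumed example value, not a claim about any particular rule part):

    # Under an EIRP cap, antenna gain trades 1:1 against conducted power.
    EIRP_CAP_DBM = 36.0  # assumed example cap

    def max_conducted_power_dbm(antenna_gain_dbi):
        # EIRP (dBm) = conducted power (dBm) + antenna gain (dBi)
        return EIRP_CAP_DBM - antenna_gain_dbi

    for gain in (0, 6, 12, 24):
        p = max_conducted_power_dbm(gain)
        print("%2d dBi antenna -> max %.0f dBm (%.0f mW) into the antenna"
              % (gain, p, 10 ** (p / 10)))
    # More gain saves input power, but the power in the beam's direction
    # (the EIRP) never exceeds the cap.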
I've never seen this happen for a WiFi band operator, so yeah, they aren't looking. They certainly could though: someone is using all those grey market boosters. Some of those have enough power to show up for many miles, and triangulating them is quite easy.
They do go after cell jammers. One example from 2016 was a guy in Florida using one in his car during his daily commute. People complained their signals failed at about the same time every day and the FCC pursued it and caught him. $48K fine.
That's speculation. I know what is written in the notices and orders. The orders do not cite either complaints or determinations about interference. The notices do. Some of these document complaints with the formula: "in response to complaints." Others do not, such as this[1].
[1] https://docs.fcc.gov/public/attachments/FCC-24-09A1.pdf
My inference is that when actual complaints exist they are documented, for evidentiary reasons. Thus, when the existence of complaints is omitted, no such complaints exist.
But if you run for hours every day, and you run a shitty PA with a too wide bandpass filter, that jams other stations? Oh boy you'll get v&.
One reason regulations like this exist is to save people the burden of having to worry that some asshole will suddenly start transmitting in a way that interferes with you and you’ll have to go to the trouble of complaining. You can proceed on the general assumption that other people will be acting within the agreed limits.
It’s a very selfish attitude to take to say ‘Ah, I’ll just crank up the gain til someone complains’ rather than ‘I guess I’ll stick within the guidance so I don’t inconvenience anyone’.
Once again, I say that you can push your TX higher than "accepted" and for the most part get away with it. But if you push it high enough to have negative consequences on other people, those people (especially other hackers who feel slighted) will seek to fix the glitch.
I'm really confused on how this original comment is so lost on here.
In the area of radio, you must be extremely careful with this mindset. Keep in mind that interference may affect important / safety-critical infrastructure (and you may not even realize it). For example, I am frequently facing issues from wifi (with deactivated or nonfunctional DFS) interfering with a weather radar in the same band (which is indeed the primary user, and weather forecasts are critical to aeronautical and maritime safety! This is a nightmare to troubleshoot).
If you want to tinker with radio, I would suggest passing your ham radio license exam (which will give you the minimum background with regard to radio, propagation, regulations, etc.).
Because an increasing number of people do not care anymore about any regulations, I see many administrations worldwide increasingly pushing for locked firmware and bootloaders (i.e. no more openwrt), more import control, etc. If you want to still be able to tinker with hardware and radio in the future, you should always ensure peaceful coexistence with others, i.e. know what you are doing and work within the boundaries of regulations (be it txpower, duty-cycle, listen-before-talk, etc. Those limits are there for a good reason!).
People trying to hack their DOCSIS modems and flashing firmware from other countries disrupt everyone on the same trunk cable. People thinking they'll just beef up the PA on their crappy wifi AP to get a better signal instead of adding a second AP cause enough noise to disrupt an entire neighborhood, especially if the PA output fries (part of) the bandpass and now there's side emissions all over the place. People burning wet wood in their ovens because they can't be arsed to dry it properly or, worse, outright burning trash stink up the entire 'hood. And people taking dumps in the rivers were enough of a problem in medieval times already that warnings like "it is forbidden to crap in the river on <day> because the next day beer will be brewed" [1] were commonplace.
[1] https://www.abendblatt.de/vermischtes/journal/article1068323...
I'm not entirely sold that it's the curious hackers of the world driving that trend as opposed to incentives like surveillance and control
Folks were modifying stuff for more power, more "channels", and disabling DFS and APC. So it did not happen for no reason. And now vendors like Ubiquiti and Mikrotik have special USA models with more locked-down firmware.
> some asshole will suddenly start transmitting in a way that interferes with you
remember LightSquared/Ligado Networks? assholes, but with $$$ tho. still a thing:
https://news.ycombinator.com/item?id=23103290 https://www.gps.gov/spectrum/ligado/
WiFi hardware is cost optimized. It’s likely that the PA chips in your radio are going to distort if pushed past the legal limits. Many radios distort heavily past 100mW.
It's common for people to turn the power setting all the way up thinking they're getting the best performance, but best performance might occur at a lower setting.
The point is to stop scaremongering about letting people turn up the TX. It's just a damn WiFi radio. They aren't going to be blocking their neighbor from listening to their favorite ClearChannel session of commercials. At most their neighbor might get a bit of interference. But if you're running on the same channel as your neighbor, you're already asking for trouble. Nobody's going to emergency, nobody's going to jail. We're not talking about firing up an 100kW flame thrower of a radio signal here. Let's just everyone keep their knickers on and realize the context of what we're discussing
If you increase the TX power way beyond the P1dB of your PA, you will introduce distortion and therefore harmonics beyond your operating band (i.e. you can end up disturbing RF services other than wi-fi, possibly safety-critical ones). https://www.everythingrf.com/community/what-is-p1db
=> Don't do that !
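For intuition, here's a toy simulation using a tanh soft limiter as a stand-in for a saturating PA. Real amplifiers are much messier (memory effects, bias, matching), but the trend is the point:

    import numpy as np

    # Toy PA model: tanh soft limiter. Drive a pure tone through it at
    # increasing levels and watch the 3rd harmonic grow.
    fs, f0, n = 1_000_000, 10_000, 100_000   # sample rate, tone, samples
    t = np.arange(n) / fs
    tone = np.sin(2 * np.pi * f0 * t)

    def third_harmonic_dbc(drive):
        out = np.tanh(drive * tone)          # saturating amplifier
        spec = np.abs(np.fft.rfft(out))
        k = int(f0 * n / fs)                 # FFT bin of the fundamental
        return 20 * np.log10(spec[3 * k] / spec[k])

    for drive in (0.1, 1.0, 3.0):            # backed off .. overdriven
        print("drive %.1f: 3rd harmonic %6.1f dBc"
              % (drive, third_harmonic_dbc(drive)))
    # Backed off it's around -60 dBc; overdriven it climbs toward -10 dBc,
    # i.e. strong emissions far outside the intended band.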
This isn't true, WiFi handles same-channel interference better than off-channel interference. Especially newer standards have more tools to deal with it like BSS coloring.
Translated: the linearity of your PA (never mind the LNA for RX) is unlikely to support even 20dBm, and the higher-rate modulations for even 802.11a/g (never mind any MIMO/SDMA workings) are EVM-limited, not received-power limited.
It's not really a big problem for those who know: the limiting PA can be replaced or bypassed with a more accommodating one that can transmit higher power with better linearity and power envelope tracking.
[1] Crest factor:
I don't know why we're being argumentative here.
And yes, they have vans roving the streets looking for illegal transmissions.
Just the other day in another thread there was a conversation about mischievous people doing things, and that pretty much sounds like what that guy was doing. He got the best of both worlds: he knew he was going to cause some chaos, but faced none of the repercussions of it.
Since the goal of hide and seek is to get home (before found) I’d call that traveler assistance. <grin>
Only if and when they receive complaints. Then they have to decide if it is serious enough to care about. This is a far cry from active BBC-style patrols.
I clearly stated until people notice and complain. It's like these words were totally ignored.
Pg. 95: Very Low Power Device. For the purpose of this subpart, a device that operates in the 5.925-6.425 GHz and 6.525-6.875 GHz bands and has an integrated antenna. These devices do not need to operate under the control of an access point.
Pg. 98: Geofenced Very Low Power Access Point. For the purpose of this subpart, an access point that operates in the 5.925–7.125 GHz band, has an integrated antenna, and uses a geofencing system to determine channel availability at its location.
[0] See page 3 for an example VLP definition and requirements from earlier this year. It specifies EIRP and how the power has to be distributed, so you're not throwing one big spike in the middle of the channel, for example.
I was expecting vehicle-to-vehicle communications.
In November 2024, the FCC finalized rules for the 5.850-5.925 GHz band, including for Cellular Vehicle-to-Everything (C-V2X) technology, which is considered a successor to DSRC for V2V and vehicle-to-infrastructure (V2I) communications.
V2V had spectrum allocated to it since 1999. But V2V+V2I got sucked into C-V2X, which is astounding to me; on the one hand it makes sense (5G is good at this sort of thing), but now you have gatekeepers taking their cut to provide the Service. If it were straight V2V, then it would have been free-to-use. It is astounding to me that in 2024, we still do not have the vehicle in front of you sending your car's computers data that the driver ahead just hit his/her brakes, and you should be prepared to do the same. AEB is fine, but the current attitude seems to be "Battleship My Car" - meaning, collect all the data, make all the decisions in MY car, other cars be damned... or, ignored.
My guess is V2V just presented too many security holes to win widespread adoption. If you could go around spoofing braking events on the highway, that would be super dangerous. But that's just my guess.
As to talking with cars around you, get a ham radio license, and set your HT to 146.52 MHz -- the national simplex calling frequency. The more people we have monitoring 146.52, the better. That frequency, more than any other ham radio frequency, is the nationwide "SOS!" channel. If you have an emergency out of cellphone range, but you have an HT, often 'somebody' will hear you on 146.52 and can call for help. The other common calling frequency is 446.000 but 2 meters tends to have better range through forest terrain; and probably more people listen to "52" than 446.000 --- but try both in an emergency.
Does anyone know if there's a good reason to use EIRP that I'm missing? I figure satellite communication terminals can have huge EIRPs because they're all pointed at the sky, but the FCC can't guarantee that the beams won't cross for other bands, so they limit the EIRP, but I still think we would all be better off of our systems were spatially selective.
So limiting EIRP provides a limit on the interference suffered by a receiver that happens to be in the direction toward which you transmit; for that receiver, the total power that you transmit in all directions does not matter at all.
When this is true, it is trivial to use classic directive antennas to achieve very long range communications with standard WiFi. There is no need for expensive phased array antennas.
For mobile stations and/or access points, phased array antennas are not enough. See my other reply.
WiFi, Bluetooth and the other kinds of communication protocols standardized for use in the unlicensed bands are intended mainly for cheap mobile devices, and mobility at a modest price restricts the antennas to be omnidirectional.
EIRP minimizes regulations. It's a good trade-off compared to operator and installation licensing.
Maybe the EIRP shouldn't be unlimited, but I still think it would be beneficial to encourage spatially selective systems.
There is no justification for imposing additional costs on others in order to accommodate desires of yours that do not matter to them.
Nobody stops you from using a phased array antenna only to obtain higher gain for reception, in order to increase the communication range.
Even without phased array antennas, using just classic directive antennas that are placed on high masts, it is possible to communicate through WiFi at tens of km (but only at low bit rates and not in all countries, as some have more severe EIRP limits).
The problem of directive antennas is that they are usable only for fixed positions of access points and wireless stations.
Phased array antennas are not enough to enable mobility, because initially a mobile wireless station must discover the direction of the AP and the AP must discover the direction of the station, by using omnidirectional transmission, which limits the range to what can be achieved without phased array antennas.
To use a mobile wireless network that works at distances greater than possible with omnidirectional antennas requires much more sophisticated equipment than just the phased array antennas. You also need means to determine the coordinates of each station (and of the access points, if they are also mobile) and maps with the locations of the access points so that a station that wants to associate with them will know in what direction to transmit. You also need a protocol different from standard WiFi, e.g. the access point may need to scan periodically all directions in order to allow new associations from distant stations.
But on average, even with unlimited EIRP, I'll see 1/n as many interfering signals that are each n times as strong. That's not a bad tradeoff.
But having a moderate EIRP increase for focused signals would make things better for everyone. Let's say a signal that's 10x as focused can have 3x the EIRP, and everyone switches their equipment over. That drops the total power output by 3.3x, and interference drops significantly for almost everyone.
> initially a mobile wireless station must discover the direction of the AP and the AP must discover the direction of the station, by using omnidirectional transmission, which limits the range to what can be achieved without phased array antennas
You can do discovery at a lower bit rate to get a big range boost.
The only things that matter are the radiant intensity (i.e. power per solid angle) of the interfering transmitter and the percentage of the time when that transmitter is active.
A single interfering transmitter with high radiant intensity (a.k.a. EIRP) will blind the radio receiver for all the time when it is active.
Doing discovery at a low bit rate is fine, but that means your fancy phased array antenna cannot achieve any greater communication distance than an omnidirectional antenna; it can only increase the achieved bit rate at a given distance.
That would be OK, except that it is achieved by interfering with your neighbors, exactly like when using a transmitter with a higher total power than allowed.
Limiting EIRP is the right thing to do in order to limit the interference that you can cause to your neighbors.
The law does not stop you from using a phased array antenna or any other kind of directive antenna with the purpose of lowering the power consumption of your transmitter, while maintaining the same quality for your communication and the same interference for your neighbors.
What you want to do, i.e. increase the interference for your neighbors, is the wrong thing to desire. If that were allowed, your neighbors would also increase the radiant intensity of their transmitters and then everybody would have worse reception conditions and you would gain nothing.
The hope that only you will increase your radiant intensity and your neighbors will not, is of course illusory.
> The only things that matter are the radiant intensity (i.e. power per solid angle) of the interfering transmitter and the percentage of the time when that transmitter is active.
And if a transmitter isn't pointed at you, then it isn't an interfering transmitter. This is a crucial factor in the math.
Or for a more realistic analysis of directionality, the radiant intensity is only high for a small fraction of observers, and is very low for the rest of them.
In the first scenario I talked about, total interference is probably the same.
In the second, total interference is almost always much less.
> A single interfering transmitter with high radiant intensity (a.k.a. EIRP) will blind the radio receiver for all the time when it is active.
If a moderate boost blinds the receiver, then the alternative is being almost blind for much longer (because there are more interfering transmitters), so I'm not convinced that's a problem.
> Doing discovery at a low bit rate is fine, but that means your fancy phased array antenna cannot achieve any greater communication distance than an omnidirectional antenna; it can only increase the achieved bit rate at a given distance.
I don't understand what you mean.
If you don't care about speed, the maximum distance is the same for both antennas, and is defined by obstacles alone.
You can always slow down to compensate for a lack of gain. And it's a proportional slowdown, not very expensive. Especially when you only need to send a beacon that's a few bytes long to initiate contact.
Just some example numbers: Your functional requirements are 1Mbps of bandwidth with pretty tight focus. You send the few bytes of initial omnidirectional contact at 1Kbps. Your slow omnidirectional signal actually reaches further than your fast focused signal.
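Rough free-space arithmetic behind those numbers (real links with fading and obstacles will differ; the 10 dB focus gain is an assumed figure):

    import math

    # Sensitivity improves roughly 10*log10(rate ratio) dB as you slow
    # down; free-space range scales as 10^(dB/20).
    def range_multiplier(gain_db):
        return 10 ** (gain_db / 20)

    slowdown_db = 10 * math.log10(1_000_000 / 1_000)  # 1 Mbps -> 1 kbps
    focus_db = 10.0                                   # assumed 10x-focused antenna

    print("1000x slowdown: %.0f dB -> %.1fx range"
          % (slowdown_db, range_multiplier(slowdown_db)))  # ~31.6x
    print("10x focus:      %.0f dB -> %.1fx range"
          % (focus_db, range_multiplier(focus_db)))        # ~3.2x
    # The slow omnidirectional beacon out-ranges the fast focused link,
    # which is why low-rate discovery works.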
> Limiting EIRP is the right thing to do in order to limit the interference that you can cause to your neighbors.
If your only concern is the worst case of everyone being pointed at the same spot, yes. In normal situations the average level of interference matters more.
> The law does not stop you from using a phased array antenna or any other kind of directive antenna with the purpose of lowering the power consumption of your transmitter, while maintaining the same quality for your communication and the same interference for your neighbors.
The law says that I can maintain quality and decrease interference, but I don't gain any real benefit because I'm only saving half a watt. So I'm not very motivated to do so. I'd prefer if it was legal to split the difference between increased signal quality and somewhat decreased interference.
> What you want to do, i.e. increase the interference for your neighbors, is the wrong thing to desire.
Where do you think I said that?
Edit:
> your neighbors would also
Here, I'll elaborate on a scenario.
Originally, me and my 6 neighbors are all transmitting omnidirectionally and causing 1 unit of interference to each other person. Everyone gets 6 units of interference.
I want to lower my transmit power but double my EIRP. I am entirely selfishly motivated, and just want a better signal to my devices. As a consequence I will now cause 2 units of interference to a single neighbor, and 0.1 units of interference to all other neighbors.
What happens when everyone thinks this way and does the same thing? Well now the neighborhood receives 2.5 units of interference on average instead of 6. Even with a bit of variance as devices move around, everyone is better off now. I love that my neighbors did the same thing I did.
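Sanity-checking that arithmetic in a few lines:

    # 7 transmitters: me + 6 neighbors.
    n = 7
    omni = (n - 1) * 1.0            # each person hears 6 x 1 unit
    directional = 2.0 + 5 * 0.1     # 2 units from the one beam pointed
                                    # your way, 0.1 from the other five
    print("omni: %.1f units/person, directional: %.1f units/person"
          % (omni, directional))    # 6.0 vs 2.5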
This does not really matter, because in the dense wireless networks that are typical for the unlicensed bands, due to the ubiquitous WiFi and Bluetooth devices, there is always a transmitter pointed at you. Typically there are many transmitters pointed at you.
So any argument based on this idea that there are no transmitters pointed at you would fail badly in practice.
Moreover, the interference in digital communications is not something that grows linearly. It grows in jumps, when some thresholds are exceeded and the error correction used by transmitters fails to work. So at certain thresholds the interference would force your device to reduce the bit rate, until a certain threshold where communication would become impossible until the interfering transmitter stops transmitting.
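A toy MCS ladder makes the "jumps" visible; the thresholds and rates below are invented for illustration, not taken from any standard:

    # As SINR drops past discrete thresholds, a link falls down a ladder
    # of modulation/coding rates rather than degrading smoothly.
    MCS_TABLE = [        # (min SINR dB, rate Mbps) - illustrative only
        (25, 600), (18, 300), (11, 120), (5, 30), (0, 6),
    ]

    def achievable_rate(sinr_db):
        for threshold, rate in MCS_TABLE:
            if sinr_db >= threshold:
                return rate
        return 0         # below the last threshold the link is unusable

    for sinr in (30, 20, 12, 7, 2, -3):
        print("SINR %3d dB -> %3d Mbps" % (sinr, achievable_rate(sinr)))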
Also, 6 neighbors do not provide 6 units of interference. The amount of interference depends on many factors. The neighbors that use the same communication channels will attempt to not transmit simultaneously, to avoid collisions.
When a neighbor transmits on the same channel, then it is guaranteed that the interference is so high as to prevent other simultaneous communications. So the interference that we discuss is from transmitters that use other channels than yours.
Besides the fact that the neighbors are partitioned in groups within which only one transmits (but almost all the time there is an active transmitter), the interference depends greatly on the distance to the transmitters.
So to estimate the change in interference when all replace their omnidirectional antennas with directive antennas, increasing the radiant intensity, is far more complex than your simple arithmetic.
The worst case, which can never be excluded, is that there will be at least one transmitter pointed at you and its higher radiant intensity will be enough to cross the threshold at which communication becomes impossible for yourself. In this case it is completely irrelevant if you no longer have interference from other transmitters that are not pointed at you.
Planning wireless networks cannot be done based on hopes that you will be the luckiest in the universe and Murphy's law will not apply to you.
A transmitter, sure. If you go from having 40 transmitters pointing at you, to now having 6 transmitters pointing at you, that makes a big difference. Even if they're running at twice the EIRP now, that's a big improvement.
> So any argument based on this idea that there are no transmitters pointed at you would fail badly in practice.
My argument doesn't depend on that.
> The amount of interference depends on many factors. [...] So to estimate the change in interference when all replace their omnidirectional antennas with directive antennas, increasing the radiant intensity, is far more complex than your simple arithmetic.
Yes I simplified. But does that completely upend the result? If so, show me the math that makes it happen.
> groups within which only one transmits (but almost all the time there is an active transmitter)
> cross the threshold at which communication becomes impossible for yourself
And guess what? If everyone doubles their EIRP but transmits in a much narrower beam, the area in which that happens becomes smaller. The number of transmitter pairs that need to time-share decreases.
> Planning wireless networks cannot be done based on hopes that you will be the luckiest in the universe and Murphy's law will not apply to you.
I think your argument depends on me being lucky in the omnidirectional case but unlucky in the directed transmit case. That's not a reasonable way to assess alternatives.
For every percent chance that higher-EIRP directional transmit causes me problems, there's a bigger chance that higher-total-power omnidirectional transmit causes me problems.
EIRP is a measure of the maximum power in any direction, so a phased array that transmits 1W only forward has the same EIRP as an omnidirectional antenna that transmits 1W forward, but also 1W backward, and up, and down, etc. Overall the omnidirectional antenna may transmit much more power total, but still only 1W in any particular direction.
By definition, any antenna with positive gain must concentrate power in one or more directions. So in practice, omnidirectional antennas are only omnidirectional in the horizontal plane; they have greatly reduced power in the up and down directions (which makes sense, as generally there is no need to transmit into the sky or ground).
> Overall the omnidirectional antenna may transmit much more power total
This doesn't make sense - an antenna is a passive device, not an amplifier. Regardless of gain, you can't get more energy out of an antenna than you put into it.
So if you're transmitting 1 W and your antenna has a gain of 30 dBi (1000x), that's equivalent (from the perspective of whatever it's pointed at) to an isotropic antenna (no gain) emitting 1000 W at the same distance.
It should be obvious that EIRP is what matters for interference and human safety, hence why the FCC regulates EIRP instead of power output.
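The 1 W / 30 dBi example, worked numerically:

    def eirp_watts(p_tx_w, gain_dbi):
        # EIRP = transmit power x linear antenna gain
        return p_tx_w * 10 ** (gain_dbi / 10)

    print(eirp_watts(1.0, 30))   # 1000.0 W: looks like 1 kW isotropic
                                 # to anything sitting in the beam
    print(eirp_watts(1.0, 0))    # 1.0 W: isotropic reference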
A phased array is a very high-tech method of selective transmission so that you can send a radio signal that is stronger in one specific direction. The way I think of it is that it creates a virtual directional antenna where the direction is adjustable without actually physically moving the antenna elements. The actual tech involved is pretty heavy math and physics (and I don't fully understand it myself) so if you want to really understand it then you may still need to spend hours reading about it.
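The core of it does fit in a few lines, though. A sketch of the array factor for an assumed 8-element, half-wavelength-spaced line array steered to 30 degrees:

    import numpy as np

    # Feed N elements the same signal with per-element phase offsets;
    # the fields add coherently in the chosen direction.
    n_el, d, steer_deg = 8, 0.5, 30.0   # elements, spacing (wavelengths), target
    k = 2 * np.pi                       # wavenumber in 1/wavelength units
    steer = np.deg2rad(steer_deg)
    phases = -k * d * np.arange(n_el) * np.sin(steer)   # steering phases

    angles = np.deg2rad(np.linspace(-90, 90, 181))
    af = abs(sum(np.exp(1j * (k * d * i * np.sin(angles) + phases[i]))
                 for i in range(n_el))) / n_el          # array factor

    print("beam peaks near %.0f degrees" % np.rad2deg(angles[np.argmax(af)]))
    # ~30 degrees, steered purely by phase - no moving parts.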
When whitespace in television bands went unlicensed i don't know how much of that we saw: https://www.fcc.gov/general/white-space
I feel like the barrier may be whether dedicated hardware is required or not. In such a large band, 6 GHz, I would expect a lot of generalized (i.e. non-dedicated) platform hardware to be developed & offered allowing software-focused innovators to offer into the long tail of applications, including mesh network(s).
DECT NR+: A technical dive into non-cellular 5G:
https://devzone.nordicsemi.com/nordic/nordic-blog/b/blog/pos...
0: https://www.espressif.com/en/solutions/low-power-solutions/e...
Sure, you could go with an off-the-shelf solution, but those aren't always a good fit for whatever product you're developing. If you can squeeze your bandwidth requirements to an absolute minimum, say, tens of kilobits, you can accomplish absolutely nutty stuff with a cheap LoRa radio and a well-designed antenna. Sure, it's tens of kilobits, but it's tens of kilobits sent and received over a mile between devices with batteries the size of your pinky.
By that I mean that they're easily blocked, diffracted, whatever.
Cite?
Rather than fight this by trying to shout as loud as you can from a single AP across the house, you can put smaller, weaker APs in multiple rooms. Because of the excellent open-air propagation and high frequency, you can get multi-gig links with no interference or competition.
It increases the number of "VLP" channels.
6GHz has 3 modes of operation:
1) VLP (Very Low Power): can now happen across the full 1200 MHz (5925 MHz to 7125 MHz); previously it was only 850 MHz. 25 mW (14 dBm) max power with -5 dBm/MHz PSD; indoor and outdoor usage.
Think of short-range use cases like smartphone to laptop or smartphone to earbuds/AR-VR.
2) LPI (Low Power Indoor): already allowed in the full 1200 MHz. 1W (30 dBm) max power with 5 dBm/MHz PSD (clients are 6 dB lower); indoor usage only.
Think of your home router.
3) SP (Standard Power): allowed in 850 MHz; no plan to expand AFAIK. 4W (36 dBm) max power with 23 dBm/MHz PSD (clients are 6 dB lower); indoor or outdoor usage.
Requires Automated Frequency Coordination (AFC): send your location to the cloud, and the cloud tells you which channels are available.
Think of enterprise or high-power routers, and outdoor point-to-point links (WISPs).
So, this new regulation is only for VLP and will result in more (especially 320 MHz) channels. No change to the most common usage of Wi-Fi (Router to Laptop/PC).
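One non-obvious consequence of those two caps, sketched with the VLP numbers listed above:

    import math

    # VLP: 14 dBm total cap plus -5 dBm/MHz PSD cap. Narrow channels
    # are PSD-limited; only wide channels reach the full 14 dBm.
    TOTAL_CAP_DBM, PSD_CAP = 14.0, -5.0

    def max_power_dbm(channel_mhz):
        return min(TOTAL_CAP_DBM, PSD_CAP + 10 * math.log10(channel_mhz))

    for bw in (20, 40, 80, 160, 320):
        print("%3d MHz channel: max %.1f dBm" % (bw, max_power_dbm(bw)))
    # 20 MHz tops out at 8 dBm; 80 MHz and wider reach the full 14 dBm,
    # which is one reason VLP favors the wide channels this order adds.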
This will allow better channel availability (lower latency, higher throughput) for mobile applications in very dense areas.
Spectrum allocation is very weird.
* indoor unobstructed environments
* outdoor point-to-point line-of-sight
It is a holiday miracle for small low power handheld devices.
We'll need to know the ERP limits for these bands before designing any changes.
However, hypothetically even 5W to 8W could open space networks (C band)
Best regards, =3
Is it really? Isn't the signal blocked if a person simply walks between the devices? I.e., if you are wearing a receiver and just turn around, you will lose the connection.
Best regards =3
Pretty much everything from VHF and up (all of wifi) is line of sight only. Not just the mm-wave stuff. Cell phone networks only work because the telcos pay big money to get their transceivers power/data up on towers high above the terrain.
Store and forward meshes assume the nodes will eventually see each other but down at 5ft above the ground they often won't.
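Rough numbers from the standard 4/3-earth radio-horizon approximation:

    import math

    # Radio horizon with 4/3-earth refraction: d_km ~ 4.12 * sqrt(h_m).
    def horizon_km(height_m):
        return 4.12 * math.sqrt(height_m)

    for h in (1.5, 10, 50, 200):             # handheld .. tall tower
        print("antenna at %5.1f m -> horizon ~%5.1f km" % (h, horizon_km(h)))
    # Two stations are line-of-sight when the sum of their horizons
    # exceeds their separation (terrain permitting) - hence the towers.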
I hope other jurisdictions follow suit so hardware using it can be cheaper due to economies of scale. The segmentation of LoRa radios between US/EU is already pretty annoying, and they're fairly niche.
> expand very low power device operations across all 1,200 megahertz of the 6 GHz band alongside other unlicensed and Wi-Fi-enabled devices.
Unless I am missing something, this means Wifi6 currently operates in this range.
I wonder if this was in motion for a while and then intentionally accelerated to ensure it happens under Biden.
Optically it's a pretty pure win. Open stuff sounds good. Fewer regs sound good. Tech sounds good. And it's not something that has a corresponding voting bloc opposing it. Just pure upside politically.
Either party would love that.
UV starts at 800,000 GHz.
The 6 GHz being discussed here is completely non-ionizing, not even comparable to UV.
The only concern with 6 GHz is that it can also cause dielectric heating, which is the same mechanism as a microwave oven. But again, at 25mW, you can't even feel the heat from direct contact with the antenna, let alone a few meters away. Your exposure follows the inverse-square law [1], which means that it drops proportional to the square of the distance. So if it's not a problem at 10cm, it's 100x less of a non-problem at 1m.
E.g.: you are much less likely to get sunburn if you get plenty of natural (or artificial) infrared.
If nature gave us a flute, and man discovered how to make a bass guitar, although they sound different the only real difference is that the bass guitar is wiggling air molecules more slowly than a flute would. There is zero, nil, no distinction whatsoever between a "natural" and a "synthetic" photon wiggling at a given frequency.
> you are much less likely to get sunburn if you get plenty of natural (or artificial) infrared.
I'm gonna need to see a source for that.
This is categorically untrue.
Polarization: A Key Difference between Man-made and Natural Electromagnetic Fields, in regard to Biological Activity
> All types of man-made EMFs/EMR - in contrast to natural EMFs/EMR - are polarized.
which is so unbelievably wrong, like, completely out of touch with reality levels of incorrect, that I see no value in reading further.
Linear Polarization Antennas in Radio and Wireless Communication Systems
https://resources.system-analysis.cadence.com/blog/msa2021-l...
and
A standard incandescent lightbulb emits on the order of 100W of unpolarized electromagnetic radiation, most of it infrared and some of it visible light around 400THz to 750THz. It is manmade, it's an EM emitter, it is not polarized, and it's something everyone older than the age of around 10 has spent their entire lives around.
So either the author is completely wrong in sentence number 2, or they're implying that visible light isn't RF. Either way, they're wrong, and you can ignore the rest of their claims.
You are comparing light bulbs to wireless communications? What is your point? He says "All types of man-made EMFs/EMR", not "all types of manmade energy". It is clear he does not think that light bulbs are dangerous. So now you are just being confusing on purpose to muddy the water.
But if you bother to READ the whole article you would see he agrees with you:
"Natural EMR/EMFs (cosmic microwaves, infrared, visible light, ultraviolet, gamma rays) and several forms of artificially triggered electromagnetic emissions (such as from light bulbs with thermal filaments, gas discharge lamps, x-rays, lasers, etc.) are not polarized. "
You know we were talking about EMFs from data-communication types of man-made EMFs/EMR, but you are being ignorant on purpose, because you cannot even read anything that is new and conflicts with your ideas.
Visible light and Wi-Fi are the same physical phenomenon, just at different frequencies.
> several forms of artificially triggered electromagnetic emissions (such as from light bulbs with thermal filaments, gas discharge lamps, x-rays, lasers, etc.) are not polarized.
So, he contradicts himself.
Also:
> Natural EMR/EMFs (cosmic microwaves, infrared, visible light, ultraviolet, gamma rays) [...] are not polarized.
Oh yes they absolutely can be, and frequently are. Polarized sunglasses are specifically made to block the polarized light reflecting off lakes, snow, sand, or other surfaces. Does the author consider light reflecting off a lake to be unnatural, or is it the OK kind of polarized because it's "natural"?
I'm pretty sure their point is that certain frequencies are getting a lot more power than is naturally possible. Not that the photons are special in some way.
But even then, it's impossible to discuss without talking about relative strengths. Wi-Fi transmits at about 100mW at full strength. For math purposes, let's assume it's a point source broadcasting in all directions. (That's not that much of a wild assumption, either.) The surface area of a sphere with a radius of 1m is about 12.5 m^2. On average, then, the Wi-Fi RF strength at 1m away is about 0.008W/m^2.
The sun above us delivers about 1360W/m^2 of RF radiation, or approximately 170,000 times the radiation of standing a meter from a Wi-Fi router. If it's across the room, 4m away, the ratio is closer to 3,000,000:1.
Even if our bodies responded to "man-made" radiation differently than the "good, natural" kind, there's so very little of it relatively that it can't make much of a difference. I mean, ever look at a 100W lightbulb? If Wi-Fi were at 400THz instead of 2.4GHz so that you could see it, it would be one thousandth as bright. There's just not enough power there to do anything meaningful to us.
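Those figures check out; a sketch assuming the same isotropic 100 mW source:

    import math

    P_WIFI_W, SOLAR_W_M2 = 0.1, 1360.0

    def wifi_density(r_m):
        # Isotropic source: power spread over a sphere of area 4*pi*r^2
        return P_WIFI_W / (4 * math.pi * r_m ** 2)

    for r in (1.0, 4.0):
        d = wifi_density(r)
        print("at %.0f m: %.6f W/m^2, sun/Wi-Fi ~ %.0f:1"
              % (r, d, SOLAR_W_M2 / d))
    # ~0.008 W/m^2 (~170,000:1) at 1 m; ~0.0005 W/m^2 (~2,700,000:1) at 4 m.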
Unless specific frequency bands cause problems because something very specific is triggered.
Sure, wifi may only be hitting you with 1 milliwatt per square meter. But between 2.4GHz and 2.5GHz the sun only hits you with... if I did the math right, and just accounting for blackbody emissions, around 10 picowatts per square meter.
We're probably fine, but whether it's fine can't be proven with a simple physics calculation that ignores spectrum.
[edit]
Another example of non-ionizing radiation harming human tissue would be if you stick your hand in front of a cutting laser. Maybe obvious, but you asked...
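For what it's worth, a naive Rayleigh-Jeans estimate of that 2.4-2.5 GHz solar figure (treating the photosphere as a 5778 K blackbody and ignoring the radio-bright corona, so it's a floor) comes out even lower than 10 pW/m², which if anything strengthens the in-band comparison:

    # Rayleigh-Jeans estimate of solar flux in 2.4-2.5 GHz.
    K_B, C = 1.380649e-23, 2.998e8          # Boltzmann (J/K), light speed (m/s)
    T_SUN, OMEGA_SUN = 5778.0, 6.8e-5       # photosphere K, sun's solid angle (sr)

    nu, bw = 2.45e9, 0.1e9                  # band center and width, Hz
    b_nu = 2 * nu**2 * K_B * T_SUN / C**2   # spectral radiance, W/(m^2 Hz sr)
    flux = b_nu * OMEGA_SUN * bw            # W/m^2 in the band

    print("%.1e W/m^2 (~%.2f pW/m^2)" % (flux, flux * 1e12))  # ~0.07 pW/m^2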
25mW is nothing.
"Abstract: We consider the influence of a terahertz field on the breathing dynamics of double-stranded DNA. We model the spontaneous formation of spatially localized openings of a damped and driven DNA chain, and find that linear instabilities lead to dynamic dimerization, while true local strand separations require a threshold amplitude mechanism. Based on our results we argue that a specific terahertz radiation exposure may significantly affect the natural dynamics of DNA, and thereby influence intricate molecular processes involved in gene expression and DNA replication."
Polarization: A Key Difference between Man-made and Natural Electromagnetic Fields, in regard to Biological Activity
https://www.nature.com/articles/srep14914
Ion channels seem to be affected by EMFs even at powers that we are subjected to every day.
Becker, in his research, found that lower power levels could have greater biological effects, everything else being equal, than higher power levels.
Eg of research indicating we should at least do more deep research before calling it "Safe": https://pmc.ncbi.nlm.nih.gov/articles/PMC9189734/
So, obviously you don’t want to microwave your eyeballs, but you’d feel that in other nearby tissues as heat. If you don’t feel heat from a non-ionizing RF source, you’re not getting cooked. In any case, the amount of infrared coming off an incandescent lightbulb is about 3 orders of magnitude higher than the energy coming off a WiFi router antenna. If being in the room with a lightbulb is safe, so is being in the room with WiFi.
There isn’t a set of rules of physics where low-power, non-heating, non-ionizing RF is dangerous, and also where CPUs work. They’re incompatible. You can’t have both of those at the same time.
Please elaborate on this? But it sounds like you're overgeneralizing. There's a lot of ways non-ionizing RF could potentially be "dangerous" to some kind of biological tissue, we just haven't found those ways in humans.
For one category of mechanism, there's plenty of proteins that absorb certain wavelengths and activate cellular pathways based on the amount they receive.
So far, the studies that have been well-designed and replicated haven't consistently nailed down a clear causal link between non-thermal EMF exposure (within the limits that regulators consider safe) and actual health problems. Still, some researchers argue that we're not accounting for all the slow-burn, cumulative effects that might be happening. It's not easy to tease out these subtle influences from the noise of environmental variables, and that makes it hard to really say we've got a handle on the whole picture. Check out Prof Michael Levin's bioelectricity work if you want to go down a very interesting rabbit hole about what we're only recently discovering about how our biology might really work and how electricity and EMFs shape it.
The ham radio licensing procedure in the US mostly focuses on this effect. Even though there's nothing conclusive I'd imagine there are other deleterious effects that aren't trivially measurable. If it can heat it up it can do other stuff too. Cooking your brain by standing too close to a high power transmission tower can't be good.
I'm an amateur extra, I would challenge any "scientist" laughing EMF dangers off to go find the nearest AM radio tower and spend 6 months in the transmission room for "science".
Without sarcasm, the studies I have found over the years ruled out cumulative effects (unlike ionizing radiation). They so far haven't been able to rule out various types of cancer, ALS, or other diseases caused by long-term exposure.
I am plausibly a "scientist" who has done "science", and I'm not standing next to a AM radio tower for precisely the same reason I wouldn't stand next to a 50,000W light bulb, EMF be darned.
It's not on studies to rule out all the things you mention. The job of studies is to demonstrate that EMF does cause any of them. "EMF is safe" is falsifiable: if you can find one counterexample, it's untrue. And yet, after all the years we've been working with it, other than people who get cooked from sheer power levels, we don't have any proof that it causes those (or any) diseases.
One of the things being pointed to is these EMFs affecting ion channels. The TRPV1 receptor is one of these channels. The TRPV1 receptor is a heat receptor but has many functions. Since this receptor is in the skin, 5G and 6G can affect it. The receptor pumps calcium into the cell, and any neurologist will tell you what that can do.