Some really informative posts on this thread. A few things I'll comment on. Keep in mind I'm an aviation and radio enthusiast and not a professional avionics designer. I realize I'm oversimplifying things, but I feel the general concepts are worth highlighting. Apologies in advance for anything I got wrong.
iamlucky13 wrote:In this case, on one hand, we have radar altimeter receivers that have to be sensitive enough that they can detect the non-directional reflected signal from a 1/2 W transmitter.
On the other hand we have cell towers broadcasting in the range of 100W, commonly with a directional beam, so RF level is even more intense, and being received line-of-sight, not reflected. This signal is both transmitted at far higher power, and it experiences far lower attenuation.
Again, this is outside my expertise, but it sounds to me like the huge difference in power levels likely makes faint, stray out-of-band transmission more significant, and weak out-of-band reception more critical. I'd appreciate correction from any electrical engineers, especially if they have RF experience, if I'm off-base here.
Thanks for your very informative post.
The key part of your first sentence is 'reflected', since a RADALT bounces its signal off the ground. Typically, earth is not a great reflector. Sometimes you get lucky and you're bouncing off water or something metallic, but most of the time it's vegetation, which is not a good conductor of electricity.
https://en.wikipedia.org/wiki/S_meter has a chart showing the actual receive voltages seen in typical radio applications. The signal voltages are measured in millionths of a volt. In terms of power,
https://en.wikipedia.org/wiki/DBm shows that these signal powers work out to picowatts, i.e. trillionths of a watt.
https://www.britannica.com/technology/radar/Pulse-radar also suggests radar return signals are at the picowatt level. So the output of a RADALT is typically at the half-watt level, but the return is down at the trillionths-of-a-watt level.
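To put those units side by side, here's a quick back-of-the-envelope sketch in Python (assuming the usual 50 ohm receiver input impedance; the example numbers are just illustrative):

import math

def dbm_to_watts(dbm):
    # dBm is decibels relative to 1 milliwatt
    return 1e-3 * 10 ** (dbm / 10)

def microvolts_to_dbm(uv, impedance_ohms=50.0):
    # P = V^2 / R, with the voltage converted from microvolts to volts
    watts = (uv * 1e-6) ** 2 / impedance_ohms
    return 10 * math.log10(watts / 1e-3)

print(dbm_to_watts(-100))     # 1e-13 W, i.e. 0.1 picowatt
print(microvolts_to_dbm(50))  # 50 microvolts (a strong S9 signal) is only about -73 dBm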
The faa.gov/5g chart published in this thread says the US 5G transmit power levels are in the 1.5 kilowatt range. At the risk of oversimplifying things, I don't think it's hard to see how a 1500 watt direct-path signal can overwhelm a 0.5 watt signal reflected off the earth. The receivers are designed to block out-of-band signals, but as other posters have stated, it's not a simple problem to solve. The more filtering you do in the front end, the more signal you lose and the more distortion you introduce. There is lots of 'prior art' to look at, but each solution ends up being quite unique.
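For a rough feel of the imbalance, here's a crude sketch (the distances, the 10% ground reflectivity, and the assumption of isotropic antennas are all made-up illustrative numbers, not anything from the actual analyses):

import math

def fspl_db(distance_m, freq_hz):
    # free-space path loss between isotropic antennas
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / 3e8)

def dbm(watts):
    return 10 * math.log10(watts / 1e-3)

# Direct path: a 5G base station seen by an aircraft on approach.
# 1500 W EIRP and a 1 km slant range are illustrative guesses.
tower_at_aircraft = dbm(1500) - fspl_db(1000, 3.8e9)

# Reflected path: a 0.5 W radalt signal travelling 300 m down to the ground
# and 300 m back, with only 10% of it reflected by vegetation. Very crude -
# it ignores antenna gains and the proper radar equation.
radalt_return = dbm(0.5) - fspl_db(600, 4.3e9) + 10 * math.log10(0.1)

print(round(tower_at_aircraft, 1))  # about -42 dBm
print(round(radalt_return, 1))      # about -84 dBm, i.e. picowatts and ~40 dB weaker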
I had a friend who worked on military radars, and their problems were often sorting out tiny variations in how various parts were made. They learned an awful lot about screening parts as they arrived, before putting them into inventory as spares or for new builds. They still ended up in firefighting drills more often than you would think.
Snowfalcon wrote:Good points. A few brief comments about the 5G cell towers from someone formerly into spectrum management.
Although the maximum power may be 100W, the system's power control functions always reduce the power on any individual handset connection to the minimum needed to keep the connection alive with an acceptable bit error rate, which is typically milliwatts to a few watts. Full power is typically used only briefly at connection establishment or when the handset is at the far edge of a large cell. This is a basic function in order to keep same-channel interference down within the system (i.e. at the next same-channel base station/cell tower).
5G introduces a new "NR" radio interface which includes directional beam forming. This may be a double-edged sword, if you think of an approaching aircraft where passengers try to check in to their networks and thereby make beams point towards the aircraft and its radio altimeter antenna. However, the power control still reduces the beams' power to the minimum required. The upside is that the system can be configured to not point any beams in the direction of approach and departure paths. This should reduce the amount of 5G power reaching radio altimeter antennas. Just my 0.02.
Excellent points. These advanced power management and beam forming techniques reduce the nominal power levels but engineers still need to account for the worst case behavior. It makes for a complicated test envelope.
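As a toy illustration of the closed-loop power control described above (the step size, limits, and fake channel model are purely illustrative, not any real 3GPP algorithm):

def power_control_step(tx_dbm, measured_ber, target_ber, step_db=1.0,
                       min_dbm=-10.0, max_dbm=50.0):
    # If the link is better than it needs to be, back the power off;
    # if it is worse, turn it up. Clamp to the hardware limits.
    if measured_ber < target_ber:
        tx_dbm -= step_db
    else:
        tx_dbm += step_db
    return max(min_dbm, min(max_dbm, tx_dbm))

# Fake channel: the link is comfortably good whenever tx power is above 20 dBm.
tx = 50.0  # start near full power, as at connection setup
for _ in range(40):
    ber = 1e-6 if tx > 20 else 1e-2
    tx = power_control_step(tx, ber, target_ber=1e-3)
print(tx)  # hovers around 20 dBm, the minimum that keeps the link alive in this toy model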
miegapele wrote:So much drama about impending doom, and now almost all Airbuses are already approved 24 hours after launch, and some Boeings are too. So why was this not done a year or two ago? FAA continues to look like a joke.
In other words, brinksmanship worked. Necessity is the mother of invention.
It's been pointed out a few times in this thread that the avionics vendors were reluctant to share data with the industry groups that were trying to sort things out in advance. Seems that getting your product grounded by the regulator changes the terms and conditions on what they're willing to share.
kalvado wrote:My impression is that, as is typical these days, the FAA set the AMOC bar at tripping-hazard level. No information on what exactly is approved in terms of conditions. Are all previously authorized combinations of weather within existing minimums, airport (with reduced coverage areas?), and airplane with AMOCed equipment still authorized? Could be that approvals are conditional, on an "even if it doesn't work, it is not a big deal" basis.
There is basically a total lack of information - both legal and technical - which is unfortunately typical for the industry these days. I am not sure what is being hidden - actual trade secrets or lack of competence.
Indeed, there is a feel of "don't look behind the curtain" to all of this.
kalvado wrote:You're certainly right, more power from the other guy makes it more difficult to deal with. However, keep in mind - low power Bluetooth and microwaves often coexist in the same kitchen, and that is a much higher threshold to clear. Same for GPS receivers in cell environment - GPS signals are way weaker than anything in radalt world, while frequencies are pretty packed as well.
Well said. As I said above, I probably am oversimplifying. Receivers already have a hard job. Nearby high power transmitters make it harder.
kalvado wrote:One take-home message from the pdf - many devices do not have crystal oscillators as stable frequency sources, and those few that do don't bother to stabilize them. That alone dates the designs to the stone age.
Right, and when you are working at gigahertz levels (billions of cycles per second), stability needs to be in the parts-per-million range or better just to keep your signal where you expect it within the band. Most people I know working in this space use temperature compensated crystal oscillators, often with 'GPS disciplining' for improved stability. Radio hackers like to salvage temperature compensated crystal oscillators out of retired previous-generation cell tower equipment to build GPS-disciplined oscillators.
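Some rough numbers on what that stability means at radio altimeter frequencies (the ppm figures below are typical ballpark values, not from any particular part's datasheet):

center_hz = 4.3e9  # middle of the 4.2-4.4 GHz radio altimeter band

def drift_hz(center_hz, stability_ppm):
    # frequency error corresponding to a given fractional stability
    return center_hz * stability_ppm * 1e-6

for label, ppm in [("cheap crystal, ~20 ppm", 20),
                   ("TCXO, ~0.5 ppm", 0.5),
                   ("GPS-disciplined oscillator, ~0.001 ppm", 0.001)]:
    print(f"{label}: about {drift_hz(center_hz, ppm):,.0f} Hz of error at 4.3 GHz")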
Boeing12345 wrote:This is the first time in my career that an AMOC has an expiration date. These AMOCs (12 in total) expire on Jan-31 and do not list the same approved airports/runways. So airframe models operating with mixed altimeters have to monitor this incredibly closely.
This is pretty discouraging. It makes it seem like they just kicked the can down the road yet again. Note how the FAA 'Newsroom' pieces don't mention the expiration dates at all.
Boeing12345 wrote:The Airbus with ERT-530 (as one example) did not receive the same relief as other altimeters from an airport/runway standpoint. Some models have different altimeters installed as used/mixed aircraft are bought and inducted. As an example, Airbus 319 airframe 1234 is approved for a different airport/runway than 319 airframe 4567 because of the brand/model of altimeter installed. Imagine working in dispatch or flight ops and dealing with this?
Indeed, a big juggling act to follow.