RobertS975
Topic Author
Posts: 759
Joined: Sun Aug 14, 2005 2:17 am

Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 5:11 pm

LHR = EGLL
EDI = EGPH
CDG = LFPG

At least most USA airports keep the same code with a K in front as in KLAX or KPDX.

What is the reason for the 4 letter airport codes and why are they so different from the 3 letter airport codes?
 
navymmw
Posts: 198
Joined: Sun Jun 03, 2007 9:30 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 5:14 pm

LHR is London Heathrow's IATA code, while EGLL is its ICAO (International Civil Aviation Organization) code.
 
blueflyer
Posts: 3657
Joined: Tue Jan 31, 2006 4:17 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 5:17 pm

3-letter code = used for reservation and luggage
4-letter code = used for navigation purposes
That's the two-line explanation.
If you'd like more, Wikipedia is actually decent on this.
http://en.wikipedia.org/wiki/Interna...Aviation_Organization_airport_code
Recep Tayyip Erdoğan has no clothes.
 
PSU.DTW.SCE
Posts: 6118
Joined: Mon Jan 28, 2002 11:45 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 5:31 pm

In the US there are a few that have a different 3-letter IATA code than the ICAO letters following the "K":

KUNV - SCE: State College / University Park, PA
KHXD - HHH: Hilton Head, SC
 
kl911
Posts: 3981
Joined: Mon Jul 21, 2003 1:10 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 5:40 pm

Quoting blueflyer (Reply 2):

3-letter code = used for reservation and luggage
4-letter code = used for navigation purposes

3-letter code = easy to remember, and supported by A.net. Just hover the mouse over the code.
4-letter code = hard to remember, not supported, and hardly used around the world.
 
KELPkid
Posts: 5247
Joined: Wed Nov 02, 2005 5:33 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 5:58 pm

And if you really want to get confused,

Technically, most large US fields have 3 (!) different identifiers. Sometimes, they can all be different.

The first is the ICAO: KXXX

Then, you have the IATA identifier: XXX

Finally, the FAA assigns a 3-character LID (Location ID) to the field (usually used by private pilots for logbook entries). The characters may be a mix of letters and digits, but there has to be at least one letter in a LID. Usually, the LID and IATA code match up, but there are exceptions...
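The constraint described above (3 characters, at least one of them a letter) is easy to sketch. This is only an illustrative check based on that description, not the FAA's actual validation rules, which have more nuance:

```python
import re

# Hedged sketch: accept a 3-character identifier of capital letters and digits
# with at least one letter, matching the description above (e.g. "5T6", "CZK").
def looks_like_lid(ident: str) -> bool:
    return bool(re.fullmatch(r"[A-Z0-9]{3}", ident)) and any(c.isalpha() for c in ident)

print(looks_like_lid("5T6"))   # True
print(looks_like_lid("123"))   # False: no letter
```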
Celebrating the birth of KELPkidJR on August 5, 2009 :-)
 
Viscount724
Posts: 19065
Joined: Thu Oct 12, 2006 7:32 pm

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 6:04 pm

Quoting kl911 (Reply 4):
Quoting blueflyer (Reply 2):

3-letter code = used for reservation and luggage
4-letter code = used for navigation purposes

3-letter code = easy to remember, and supported by A.net. Just hover the mouse over the code.
4-letter code = hard to remember, not supported, and hardly used around the world.

I agree it's much simpler if the IATA code is used on A.net, as most people are far more familiar with those codes since that's what they use to book flights and what they see on tickets and baggage tags, and most can decode them by mousing over. Many can also be guessed: for example, most people can probably guess that AMS is Amsterdam more easily than EHAM, that LHR is Heathrow more easily than EGLL, or that NRT is Narita more easily than RJAA.

However it's far from true to say that the ICAO codes are "hardly used". They're the codes used for virtually all operational purposes such as flight plans and air traffic control. Pilots and air traffic controllers are no doubt more familiar with ICAO codes than IATA codes.
 
RobertS975
Topic Author
Posts: 759
Joined: Sun Aug 14, 2005 2:17 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 9:29 pm

But why do we need two different code systems? Why can't we simply use the IATA 3 letter codes all the time?

Now, I am an instrument rated commercial pilot (not ATP, not airline) in the US, and every flight plan I have ever filed used 3 letter codes, including cross-border flights to Canada.
 
A342
Posts: 4017
Joined: Sun Jul 31, 2005 11:05 pm

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 9:40 pm

Quoting RobertS975 (Thread starter):
What is the reason for the 4 letter airport codes and why are they so different from the 3 letter airport codes?
Quoting RobertS975 (Reply 7):
But why do we need two different code systems? Why can't we simply use the IATA 3 letter codes all the time?

3-letter IATA codes are usually only given to airports served by commercial airlines.
But there are "only" 26³ = 17576 combinations.
However, there are more airfields than that around the globe, most of which don't have commercial air service. With the 4-letter codes, there are 26⁴ = 456976 combinations, which should be more than enough.
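The arithmetic above is easy to verify:

```python
# Number of possible purely alphabetic codes of each length.
iata_space = 26 ** 3  # three-letter IATA codes
icao_space = 26 ** 4  # four-letter ICAO codes
print(iata_space, icao_space)  # 17576 456976
```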

Hope that helps a bit.
Exceptions confirm the rule.
 
Goldenshield
Posts: 5008
Joined: Sun Jan 14, 2001 3:45 pm

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 9:50 pm

Quoting KELPkid (Reply 5):
Finally, the FAA assigns a 3-character LID (Location ID) to the field (usually used by private pilots for logbook entries). The characters may be a mix of letters and digits, but there has to be at least one letter in a LID. Usually, the LID and IATA code match up, but there are exceptions...

I disagree that they are only used for logbook entries. These particular idents are meant for small airfields that 1) don't have an instrument procedure (there are some exceptions here) and 2) are not ports of entry, i.e. they are domestic only, and you would not file to them from outside the U.S. Also, you could not file IFR to them (except for those few aforementioned exceptions) and instead would have to file a composite flight plan (VFR/IFR combo) to reach them.
Two all beef patties, special sauce, lettuce, cheese, pickles, onions on a sesame seed bun.
 
corey07850
Posts: 2335
Joined: Wed Feb 04, 2004 4:33 am

RE: Why Are There Multiple Codes For The Same Airport?

Tue Apr 20, 2010 11:33 pm

Quoting kl911 (Reply 4):
4-letter code = Hard to remember, not supported, and hardly used around the world.

As the other poster said, this is far from the truth. In an operational sense, ICAO codes are the standard. Pilots, ATC, navigation databases, etc. all use the 4-letter ICAO code. One reason is that a lot of navaids share the same 3-letter ID as an airport... Not sure if they are co-located or not, but the JFK VOR obviously has the same 3 letters as the airport. The 3-letter IATA codes are mainly used for anything related to passengers, such as luggage tags, reservations, etc. I suppose it's just easier to deal with 3 letters vs. 4. The ICAO standard is actually pretty easy to understand, as there is a consistent format followed around the world. If you are familiar with how it works, you can usually figure out what airport an ICAO code stands for, and vice versa.

Quoting RobertS975 (Reply 7):
Now, I am an instrument rated commercial pilot (not ATP, not airline) in the US, and every flight plan I have ever filed used 3 letter codes, including cross-border flights to Canada.

I believe the FAA standardized this around 2 years ago so that all flight plans must use the 4-letter ICAO code. Not sure how you filed, but it probably got automatically switched to the ICAO code on the filing strip.
 
bohica
Posts: 2308
Joined: Tue Feb 10, 2004 3:21 pm

RE: Why Are There Multiple Codes For The Same Airport?

Wed Apr 21, 2010 1:28 am

Quoting Corey07850 (Reply 10):
Quoting RobertS975 (Reply 7):
Now, I am an instrument rated commercial pilot (not ATP, not airline) in the US, and every flight plan I have ever filed used 3 letter codes, including cross-border flights to Canada.

I believe the FAA standardized this around 2 years ago so that all flight plans must use the 4-letter ICAO code. Not sure how you filed, but it probably got automatically switched to the ICAO code on the filing strip.

For most USA airports, the 3-letter IATA code is preceded by the letter K to form the ICAO code. For example, JFK = KJFK.
For most (if not all) of Canada, the IATA code is preceded by the letter C. I'm sure while you're filing your flight plan, the person at the other end is just adding the K or C to the IATA code.
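As a hedged sketch of that rule of thumb (the exceptions listed elsewhere in this thread, like SCE/KUNV, HHH/KHXD, and CXH/CYHC, mean a real system needs a lookup table rather than a prefix rule):

```python
# Rule of thumb from the post above: contiguous-US ICAO = "K" + IATA,
# Canadian ICAO = "C" + IATA. Illustrative only; many airports deviate.
PREFIX = {"US": "K", "CA": "C"}

def guess_icao(iata: str, country: str) -> str:
    prefix = PREFIX.get(country)
    if prefix is None:
        raise ValueError(f"no simple prefix rule for {country}")
    return prefix + iata

print(guess_icao("JFK", "US"))  # KJFK
print(guess_icao("YVR", "CA"))  # CYVR
```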
 
Fly2HMO
Posts: 7207
Joined: Sat Jan 24, 2004 12:14 pm

RE: Why Are There Multiple Codes For The Same Airport?

Wed Apr 21, 2010 2:32 am

Quoting RobertS975 (Reply 7):
But why do we need two different code systems? Why can't we simply use the IATA 3 letter codes all the time?

You have many more possible combinations with ICAO codes, and from the first letter of the code you can tell what part of the world the airport is in.
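For illustration, here is a first-letter region table built only from codes mentioned in this thread; the actual ICAO allocation covers many more prefixes:

```python
# First letter of an ICAO code -> world region, restricted to examples
# that appear in this thread.
ICAO_REGION = {
    "K": "contiguous United States (KJFK, KLAX)",
    "C": "Canada (CYHC)",
    "E": "northern Europe (EGLL London Heathrow, EHAM Amsterdam)",
    "L": "southern Europe (LFPG Paris CDG)",
    "R": "east Asia (RJAA Tokyo Narita)",
    "P": "Pacific, incl. Alaska and Hawaii (PHNL, PANC, PGUM)",
    "T": "Caribbean (TJSJ San Juan, TISX St. Croix)",
    "N": "South Pacific (NTSU Pago Pago)",
}

def region_of(icao: str) -> str:
    return ICAO_REGION.get(icao[0], "prefix not in this table")

print(region_of("EGLL"))
```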

Quoting RobertS975 (Reply 7):

Now, I am an instrument rated commercial pilot (not ATP, not airline) in the US, and every flight plan I have ever filed used 3 letter codes, including cross-border flights to Canada.

Because you are filing the "wrong" way   

The guy at the FSS just switches the code to ICAO, or the LID.

I always filed ICAO, or LID for smaller uncontrolled airports.
 
KELPkid
Posts: 5247
Joined: Wed Nov 02, 2005 5:33 am

RE: Why Are There Multiple Codes For The Same Airport?

Wed Apr 21, 2010 4:36 am

Quoting Corey07850 (Reply 10):
I believe the FAA standardized this around 2 years ago that all flight plans must use the 4 letter ICAO. Not sure how you filed but it probably got automatically switched to the ICAO code on the filing strip.

I have always filed the LID for US domestic flights, and ICAO if I'm crossing any borders...then again I haven't flown since 2006   DUATS used to accept that.
Celebrating the birth of KELPkidJR on August 5, 2009 :-)
 
KELPkid
Posts: 5247
Joined: Wed Nov 02, 2005 5:33 am

RE: Why Are There Multiple Codes For The Same Airport?

Wed Apr 21, 2010 4:53 am

Quoting goldenshield (Reply 9):
I disagree that they are only used for log book entries.

Well, I never said "only..."  
Quoting goldenshield (Reply 9):
These particular idents are meant to for small airfields who 1) don't have an instrument procedure (There are some exceptions here,)

Oh yeah, there are... 5T6 comes to mind; it has a GPS approach. IIRC, the FAA requirements for an all-letter LID relate to weather reporting capabilities (which, in turn, are usually associated with an instrument approach on the field...), but many older fields that already had a three-letter LID were "grandfathered in" (like CZK, Cascade Locks, Oregon).

Quoting goldenshield (Reply 9):
and 2) are not ports of entry, i.e.; they are domestic only, and you would not file to them from outside of the U.S.

Once again, 5T6... it was, at one time, a designated airport of entry from Mexico. You could file to it, although US customs required 24-hour prior notification (they had to send customs agents from ELP to drive to the airport). It no longer has airport of entry status, though
Celebrating the birth of KELPkidJR on August 5, 2009 :-)
 
Fabo
Posts: 1150
Joined: Tue Aug 16, 2005 1:30 am

RE: Why Are There Multiple Codes For The Same Airport?

Wed Apr 21, 2010 12:44 pm

Similar can be said for airlines. Every airline that has a 2-character IATA code (OK, DL, AF, LH) also has a 3-letter ICAO code (CSA, DAL, AFR, DLH). There are also operators that have a 3-letter code but no 2-letter code (e.g. air forces).
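The pairs quoted above, as a small table (the airline names in comments are my annotations, not from the post):

```python
# IATA 2-character -> ICAO 3-letter airline designators from the post above.
AIRLINE_CODES = {
    "OK": "CSA",  # Czech Airlines
    "DL": "DAL",  # Delta Air Lines
    "AF": "AFR",  # Air France
    "LH": "DLH",  # Lufthansa
}
print(AIRLINE_CODES["LH"])  # DLH
```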
The light at the end of tunnel turn out to be a lighted sing saying NO EXIT
 
Viscount724
Posts: 19065
Joined: Thu Oct 12, 2006 7:32 pm

RE: Why Are There Multiple Codes For The Same Airport?

Wed Apr 21, 2010 9:40 pm

Quoting Corey07850 (Reply 10):
It seems the 3 letter IATA codes are mainly used with anything related to passengers such as luggage tags, reservations etc. I suppose it's just easier to deal with 3 letters vs 4.
Quoting Fabo (Reply 15):
Similar can be said for airlines. Every airline that has an IATA 2 character code ( OK, DL, AF, LH ) has a 3letter ICAO code (CSA, DAL, AFR, DLH).

And, like the ICAO airport codes, the 3-letter airline codes are used for operational purposes. More than 20 years ago, airlines had almost agreed to change to 3-letter codes for reservations and ticketing purposes since they were running out of 2-letter combinations. However, the cost of making that change was going to be many millions of $$ and the airlines decided to stay with the 2-letter codes. That's when IATA began using alpha-numeric codes (B6, 7F etc.) which significantly increased the number of codes.
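A rough sense of why alpha-numeric designators helped, ignoring combinations that are reserved or disallowed in practice:

```python
# Size of the 2-character airline designator space.
letters_only = 26 ** 2   # 676 purely alphabetic codes
alphanumeric = 36 ** 2   # 1296 once digits are allowed in either position
print(letters_only, alphanumeric)  # 676 1296
```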

Quoting bohica (Reply 11):
For most (if not all) of Canada the IATA code is proceeded with the Letter C. I'm sure while you're filing your flight plan, the person at the other end is just adding the K or C to the IATA code.

A fair number of IATA codes in Canada don't start with Y, usually smaller airports, but some have scheduled service. And there are cases where the ICAO code is unrelated to the IATA code. For example, Vancouver Harbour (the floatplane base in downtown YVR with many scheduled flights) uses the IATA code CXH but the ICAO code is CYHC.

Quoting bohica (Reply 11):
For most USA airports, the 3 latter IATA code is proceeded with the letter K for the ICAO code. For example JFK = KJFK.

Except Alaska and Hawaii, where the first letter is P (PHNL, PANC, etc.). And the IATA and ICAO codes for airports in some U.S. Territories (Puerto Rico, US Virgin Islands, American Samoa, etc.) are unrelated to each other. Guam is one that seems to follow the Hawaii and Alaska formula, but nearby Saipan doesn't.

Guam - IATA GUM, ICAO PGUM
Saipan - IATA SPN, ICAO PGSN
San Juan - IATA SJU, ICAO TJSJ
St. Croix - IATA STX, ICAO TISX
Pago Pago - IATA PPG, ICAO NTSU
