Airport codes: where do they come from and why are they needed

The airports of Saransk, Russia (internal code CPN) and Copenhagen (international code CPH) have more in common than it might seem; the main thing is not to mix up which one you are flying to.

Airport codes pop up when you search for tickets, and they also appear on ticket stubs, baggage tags and departure boards. An airline can even use a passenger's initials to find them a "namesake" airport, while enthusiasts compile lists of the funniest codes. The logic by which an airport gets its code is not much more complicated than these games.

So what is an airport code, why is it needed, and who invented it?

Why airport codes are needed

When flights became a worldwide phenomenon, universal methods of exchanging information were needed. At the very least, the international community did not want to confuse London in the United Kingdom with London in Canada. To avoid errors, data exchange had to be simple and understandable for pilots and controllers of every country. Today everything connected with passenger air transportation, from booking a flight online to printing a boarding pass, is handled using unique "personal" codes. Any of them can be decoded by a special lookup service. Such codes are assigned not only to airports but also to railway stations and seaports.

What kinds of codes exist

Two international airport coding systems are considered the main ones: ICAO and IATA.

The ICAO system is used by pilots and air traffic controllers. Its codes are four Latin letters that look random but in fact follow a strict structure tying each code to a specific region.

IATA codes are used in international reservation systems and by airport ground services. These are three letters of the Latin alphabet built on a mnemonic principle.

In addition to ICAO and IATA codes, airports have regional combinations, but these are used only within a single country (Russia's system, for example, contains about 3,000 codes). Like an IATA code, a regional code is three letters (sometimes Cyrillic) created as a single whole, without any internal structure that would baffle a layman. All codes in use are unique.

Kazan Airport. IATA code: KZN; ICAO code: UWKD; internal code: KZN.
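
To make the three parallel systems concrete, here is a minimal sketch in Python of how one airport's codes might be kept together; the class and field names are illustrative, not part of any real reservation API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AirportCodes:
    name: str
    iata: str      # 3 Latin letters: booking systems, ground services
    icao: str      # 4 Latin letters: pilots, air traffic control
    internal: str  # country-level code, sometimes Cyrillic

# The Kazan example from above.
kazan = AirportCodes(name="Kazan", iata="KZN", icao="UWKD", internal="KZN")
print(f"{kazan.name}: IATA {kazan.iata}, ICAO {kazan.icao}, internal {kazan.internal}")
```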

How did codification come about?

To streamline the operation of the world's airports, the International Civil Aviation Organization (ICAO) proposed reducing their designations to a unified code system. Each airport received a four-letter code drawn from the Latin alphabet and built according to a specific logic. The combinations turned out to be cumbersome but stayed in service for a long time, used by all the services involved in air transportation. ICAO codes were later displaced in many areas (among carriers, for example) by a simpler system, but they remain relevant in specialized domains such as radio navigation, the exchange of aeronautical information, and weather reporting.

The simpler, more workable code structure was initiated by the International Air Transport Association (IATA). The number of characters was reduced to three, and the code itself was left without internal structure: the three-letter Latin combination came to be perceived not as separate characters but as a single whole, like an idiom. Today there are about 12,000 codes in the world, out of a theoretical maximum of 26³ = 17,576 combinations. The list, with updates and amendments, is published twice a year.
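
The ceiling on the number of combinations is simple arithmetic: three positions, 26 Latin letters each. A two-line check:

```python
import string

# 26 letters in 3 positions: the theoretical ceiling of the IATA code space.
print(len(string.ascii_uppercase) ** 3)  # 17576, of which ~12,000 are in use
```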

Small, often local airports can get by without either an ICAO or an IATA code. And a combination that falls out of use for whatever reason may soon be assigned to another airport.

What do the codes mean?

ICAO Code

ICAO codes are designed according to a strict system. They are tied to the regions indicated by the first two characters of the combination.
The first character stands for a region: E, for example, designates Northern Europe, and L Central and Southern Europe. For large countries the first letter can point directly to the state, for example K for the USA or Y for Australia: there was little logic in how the letters were handed out.
The second character encodes a specific country within the region defined by the first.

The remaining two letters of the combination (three, for large countries) were assigned arbitrarily; they identify the specific airport within that country and region.
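
As a sketch of that structure, here is a toy decoder; its prefix table contains only the letters mentioned above, while the real ICAO allocation list is far larger.

```python
# Toy prefix table built only from the examples in the text.
ICAO_FIRST_LETTER = {
    "E": "Northern Europe",
    "L": "Central and Southern Europe",
    "K": "USA (contiguous)",
    "Y": "Australia",
    "U": "post-Soviet region",
}

def describe_icao(code: str) -> str:
    region = ICAO_FIRST_LETTER.get(code[0], "unknown region")
    # Large single-letter countries (K, Y) keep 3 letters for the airport;
    # elsewhere the second letter names the country and 2 letters remain.
    airport_part = code[1:] if code[0] in "KY" else code[2:]
    return f"{code}: {region}, airport part {airport_part!r}"

print(describe_icao("UWKD"))  # Kazan
print(describe_icao("KLAX"))  # Los Angeles
```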

IATA Code

For airports in Canada and the continental United States, IATA codes are "headless" ICAO codes: the same combination without its first letter. In most other cases (the rest of the world, including Hawaii and Alaska) the two systems' combinations do not match.
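
That relationship is mechanical enough to express in code. A rule-of-thumb sketch, assuming the K prefix for the contiguous USA and C for Canada (consistent with the Vancouver example YVR/CYVR later in the text):

```python
def iata_from_icao(icao: str) -> str:
    """Strip the leading K (contiguous USA) or C (Canada) from an ICAO
    code to get the usual IATA code. A rule of thumb, not a guarantee."""
    if icao[0] not in ("K", "C"):
        raise ValueError("rule applies to continental US and Canada only")
    return icao[1:]

print(iata_from_icao("KLAX"))  # LAX, Los Angeles
print(iata_from_icao("CYVR"))  # YVR, Vancouver
```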

An IATA code is usually a location's name logically compressed to three letters. For example, Hong Kong is encoded as HKG, Ufa as UFA, and the Australian airports of Sydney, Melbourne and Perth are designated SYD, MEL and PER after the first three letters of their cities' names. Besides city names, airport names are often abbreviated too: Rome's FCO (Fiumicino), for example, or Paris's ORY (Orly).
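
The naive pattern behind SYD, MEL and PER fits in one line; a hypothetical helper, since real assignments often deviate from it (see the curiosities below):

```python
def naive_iata_guess(city: str) -> str:
    # First three letters of the name, uppercased: the pattern behind
    # SYD, MEL and PER. Real assignments often deviate from it.
    return "".join(ch for ch in city.upper() if ch.isalpha())[:3]

for city in ("Sydney", "Melbourne", "Perth", "Ufa"):
    print(city, "->", naive_iata_guess(city))
```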

This logic is not without its curiosities. When the management of Sioux Gateway Airport (Sioux City, USA) applied for an IATA code, the association picked what it considered the best letters from the city's name and assigned SUX, which sounds exactly like the English "sucks", extremely eloquent in an English-speaking region. After two futile attempts to change the code, Sioux City relaxed and launched the slogan Fly SUX.

A single IATA code may also cover several airports, because cities with two or more large airports get designations of their own. A search for NYC, for example, will turn up flights from all New York airports: that combination stands not for a specific airport but for every one within the city. Sometimes the same code designates both a large city and one of its airports (usually the largest, or the first one built). Saint Petersburg as a whole and Pulkovo Airport, for example, share the code LED.
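
A booking search can treat such metropolitan codes as a one-to-many lookup. A minimal sketch, assuming New York's three large airports (JFK, LGA, EWR) as the member list:

```python
# Metro codes expand to several airports; ordinary codes resolve to themselves.
METRO_CODES = {
    "NYC": ["JFK", "LGA", "EWR"],  # all large New York airports
    "LED": ["LED"],                # city code and airport code coincide
}

def airports_for(code: str) -> list[str]:
    return METRO_CODES.get(code, [code])

print(airports_for("NYC"))  # ['JFK', 'LGA', 'EWR']
print(airports_for("LED"))  # ['LED']
```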

Internal Codes

Internal combinations were formed to have something in common with either the city's name or its IATA code. But unlike IATA codes, internal codes changed when cities of the former USSR were renamed: the same St. Petersburg (Leningrad), designated ЛЕД in the regional system, became СПТ. In the late 1990s, as major Russian carriers expanded the geography of their flights, some foreign cities received internal Cyrillic codes as well (for example, BTS for Barcelona).

The IATA code of Minsk National Airport is MSQ (the combination MSK, it seems, had already gone to Mastic Point Airport in the Bahamas). Its ICAO code is UMMS (U for what was first the Soviet Union and is now the post-Soviet region, M for Belarus, MS for the airport itself). The ICAO codes of all former Soviet republics that are not now in the EU begin with U. The internal airport code is MIC.

What could go wrong

Several large airports in one city

In that case the first letter of the IATA code usually identifies the city and the next two the airport. The three London airports, for example, are coded on this principle: Heathrow LHR, Gatwick LGW, London City LCY.
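
A short check of that pattern on the London codes from the text:

```python
LONDON = {"Heathrow": "LHR", "Gatwick": "LGW", "London City": "LCY"}

# First letter names the city, the next two the airport.
for airport, code in LONDON.items():
    print(f"{code}: city letter {code[0]!r}, airport part {code[1:]!r} ({airport})")
```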

Left for St. Petersburg, arrived in Leningrad

When a city or an airport gets a new name, the assigned code rarely changes: the procedure is costly and complicated. Thus Beijing (so called in the West since the founding of the People's Republic of China in 1949) is designated PEK after the city's former name, Peking, and Vilnius, for example, has the designation VNO after the city's historical name, Vilno.

There is also this: internationally Astana is encoded TSE, after Tselinograd, while its Russian code is AKL, after Akmola.
Another echo of the past is the X in the designations of some US airports (Los Angeles LAX, Portland PDX). At first airports were coded after the nearest weather station: the station next to Los Angeles, for example, had the code LA. When airports multiplied and three-letter abbreviations had to be introduced, the letter X was simply appended to the existing combinations without much thought.
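
That historical transformation is trivial to state in code; a sketch, assuming PD as Portland's old station code by analogy with LA:

```python
def pad_station_code(station: str) -> str:
    # Append X to a 2-letter weather-station code to reach the 3-letter
    # format: the historical origin of LAX (and, by analogy, PDX).
    return station + "X" if len(station) == 2 else station

print(pad_station_code("LA"))  # LAX
print(pad_station_code("PD"))  # PDX
```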

Latecomers take what's left

Brisbane's identifier is BNE rather than the expected BRI, since Italian Bari is encoded as BRI, and BAR belongs to a small military airfield on an island in the Pacific Ocean. The same story lies behind the X in the code of Dubai International Airport (DXB): DUB had already been taken by Dublin, so Dubai had to work an extra letter into its name.

"C" is for "Canada"

IATA codes of Canada's national airports begin with Y no matter what the city is called (Vancouver YVR, Ottawa YOW). The reason: as radio broadcasting developed, North America was divided geographically into three zones, each with its own letter identifier. US radio stations broadcasting east of the Mississippi River use W, those west of it use K, and broadcasts from Canada are marked with Y.

Acronym from another language

Chile's Mataveri International Airport is the most remote from civilization in the world: it alone receives and dispatches planes on Easter Island. Mataveri's IATA code is IPC. There is logic here, but of a nonstandard kind: the combination comes from the island's Spanish name, Isla de PasCua.
