Poor data quality can have many consequences: marketing campaigns that don’t meet customers’ needs, misdirected bills and bad decisions for lack of valid information. Our experience shows that most airport operational systems are strong enough to validate and reconcile data from a third-party system with their own flight event data. The only missing piece is bringing operational airline sources in. Our “blueprint” describes a smart way of linking interpreted data from the “other side” into the airport operational system. Customizing the interpretation should require only a basic understanding of programming (e.g. in the IT department), so that the operational and technical knowledge base remains at the airport.
As-is: airports complain about poor data and information gaps
“Poor data quality can have many consequences: marketing campaigns that don’t meet customers’ needs, misdirected bills and bad decisions for lack of valid information”, Martin Bayer wrote in a German article on the website www.cfoworld.de. “In the worst case, the financial footing of a company begins to falter, be it because customers run away or because the risk of legal violations has been underestimated. The complex task of information management should not be taken lightly.”
The author tackles the “fight for data quality”, providing tips and reporting on trends in this field. “Departments and the board depend on reliable and complete data. But data quality is often poor. Systematic data management is necessary.”
I would now like to consider Martin Bayer’s certainly valid conclusions in the context of airports, and thereby draw attention to a difference: data quality at airports is not only a question of the quality of content but also a question of whether these sources are linked to those of other market participants. Sven Solterbeck has advised many airports in recent years that have built this bridge successfully.
Data quality is evidently a major deficiency in companies
Bayer has identified several criteria for good data quality: accuracy, consistency, reliability, completeness, timeliness, freedom from redundancy, relevance, clarity and intelligibility.
These points are fully applicable to traditional businesses and business sectors, including insurance, banking and finance, materials management and operations control. Integrated production control systems in conjunction with Enterprise Resource Planning systems support building and operating such databases.
Ensuring data quality at airports means closer integration with the data sources of other process participants
At airports, ensuring data quality goes hand in hand with integrating corporate data with that of business partners, such as airlines, ground handling companies or duty-free shops, as well as coordinators such as continental or national air traffic control.
Two basic channels are of interest here:
- Data for primary tasks at the airport (operational control):
Timely inter-company data exchange, for example for Collaborative Decision Making (CDM), landside control, resource planning and information displays (billboards, Internet, Teletext).
- Data for secondary tasks at the airport (commercial control):
Collection of specific and detailed flight event data for statistics, marketing, route planning and the billing of charges.
Data quality in the secondary areas of an airport can be sustainably improved
In terms of operations, most airports are well prepared. Many operations are so tightly linked that a “grain of sand in the gears” is felt immediately at several instances, internally as well as across companies. Optimization, security and fast response times are important objectives, which place high demands not only on operations but also on data quality.
On the other hand, my experience from advising airports shows that the level of detail, accuracy and reliability of data for billing and marketing purposes in particular must be improved, so that:
- Invoices can be prepared faster and more easily
- Invoice complaints drop to zero
- Statistics become simple and timely
- Potential for new routes, or the optimization of existing routes, can be tapped through targeted programs
In all these considerations, it generally does not matter where the airport is located, whether it is large or small, whether it specializes in low-cost airlines or not, or whether its focus is on air freight or passenger traffic.
No matter: airlines and airports play a game with different rules. For example, airports have to base their invoices to the airlines on derived secondary information (“Excel lists”). For controlling purposes this is totally inappropriate: the data are neither traceable nor sufficiently detailed, so airports are no longer even in a position to introduce new need-based fees (keyword: discounts for transfer passengers).
If the airport has detailed passenger and air cargo data available, airline and airport could, together and at eye level, put together a package of benefits that guarantees passengers as well as freight forwarders the best service at the airport.
Challenge: integration of data between airline and airport is simple, highly automated and goal-oriented
I would like to clarify one point: the information you seek is already available somewhere at your airport, whether passengers by booking class (First, Business, Economy) or by weight category (adults, children, infants), transfer passengers at your airport or at the following airports on the route, or final destinations. Legal, or at least contractual, arrangements form the basis for the use of this data.
The challenge in successfully commissioning an appropriate data collection system is
- to track down those sources,
- to allow for different communication channels, and
- to interpret all variants of these operational messages as audit-proof as possible, following the IATA recommendations.
The icing on the cake of such a system is the reconciliation and consolidation of the flight-event-specific information thus collected with the data held in the Airport Operational Database (AODB) and the ERP system.
The experience of the system provider plays a crucial role. Does he know the procedures? Does he know the benchmarks of other airports? How good is he at detecting missing operational data sources early on? How good are his links to the airlines and their IT departments?
Our experience of more than twelve years as software developers and IT infrastructure consultants leads us to two basic scenarios that enable airports to automatically interpret operational messages such as the Load Message (LDM), Movement message (MVT), Passenger Transfer Message (PTM) or the list of inbound transfer passengers:
- Boiling down the appropriate figures from these telegrams in a black box and pushing the interpreted numbers into a workflow from the handling agent to the airport, in order to consolidate them in a designated application with AODB data (the flight list) and later export the data to the airport’s ERP.
- Concentrating on interpreting the figures from the telegrams in a flexible and automated way, and leaving the consolidation to the airport’s legacy systems (such as the AODB or ERP).
The latter scenario suits airports with a well-designed data architecture that are missing only a few pieces of their “data jigsaw”. The blueprint could be to store the parser rules for IATA and non-IATA messages as XML-formatted regular expressions. Both technologies are well known in the IT world. It is up to the airport to maintain how the data are extracted. Airline customers and the airport can thus reach a mutual understanding of the numbers on which later billing and planning rely. The precious know-how of interpreting messages remains at the airport.
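To make this blueprint concrete, here is a deliberately minimal sketch, in Python rather than the airport’s own stack: parser rules are kept as XML-formatted regular expressions that the airport’s IT staff can maintain. The rule file, the field names and the simplified MVT telegram are illustrative assumptions, not a complete implementation of the IATA message standards.

```python
import re
import xml.etree.ElementTree as ET

# Hypothetical rule file: each rule maps a regex to a named field.
# The airport maintains this file, so interpretation know-how stays in-house.
RULES_XML = """
<rules message="MVT">
  <rule field="flight"    pattern="^([A-Z]{2}\\d{1,4})/(\\d{2})\\."/>
  <rule field="departure" pattern="^AD(\\d{4})/(\\d{4})"/>
  <rule field="pax"       pattern="^PX(\\d+)"/>
</rules>
"""

def parse_telegram(text, rules_xml):
    """Apply each XML-stored regex rule to every line of the telegram."""
    rules = ET.fromstring(rules_xml)
    fields = {}
    for line in text.splitlines():
        for rule in rules.iter("rule"):
            match = re.match(rule.get("pattern"), line.strip())
            if match:
                fields[rule.get("field")] = match.groups()
                break  # first matching rule wins for this line
    return fields

# Simplified movement telegram, invented for illustration.
mvt = (
    "MVT\n"
    "LH123/24.DLHAB.FRA\n"
    "AD1215/1224 EA1410 JFK\n"
    "PX142"
)

print(parse_telegram(mvt, RULES_XML))
```

Changing how a figure is extracted then means editing one XML attribute, not redeploying code, which is the point of keeping the rules outside the parser.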
The idea behind this blueprint is to reduce redundancy in administering flight event data across the AODB, the ERP and third-party database systems. A toolkit based on the blueprint could be built as a bridge from airline operational data sources to the AODB, where the interpreted flight event data can be reconciled with flight schedules and the effective flight logbook before they are ready to be billed or analysed by the ERP and the data warehouse. Instead of patchwork, this blueprint could bring the legacy systems of the market participants closer together, with web services at the centre of the architecture.
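The reconciliation step can be sketched in a few lines. In this simplified Python example the record layout and field names are assumptions made for illustration: interpreted telegram figures are matched against the AODB flight list by flight number and day, and anything that does not match is set aside for manual review instead of being billed.

```python
from datetime import date

# Hypothetical AODB flight list and one figure interpreted from a telegram.
flight_list = [
    {"flight": "LH123", "day": date(2010, 5, 24), "scheduled_pax": 150},
    {"flight": "BA456", "day": date(2010, 5, 24), "scheduled_pax": 180},
]
telegram_figures = [
    {"flight": "LH123", "day": date(2010, 5, 24), "pax": 142},
]

def reconcile(flight_list, figures):
    """Attach telegram figures to the matching AODB flight-list entry;
    unmatched figures are returned for manual review."""
    index = {(f["flight"], f["day"]): f for f in flight_list}
    unmatched = []
    for fig in figures:
        key = (fig["flight"], fig["day"])
        if key in index:
            index[key]["reported_pax"] = fig["pax"]
        else:
            unmatched.append(fig)
    return unmatched

leftover = reconcile(flight_list, telegram_figures)
```

A real toolkit would of course match on more keys (registration, route) and log every decision for audit purposes; the point here is only that reconciliation is a join between two already existing data sets.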
The architecture of such a system should be straightforward and transparent and should follow IT standards, for example Microsoft’s .NET Framework. A Windows service on the server side controls the ongoing interpretation of messages (whether e-mail or proprietary file formats) arriving in an inbox. A Microsoft SQL Server database keeps the data, and Active Server Pages present the messages and their computed flight event data to the client side via a web browser. This approach saves cost and time in developing and deploying new versions or releases of the service.
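The server-side service described above boils down to a watch-the-inbox loop. A minimal, language-neutral sketch follows, shown in Python rather than as a .NET Windows service; the directory names and polling interval are illustrative assumptions, not part of the described design.

```python
import time
from pathlib import Path

# Illustrative directories: incoming telegrams and an archive of handled ones.
INBOX = Path("inbox")
DONE = Path("processed")

def poll_inbox(handle, once=False):
    """Pick up new message files, hand each one to the interpreter,
    then move it aside so every message is processed exactly once."""
    DONE.mkdir(exist_ok=True)
    while True:
        for msg in sorted(INBOX.glob("*.txt")):
            handle(msg.read_text())       # e.g. parse and store in the database
            msg.rename(DONE / msg.name)   # archive, never process twice
        if once:
            return
        time.sleep(5)  # wait before the next scan of the inbox
```

In production this loop would run as a service under the operating system’s supervision, with the `handle` callback feeding the parsed figures into the database that the web front end reads from.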
The solution behind such a blueprint should also meet the market’s best standards for data collection systems, which means its introduction (implementation, developing interfaces, setting up the infrastructure, testing and training) should take only three months.
Photo credit: Jumbo on the block. (C) by Jörg Hackmann – Fotolia.com, 2010.
Departure Collage. (C) by javarman – Fotolia.com.