
Real Age vs Virtual Age Models for computing Remaining Useful Lives

This image shows an example of data being extracted from an EASA Form 1. Models are trained not only to recognize forms and fields but also to apply rules inherent to the Asset type, identifying inconsistencies before the data is imported for further analysis.

When the extraction is complete, behavioral profiles are created to carry the commercial impact journey into relevant use cases. For example, for an engine, trained historic data sets can help fill back-to-birth build gaps by mapping equivalent profiles of Engines with a similar age, utilization profile, thrust rating and operational environment, among other factors.
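
As a rough illustration of that profile-matching idea (a minimal sketch, not KeepFlying®'s production model; the feature names and numbers are invented), similar engines can be ranked by a normalized distance over their attributes:

```python
import numpy as np

# each row is an engine profile: [age_years, cycles_per_year, thrust_klbf, env_harshness]
fleet = np.array([
    [12.0, 480.0, 27.0, 0.8],
    [ 9.0, 510.0, 27.0, 0.3],
    [11.5, 470.0, 24.5, 0.7],
])
incoming = np.array([11.8, 475.0, 27.0, 0.75])  # engine with build-record gaps

# normalize each feature so no single unit dominates, then rank by distance
scale = fleet.std(axis=0)
dist = np.linalg.norm((fleet - incoming) / scale, axis=1)
best = np.argsort(dist)  # closest profiles first; borrow their history to fill gaps
print("most similar engine index:", best[0])
```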

For an Engine MRO, this can be valuable for predicting the work scope levels of incoming Engines prior to arrival, based on limited or exhaustive data sets. This is crucial in an era of fixed price contracts, where NTEP (not-to-exceed price) limits can be set and simulated by the FinTwin® to ensure maximum realization of profit margins before slot commitments. The process extends through Gate 0, saving time and effort while inducting an engine, and it monitors risks and suggests mitigation measures (swaps, exchanges, outsourced repairs, priority swaps through advanced planning and scheduling) to stay on track with profitability KPIs.
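
One way such an NTEP limit might be stress-tested before a slot commitment is a simple Monte Carlo over workscope cost (a hedged sketch only; the cost distribution and its parameters are invented for illustration, not drawn from the FinTwin®):

```python
import numpy as np

rng = np.random.default_rng(7)
ntep_limit = 3.2e6  # hypothetical contract cap, USD

# sample plausible workscope costs from an assumed lognormal distribution
costs = rng.lognormal(mean=np.log(2.6e6), sigma=0.25, size=100_000)

# probability the fixed-price contract overruns its not-to-exceed limit
p_overrun = (costs > ntep_limit).mean()
print(f"probability workscope cost exceeds NTEP: {p_overrun:.1%}")
```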

This is powered by Aviation semantic layers on top of language models that have processed and benchmarked shop visit reports by engine type and

The curved line is the Failure Rate Curve. The start of the curve, on the left, represents manufacturing defects. There is a threshold for those: once the defects have been fixed and that threshold is crossed, the component is expected to have a normal life. Because the part accumulates wear and tear throughout its usage, the failure rate towards the end of its life increases exponentially, as no further repairs are possible. This is what the FinTwin® models, using the data which has been extracted.
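
A minimal sketch of that bathtub-shaped curve (illustrative only; the FinTwin® fits its own parameters to extracted data) is a sum of Weibull hazards: a falling infant-mortality term, a constant random-failure floor, and a rising wear-out term:

```python
import numpy as np

def bathtub_hazard(t, infant=(0.3, 50.0), random_rate=1e-4, wearout=(5.0, 30_000.0)):
    """Failure rate h(t) as a Weibull mixture: h = (k/lam) * (t/lam)**(k - 1).

    k < 1 gives the falling infant-mortality leg, k > 1 the rising
    wear-out leg; random_rate is the flat useful-life floor between them.
    """
    t = np.asarray(t, dtype=float)
    k1, lam1 = infant
    k2, lam2 = wearout
    return ((k1 / lam1) * (t / lam1) ** (k1 - 1)
            + random_rate
            + (k2 / lam2) * (t / lam2) ** (k2 - 1))

# failure rate is high early, flat in mid-life, then rises towards end of life
print(bathtub_hazard([10.0, 15_000.0, 35_000.0]))
```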

The FinTwin® computes the cost related to these factors. Macro-economic factors, such as inflation and interest rates, are taken into account when projecting costs over the short, medium and long term. KeepFlying® has already published a paper comparing deep learning models with the models we use, which can accurately predict the costs and cashflows against Assets. The drawback of deep learning models is that they are data hungry, but there are mathematical models which can resolve this.

Here is an example of one such algorithm, Dynamic Mode Decomposition (DMD), shown in figure 5 below.

Future Value calculation using DMD (Dynamic Mode Decomposition)

DMD comes from fluid dynamics, where it is used to simulate flow around an airfoil. The same technique can be extended to inflation rates and interest rates: time-dependent systems which flow into the future.
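
Below is a hedged sketch of exact DMD in its standard textbook form (not necessarily how the FinTwin® implements it; the toy oscillating series is invented): a linear operator is fitted to successive snapshots of a series, and its eigen-decomposition is used to advance the series into the future.

```python
import numpy as np

def dmd_forecast(X, steps):
    """Exact DMD: fit a linear operator to snapshot pairs, then advance it."""
    X1, X2 = X[:, :-1], X[:, 1:]                       # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)  # low-rank basis of X1
    Atilde = U.conj().T @ X2 @ Vh.conj().T @ np.diag(1.0 / s)
    eigvals, W = np.linalg.eig(Atilde)                 # dynamics in reduced space
    Phi = X2 @ Vh.conj().T @ np.diag(1.0 / s) @ W      # DMD modes
    b = np.linalg.lstsq(Phi, X[:, 0].astype(complex), rcond=None)[0]  # amplitudes
    future_t = np.arange(X.shape[1], X.shape[1] + steps)
    # each future snapshot = modes weighted by eigenvalues raised to the time index
    return np.real(Phi @ (eigvals[:, None] ** future_t * b[:, None]))

# toy example: two coupled oscillating series projected 12 steps ahead
time = np.linspace(0.0, 10.0, 100)
X = np.vstack([np.sin(time), np.cos(time)])
print(dmd_forecast(X, steps=12)[:, :3])
```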

How can Data Models improve?

All of this is based on historic data and current projections, but what about when an anomaly sets in? This is mitigated and improved through a technique known as Bayesian Inference (figure 6).

In Bayesian Inference you have a prior assumption, observe what is actually happening and, based on that observation, update your prior. The main goal is to take this data and provide the commercial and financial impact of trade and maintenance visit decisions. All of this is possible because of the deep data being handled. Given the sheer scale at which we are working and the sensitivity of the data, data governance and data privacy must be in place to make this data modelling work in practice.
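
A minimal conjugate-update sketch of that prior-to-posterior step (a Beta-Binomial model chosen for illustration, not necessarily the FinTwin®'s actual inference machinery; all numbers are invented):

```python
# prior belief about a component's per-cycle failure probability: failures are rare
prior_alpha, prior_beta = 2.0, 50.0

# new observations arrive: 3 failures seen over 40 cycles
failures, cycles = 3, 40

# conjugate Beta-Binomial update: add failures and survivals to the prior counts
post_alpha = prior_alpha + failures
post_beta = prior_beta + (cycles - failures)

prior_mean = prior_alpha / (prior_alpha + prior_beta)
post_mean = post_alpha / (post_alpha + post_beta)
print(f"prior estimate {prior_mean:.3f} -> updated estimate {post_mean:.3f}")
```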

Data Governance

We know that whatever data we generate has a dollar value, so when you are creating terabytes of data, what really matters is the governance of that data, to gain full advantage of it (figure 7).

The Preamble: Data Governance

Data governance is the process of managing the availability, usability, integrity and security of the data in enterprise systems, based on internal data standards and policies that also control data usage.

It also essentially means putting the right technologies, policies, and people in place to ensure accuracy, quality, and security for massive sets of data.

Data governance is the process of making your data available to your consumers in a secure way while maintaining data integrity and privacy. A policy or process is needed that helps you manage your data efficiently and securely. Why data governance matters to an organization, especially in the aviation domain, can be seen through a few highlighted problems (figure 8).

Familiarize: Data Governance

The first problem: when you create huge amounts of data and must store it somewhere, but have no management tools or anything to organize it, you end up with a Data Swamp. The data just lies there, not being leveraged for any useful purpose. What is needed is a proper policy to manage and use this data, and that is one example of where data governance is needed.

The second problem is that, in any organization, data created in one business unit may not be understood when it is passed to another business unit, because there is no shared knowledge of it. That is a problem many industries face: you need shared knowledge between business units to share the data vision. Without a policy, it will be difficult to consume the generated data. This is amplified by the effect that supply chain and labor challenges have on Aircraft & Engine Maintenance profitability.

Another problem is that aviation is full of acronyms, and it is not possible for a human or a system to understand every one of them, so proper definitions of those acronyms should be in place.
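
A minimal illustration of such a shared data dictionary, one that fails loudly on undefined terms rather than letting them propagate (the entries are common aviation acronyms; the structure itself is illustrative, not a KeepFlying® artifact):

```python
# one agreed definition per acronym, shared by every business unit
GLOSSARY = {
    "LLP": "Life Limited Part",
    "EGT": "Exhaust Gas Temperature",
    "SV":  "Shop Visit",
    "TSN": "Time Since New",
}

def expand(acronym: str) -> str:
    # raise instead of guessing, so undefined terms surface immediately
    if acronym not in GLOSSARY:
        raise KeyError(f"{acronym!r} has no agreed definition in the glossary")
    return GLOSSARY[acronym]

print(expand("LLP"))
```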

Each country or region will have compliance and data residency laws associated with it. You need to adhere to those laws. Data governance enables us to be compliant with such laws.

Figure 9 shows results from an interesting survey conducted across multiple organizations and their employees. The top concerns were data security, data quality and data governance.

There are multiple ways in which data governance can be implemented, but what we are projecting here is one of the best-known architectures, the Data Lakehouse (figure 10).
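
As a hedged sketch of the pattern (one common implementation uses Apache Spark with Delta Lake via the delta-spark package; the table and column names here are invented for illustration):

```python
from pyspark.sql import SparkSession

# enable Delta Lake: open file storage underneath, transactional tables on top
spark = (
    SparkSession.builder.appName("mro-lakehouse")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# raw shop-visit records are written through a schema-enforced, ACID table,
# so downstream consumers query governed data rather than a data swamp
df = spark.createDataFrame(
    [("ESN123", "2024-03-01", 412000.0)],
    ["engine_serial", "induction_date", "workscope_cost_usd"],
)
df.write.format("delta").mode("append").saveAsTable("shop_visits")

spark.sql("SELECT engine_serial, workscope_cost_usd FROM shop_visits").show()
```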
