You can train an AI to play Mario Kart perfectly. You can also train it to prioritize information for pilots. When you watch your AI play Mario Kart, you are amazed at how good it is, but you don’t have the slightest idea how exactly it configured itself during training to learn these skills, and you don’t really care. When you watch your AI prioritizing flight information perfectly and you have no idea how it does it, you can go and play Mario Kart yourself, because no regulator is ever going to certify it.
AIs in aviation cannot be black boxes. The users need to understand their “thought processes”, because there is too much at stake to blindly trust the decisions of machines. Training data can be incomplete, biased, or just plain wrong, or the AI could be built on a model that is inadequate for its use case. The problem is, AIs are complex. That is why they are so good at learning new skills. Making them explainable is not an easy task, but if we manage to get there, the possibilities are huge.
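To make the idea of explainability a little more concrete: one simple family of techniques measures how much a model relies on each input by perturbing that input and watching how performance degrades. The sketch below shows permutation importance under that idea; the toy “alerting model”, the feature names, and the data are all hypothetical, purely for illustration.

```python
import random

def accuracy(model, rows, labels):
    """Fraction of rows the model classifies correctly."""
    return sum(model(r) == y for r, y in zip(rows, labels)) / len(rows)

def permutation_importance(model, rows, labels, n_features, seed=0):
    """Drop in accuracy when one feature is shuffled across rows.
    A bigger drop means the model relies more on that feature."""
    rng = random.Random(seed)
    baseline = accuracy(model, rows, labels)
    importances = []
    for f in range(n_features):
        column = [r[f] for r in rows]
        rng.shuffle(column)  # break the link between feature f and the label
        perturbed = [r[:f] + (v,) + r[f + 1:] for r, v in zip(rows, column)]
        importances.append(baseline - accuracy(model, perturbed, labels))
    return importances

# Hypothetical toy "model": raises an alert when altitude (feature 0)
# is below 1000 ft; feature 1 is irrelevant noise.
model = lambda row: int(row[0] < 1000)
rows = [(alt, noise) for alt in range(0, 2000, 50) for noise in (0, 1)]
labels = [int(r[0] < 1000) for r in rows]

scores = permutation_importance(model, rows, labels, n_features=2)
# Shuffling altitude hurts accuracy; shuffling the noise feature does not,
# which tells us what the model actually pays attention to.
```

Even this crude probe gives a user something a raw black box cannot: evidence about which inputs drive a decision. Real explainability for certified avionics would of course demand far stronger guarantees than this kind of post-hoc inspection.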