Aircraft IT OPS Issue 64: Q2 2025

WHITE PAPER: Beyond the Checklist

Author: Klaus Olsen, CEO/EFB Administrator, EFB Admin Services Int. AS


Rethinking EFB Testing, Documentation, and Compliance

Electronic Flight Bags (EFBs) have become indispensable in modern flight operations, transforming how pilots access data, perform calculations, and interact with systems. However, moving from concept to regulatory approval is rarely straightforward, as Figure 1 illustrates.

Figure 1

Airlines must navigate complex requirements across hardware, connectivity, risk management, and training – all while maintaining rigorous documentation. At EFB Admin Services Int. AS, we’ve supported operators worldwide in meeting EASA, FAA, and ANAC standards, and what often delays or derails approval isn’t a failure of technology – it’s a failure to document and communicate testing results effectively.

REGULATORY FRAMEWORK AND COMPLIANCE STRATEGY

Understanding and meeting regulatory requirements is foundational to EFB approval. Compliance is not just a destination; it’s a discipline that must be embedded into every stage of your EFB project lifecycle. While the core objectives – ensuring safety, reliability, and traceability – are shared across authorities, each regulatory body introduces its own nuances in how these outcomes are demonstrated and documented, as Figure 2 shows. It’s important to understand the subtle differences between EASA, FAA, and ANAC expectations.

Figure 2

Under EASA, AMC1.SPA.EFB.100(b) requires detailed safety risk assessment, human-machine interface (HMI) evaluation, and clear administrative and training procedures for both the EFB device and applications. Operators must establish an EFB program that includes risk identification, mitigation strategies, system suitability assessments, and documentation of operational use. Notably, EASA mandates an operational evaluation phase culminating in a Final Operational Report, summarizing performance, pilot feedback, risk treatment outcomes, and readiness for CAA oversight.

[caption] Regulatory Collaboration: EASA, FAA, and ANAC discuss EFB standards; ‘different rules, same goals’

FAA AC 120-76D aligns closely with these goals but emphasizes a structured EFB program governed under Ops Spec A061. It categorizes EFB software into Type A and Type B applications, each with differing approval pathways. Type B applications require operator-specific authorization and inclusion in operational documentation. While the FAA does not explicitly require an Operational Evaluation Test by that name, it does call for documented validation of usability, training, software compatibility, and equipment readiness before deployment.

The ANAC guidance under NTA Part 6 and Section 6.047 echoes both EASA and FAA expectations but also requires a formal evaluation of EFB equipment and any associated mounting devices to verify they do not interfere with aircraft systems. ANAC also stresses procedural clarity – particularly around failure modes, redundancy, and user training. A Final Operational Report, while not explicitly defined by title, is implied through the requirement to demonstrate compliance with all criteria before approval. ANAC’s focus on portable devices and contingency planning is especially relevant for developing markets.

COMPARISON OF REGULATORY REQUIREMENTS FOR OPERATIONAL EVALUATION AND FINAL DOCUMENTATION

The table below summarizes how EASA, FAA, and ANAC approach the need for an operational evaluation phase and the associated final documentation requirements.

Requirement | EASA (AMC1.SPA.EFB.100(b)) | FAA (AC 120-76D) | ANAC (NTA Part 6)
Safety Risk Assessment (SRA) | Mandatory, documented in a risk matrix and linked to procedures | Required as part of the operator’s EFB program under Ops Spec A061 | Required, with emphasis on function-specific risk and backup measures
Human Factors / HMI Review | Explicit requirement to assess the HMI based on human factors principles | Part of functional usability validation, but less prescriptive | Required, tied to operational usability and crew interaction
Operational Evaluation Test | Mandated test phase with multiple flights across configurations and scenarios | Encouraged through validation processes, not formally named | Expected via procedural review and practical use demonstration
Final Operational Report | Mandatory summary report including findings, SRA outcome, and readiness | Operator-developed documentation confirming validation results | Implied requirement to document compliance before approval
Training & Admin Procedures | Must be defined in the EFB management system and reviewed for approval | Included in the EFB program manual and reviewed during certification | Operator must document all use procedures and training steps

OPERATIONAL EVALUATION TESTING: THE BACKBONE OF EFB APPROVAL

All regulators require certainty that an EFB and its associated equipment will not interfere with safe aircraft operation. To demonstrate that, one of the most important components of any EFB approval is the Operational Evaluation Test (OET). It’s where theoretical risk assessment meets real-world usage. This structured testing phase, mandated in EASA AMC3 SPA.EFB.100(b), verifies that your EFB performs safely, consistently, and reliably under normal and abnormal operating conditions. Figure 3 shows what is required.

Figure 3

A robust evaluation framework typically includes the following (a simple record-keeping sketch follows the list):

  • Test Duration and Scope: Usually six months or a defined volume of flights per fleet type.
  • Aircraft and Software Matrix: Identifying variances in aircraft configurations, EFB software, mounts, and connectivity methods.
  • Pilot Involvement: Multiple crew profiles across various bases and routes to ensure diversity in user experience.
  • Scenario Testing: Simulations of degraded connectivity, failed devices, login issues, and alternative workflows.
  • User Feedback Loops: Collection of structured reports, quick surveys, debriefs, and iterative updates to the manual or procedure.
  • Final Operational Report: A summary document capturing findings, SRA review, recommendations, and CAA-readiness.
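
Keeping these elements traceable across fleets, software versions, and scenarios is largely a record-keeping exercise. The sketch below is purely illustrative (the class and field names are our own assumptions, not a regulatory format): each evaluation flight is logged as a structured record tying an aircraft/software configuration to the scenario tested and the crew’s verdict.

    # Illustrative only: minimal structured records for Operational Evaluation
    # Test evidence. Names are assumptions, not a prescribed regulatory schema.
    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class AircraftConfig:
        fleet_type: str          # e.g. "B737-800"
        efb_hardware: str        # e.g. "tablet in a Class 2 mount"
        software_version: str    # e.g. "chart application 4.2.1"
        connectivity: str        # e.g. "gate Wi-Fi plus onboard 4G"

    @dataclass
    class EvaluationRecord:
        flight_date: date
        config: AircraftConfig
        scenario: str            # e.g. "degraded connectivity at the gate"
        crew_role: str           # e.g. "line captain, short-haul base"
        outcome: str             # "pass", "pass with remarks", or "fail"
        remarks: str = ""

    @dataclass
    class EvaluationLog:
        records: List[EvaluationRecord] = field(default_factory=list)

        def add(self, record: EvaluationRecord) -> None:
            self.records.append(record)

        def open_findings(self) -> List[EvaluationRecord]:
            # Records still needing follow-up before the Final Operational Report.
            return [r for r in self.records if r.outcome != "pass"]

Kept in this form, the evaluation log can later be filtered by fleet, scenario, or outcome, so the Final Operational Report becomes a summary of existing evidence rather than a reconstruction.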

[caption] Onboard Evaluation of EFB Devices During Operational Testing

[caption] Pilots using EFBs in active flight

COLLABORATION: THE CROSS-DEPARTMENT EFFORT

One of the most overlooked success factors in EFB projects is collaboration. Too often, the IT team leads the technical build, while the Flight Ops team owns the procedures – with little structured overlap. True compliance, however, emerges when everyone contributes:

  • Chief Pilots ensure SOP alignment and line crew buy-in.
  • Technical Pilots assist in hardware validation and HMI optimization.
  • IT and MDM Specialists handle deployment, remote wipe, lockdown, and updates.
  • Maintenance Teams manage mounts, charging systems, and fault reporting.
  • Safety and Compliance Officers monitor risk documentation and audit-readiness.

In successful projects, these groups communicate regularly, share documentation, and review changes as a team – not just during CAA audits, but during the build itself.

FINAL OPERATIONAL REPORT: CLOSING THE LOOP

In our experience, a common pitfall among operators is neglecting to properly conclude the testing phase with a structured Final Operational Report. Many operators invest months in flying evaluations, only to falter when it comes time to formally summarize outcomes and submit a complete report. Without it, even a perfect implementation can receive a negative audit mark.

This document, an example of which is shown in Figure 4, should not only summarize test outcomes but also demonstrate compliance with the initial risk assessments and regulatory expectations.

Figure 4

It effectively ‘closes the loop’ by turning evidence from the operational evaluation into actionable findings and permanent records. From our experience, the most effective Final Operational Reports include the following (a simple completeness check is sketched after the list):

  • A summary of aircraft, route, and crew configurations tested;
  • Key incidents or failure scenarios encountered and mitigated;
  • Results from pilot surveys or debriefs;
  • Cross-reference to risk matrices and software/hardware configuration logs;
  • Recommendations for final procedural updates or training needs;
  • Declaration of readiness for authority inspection or audit.
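
Because the report is ultimately a completeness exercise, even a trivial checklist guard helps before submission. The snippet below is a minimal sketch under our own assumptions; the section names simply mirror the bullets above and are not an official template.

    # Illustrative sketch: flag required Final Operational Report sections that
    # are still missing from a draft. Section names mirror this article's list,
    # not an official template.
    REQUIRED_SECTIONS = [
        "configurations tested",                  # aircraft, route, and crew
        "incidents and mitigations",              # failure scenarios encountered
        "pilot feedback",                         # surveys and debrief results
        "risk and configuration cross-reference",
        "procedural and training recommendations",
        "declaration of readiness",
    ]

    def missing_sections(draft_sections):
        # Return the required sections not yet present in the draft.
        present = set(draft_sections)
        return [s for s in REQUIRED_SECTIONS if s not in present]

    draft = ["configurations tested", "incidents and mitigations",
             "pilot feedback", "procedural and training recommendations"]
    print(missing_sections(draft))
    # -> ['risk and configuration cross-reference', 'declaration of readiness']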

Whether under EASA, FAA, or ANAC, the absence of this final document can delay or invalidate the approval process – even when all testing has been successfully performed. Regulatory auditors want to see traceability, repeatability, and review. The Final Operational Report is your proof.

HUMAN FACTORS AND CREW INTERFACE

EFB usability can make or break operational safety. That’s why every major authority requires human factors to be assessed during the EFB approval process. This includes screen readability under different lighting, logical application navigation, reduced cognitive workload, and compatibility with the aircraft environment.

Based on FAA’s human factors guidance and AMC1 to SPA.EFB.100(b), best practices include ensuring display brightness control, avoiding cluttered app interfaces, using large and legible fonts, and providing intuitive navigation paths for checklist and chart access. A poorly optimized interface increases the chance of pilot error – especially under pressure or in degraded conditions.

Testing should include both simulated use cases and in-flight feedback collection. Configurations should be revisited when deploying updated software versions or migrating to new device types.

THE PROBLEM WITH OFF-THE-SHELF MANUALS

An equally common compliance hurdle today is the reliance on generic, pre-written EFB manuals. Increasingly, we see airlines acquire an ‘EFB Policy & Procedures Manual’ from a supplier or consultant long before they’ve finalized their device platform, app ecosystem, or mounting strategy. In several recent cases, we’ve had to completely rewrite such manuals to address regulator rejections. The reason is always the same: they don’t reflect how the airline operates.

A compliant manual must:

  • Reference the actual hardware types in use (iOS, Windows, Class 1/2/3 distinctions).
  • Document application setup and data update workflows.
  • Define operational procedures for login, power management, software versions, and issue escalation.
  • Align with safety risk assessment outcomes and human factors considerations.

Additionally, your manual should be a living document, reviewed annually (or after significant changes) and refined based on pilot feedback, incidents, or system upgrades.

CASE IN POINT: A NEAR-MISS DURING EASA OVERSIGHT

A client of ours faced this scenario. Their local CAA had issued EFB approval based on documented risk assessments and a smooth operational rollout. However, during a routine EASA audit of the authority itself, the airline was chosen as a reference operator.

While everything in practice was correct, one element was missing: the Final Operational Report. The CAA had not enforced its submission, but EASA flagged its absence as a major finding. We were called in to prepare the missing document retroactively – cross-referencing logs, checklists, meeting notes, and test summaries to reconstruct the final assessment and satisfy the audit team. It was a wake-up call: doing the work is not enough; proving you did it, in the right format and language, is what counts in regulatory processes.

COMMON PITFALLS AND HOW TO AVOID THEM

Here are five recurring mistakes we’ve observed during airline audits and CAA review:

Pitfall | Prevention Tip
No final report after the operational test | Integrate report drafting into project timelines and approval checklists.
Purchased ‘template’ manuals | Tailor documents to reflect your hardware, procedures, and organizational roles.
Weak or generic SRAs | Use structured formats with clear hazard definitions and mitigation tracking.
HMI/UX issues during line ops | Include actual crew input, beta testing, and interface feedback in your evaluation.
Poor gate or in-flight connectivity | Run real-world tests at airport gates and simulate degraded operations.
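
On the ‘weak or generic SRAs’ pitfall, a structured format does not need to be elaborate. The sketch below is illustrative only (the field names and the simple severity-times-likelihood index are our own assumptions, not a mandated scheme), but it shows how each hazard can carry its own definition, mitigation, and residual risk so that mitigation tracking stays explicit.

    # Illustrative only: one way to give each hazard a clear definition and a
    # tracked mitigation. Field names and the risk-index arithmetic are
    # assumptions, not a regulator-mandated format.
    from dataclasses import dataclass

    @dataclass
    class HazardEntry:
        hazard: str               # e.g. "EFB battery depleted in flight"
        severity: int             # 1 (negligible) to 5 (catastrophic)
        likelihood: int           # 1 (extremely improbable) to 5 (frequent)
        mitigation: str           # e.g. "second device carried; data cached"
        residual_likelihood: int  # likelihood after mitigation is applied

        @property
        def initial_risk(self) -> int:
            return self.severity * self.likelihood

        @property
        def residual_risk(self) -> int:
            return self.severity * self.residual_likelihood

    entry = HazardEntry(
        hazard="Loss of chart application during approach",
        severity=4,
        likelihood=3,
        mitigation="Second EFB carried; charts cached before departure",
        residual_likelihood=1,
    )
    print(entry.initial_risk, "->", entry.residual_risk)  # 12 -> 4

A table of such entries, reviewed alongside the evaluation log, gives auditors the traceability they ask for.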

Having a well-prepared Safety Risk Assessment and Final Report isn’t just about ticking regulatory boxes; it creates resilience when audits, system updates, or airline expansions occur.

CONCLUSION AND LOOKING AHEAD

The role of technology in EFB compliance is changing rapidly; the next frontier of EFB testing will be smarter, not slower. Where paper manuals and PDFs once ruled, airlines are now exploring AI and machine learning for intelligent automation, predictive analysis, and decision support.

Regulatory compliance for EFBs is a living process. It begins with risk assessment, progresses through hardware and human factors evaluations, and culminates in the operational testing and final documentation phases. Skipping or simplifying any of these steps often leads to delays, findings, or rework. A compliant EFB implementation is not something you ‘install’; it’s something you maintain. From project inception to post-deployment updates, success depends on process ownership, documentation clarity, and multi-stakeholder collaboration.

Looking forward, emerging technologies such as AI-assisted testing, predictive analytics, and embedded cybersecurity frameworks will play a larger role in streamlining EFB validation. But even as tools evolve, the core requirement remains the same: prove that your EFB solution works, supports safe operations, and can be verified at every step.

[caption] Futuristic EFB with AI-Powered Predictive Analytics

At EFB Admin Services Int. AS, we continue to support operators across the world in navigating these challenges efficiently and confidently. Because beyond the checklist lies a new level of operational assurance.
