Authentic assessment is a type of assessment in which the student is asked to perform real-world tasks and demonstrate a meaningful application of skills and competencies. Authentic assessment lies at the heart of training today’s aviation student to use critical thinking skills. Rather than selecting from predetermined responses, students must generate responses from skills and concepts they have learned. By using open-ended questions and established performance criteria, authentic assessment focuses on the learning process, enhances the development of real-world skills, encourages higher order thinking skills (HOTS), and teaches students to assess their own work and performance.
There are several aspects of effective authentic assessment. The first is the use of open-ended questions in what might be called a “collaborative critique,” which is a form of student-centered grading. As described in the scenario that introduced this chapter, the instructor begins by using a four-step series of open-ended questions to guide the student through a complete self-assessment.
Replay—ask the student to verbally replay the flight or procedure. Listen for areas in which the instructor’s perceptions differ from the student’s perceptions, and discuss why they do not match. This approach gives the student a chance to validate his or her own perceptions, and it gives the instructor critical insight into the student’s judgment abilities.
Reconstruct—the reconstruction stage encourages the student to learn by identifying the key things that he or she would have, could have, or should have done differently during the flight or procedure.
Reflect—insights come from investing perceptions and experiences with meaning, requiring reflection on the events. For example:
- What was the most important thing you learned today?
- What part of the session was easiest for you? What part was hardest?
- Did anything make you uncomfortable? If so, when did it occur?
- How would you assess your performance and your decisions?
- Did you perform in accordance with the Practical Test Standards (PTS)?
Redirect—the final step is to help the student relate lessons learned in this session to other experiences, and consider how they might help in future sessions. Questions that might be asked include:
- How does this experience relate to previous lessons?
- What might be done to mitigate a similar risk in a future situation?
- Which aspects of this experience might apply to future situations, and how?
- What personal minimums should be established, and what additional proficiency flying and/or training might be useful?
The purpose of the self-assessment is to stimulate growth in the student’s thought processes and, in turn, behaviors. The self-assessment is followed by an in-depth discussion between the instructor and the student, which compares the instructor’s assessment to the student’s self-assessment. Through this discussion, the instructor and the student jointly determine the student’s progress on a rubric. As explained earlier, a rubric is a guide for scoring performance assessments in a reliable, fair, and valid manner. It is generally composed of dimensions for judging student performance, a scale for rating performances on each dimension, and standards of excellence for specified performance levels.
The collaborative assessment process in student-centered grading uses two broad rubrics: one that assesses the student’s level of proficiency on skill-focused maneuvers or procedures, and one that assesses the student’s level of proficiency on single-pilot resource management (SRM), which is the cognitive or decision-making aspect of flight training.
The performance assessment dimensions for each type of rubric are as follows:
Maneuver or Procedure “Grades”
- Describe—at the completion of the scenario, the student is able to describe the physical characteristics and cognitive elements of the scenario activities, but needs assistance to execute the maneuver or procedure successfully.
- Explain—at the completion of the scenario, the student is able to describe the scenario activity and understand the underlying concepts, principles, and procedures that comprise the activity, but needs assistance to execute the maneuver or procedure successfully.
- Practice—at the completion of the scenario, the student is able to plan and execute the scenario. Coaching, instruction, and/or assistance will correct deviations and errors identified by the instructor.
- Perform—at the completion of the scenario, the student is able to perform the activity without instructor assistance. The student will identify and correct errors and deviations in an expeditious manner. At no time will the successful completion of the activity be in doubt. (“Perform” is used to signify that the student is satisfactorily demonstrating proficiency in traditional piloting and systems operation skills).
- Not observed—any event not accomplished or required.
For example, a student can describe a landing and can tell the flight instructor about the physical characteristics and appearance of the landing. On a good day, with the wind straight down the runway, the student may be able to practice landings with some success while still functioning at the rote level of learning. However, on a gusty crosswind day the student needs a deeper level of understanding to adapt to the different conditions. If a student can explain all the basic physics associated with lift/drag and crosswind correction, he or she is more likely to practice successfully and eventually perform a landing under a wide variety of conditions.
Single-Pilot Resource Management (SRM) “Grades”
- Explain—the student can verbally identify, describe, and understand the risks inherent in the flight scenario, but needs to be prompted to identify risks and make decisions.
- Practice—the student is able to identify, understand, and apply SRM principles to the actual flight situation. Coaching, instruction, and/or assistance quickly corrects minor deviations and errors identified by the instructor. The student is an active decision maker.
- Manage-Decide—the student can correctly gather the most important data available both inside and outside the flight deck, identify possible courses of action, evaluate the risk inherent in each course of action, and make the appropriate decision. Instructor intervention is not required for the safe completion of the flight.
In SRM, the student may be able to describe basic SRM principles during the first flight. Later, he or she is able to explain how SRM applies to different scenarios that are presented on the ground and in the air. When the student actually begins to make quality decisions based on good SRM techniques, he or she earns a grade of manage-decide. The advantage of this type of grading is that both the flight instructor and the student know exactly how far the student’s learning has progressed.
Let’s look at how the rubric in Figure 5-4 might be used in the flight training scenario at the beginning of this chapter. During the postflight debriefing, CFI Linda asks her student, Brian, to assess his performance for the day, using the Replay – Reconstruct – Reflect – Redirect guided discussion questions described in the Collaborative Assessment subsection. Based on this assessment, she and Brian discuss where Brian’s performance falls in the rubrics for maneuvers/procedures and SRM. This part of the assessment may be discussed verbally or, alternatively, Brian and Linda may separately complete an assessment sheet for each element of the flight.
When Brian studies the sheet, he finds “Describe, Explain, Practice, and Perform.” He decides he was at the perform level since he had not made any mistakes. The flight scenario had been a two-leg Instrument Flight Rules (IFR) scenario to a busy Class B airport about 60 miles to the east. Brian felt he had done well in keeping up with programming the GPS and multi-function display (MFD) until he reached the approach phase. He had attempted to program the Instrument Landing System (ILS) approach for runway 7L and had actually flown part of the approach until air traffic control (ATC) asked him to execute a missed approach.
When he compares the sheet he has completed to Linda’s version, Brian discovers that most of their assessments appear to match. An exception is the item labeled “programming the approach.” Here, where he had rated the item as “Perform,” Linda had rated it as “Explain.” During the ensuing discussion, Brian realizes that he had selected the correct approach, but he had not activated it. Before Linda could intervene, traffic dictated a missed approach. Her “explain” designation tells Brian that he did not really understand how the GPS worked, and he agrees.
This approach to assessment has several key advantages. One is that it actively involves the student in the assessment process, and establishes the habit of healthy reflection and self-assessment that is critical to being a safe pilot. Another is that these grades are not self-esteem related, since they do not describe a recognized level of prestige (such as A+ or “Outstanding”), but rather a level of performance. The student cannot flunk a lesson. Instead, he or she can only fail to demonstrate a given level of flight and SRM skills.
Both instructors and students may initially be reluctant to use this method of assessment. Instructors may think it requires more time, when in fact it is merely a more structured, effective, and collaborative version of a traditional postflight critique. Also, instructors who learned under the more traditional assessment structure must be careful not to force the dimensions of the rubric into the traditional grading mold of A through F. One way to avoid this temptation is to remember that evaluation should be progressive: the student should achieve a new level of learning during each lesson. For example, in flight one, the automation management area might be a “describe” item. By flight three, it is a “practice” item, and by flight five, it is a “manage-decide” item.
The student may be reluctant to self-assess if he or she has not had the chance to participate in such a process before. Therefore, the instructor may need to teach the student how to become an active participant in the collaborative assessment.