AFTER ACTION REVIEW

THE POWER OF AFTER-ACTION REVIEWS

In 2013, I took a position as training director of a crisis hotline. The majority of call agents worked remotely from locations all around the world. Even though I was an experienced director, I had not worked with remote teams before this position.

Since Google offered a free subscription to nonprofits, we decided to use Google Hangouts for training. Each trainer learned how to use Hangouts, and we held a special “train the trainers” phase before the official launch of the program.

After Action Review

What was expected to happen

Our main goal was to get our training online using Google Hangouts so we could share documents and presentations and see each other’s screens. We expected some problems, but we planned to get the training program online and continue learning how to conduct meetings with the technology.

What actually happened?

1. The team dynamics were very good. Everyone worked hard to learn Google Hangouts and how to lead a virtual training session.

2. From a technology perspective, we did not have a smooth start. One trainer in South Africa had technical problems he didn’t know how to resolve, and our tech support staff member was unavailable. Because it is so difficult to schedule people for a training session, we later decided as a group that if one person is having technology problems, we will continue working and provide that person with detailed meeting notes. We also decided to record all training sessions.

What went well?

1. Google Hangouts for Business was a good choice for virtual training. The additional tools within the program (whiteboard, screen share, present screen) worked well with the content of the training.

2. Using ice-breaker exercises helped trainees get comfortable with each other and helped the team dynamics. They also helped the trainees become acclimated to training in a virtual space.

What can be done to make the process better?

1. To accommodate members who are unable to join our meeting, we should take minutes and record each training session. The notes and recordings should be posted on the employee section of the website.

2. We did not have a clear protocol on what we should do in case someone is unable to join a meeting. A protocol should be established and added to the employee training handbook and sent out with the training invites.

3. Some trainers are not comfortable or experienced with Hangouts. Google provides video-based training that can be used for any trainer who is struggling with the program.

Conclusion

Even though the virtual training program was successful, the process could have benefited from an AAR during the train-the-trainers phase. Many issues we encountered in this phase continued to plague the training program for its first year. My team did learn and grow through the process, but using a reflect-plan-act system would have led to better-equipped trainers, improved learning outcomes for the trainees, and a much stronger training program overall.


Using the Right Bloom’s Level for Course Content

As an instructional designer and educator, understanding and using Bloom’s revised taxonomy is crucial to creating significant learning experiences. An important part of the course design process is aligning Bloom’s levels with the topic. If you’re teaching spelling, the majority of the learning will be at the lower levels. The intended outcome is for a student to know the facts: how to spell cat, or dog, or even taxonomy. At this level, you would use a very simple assessment tool.

“Take out your paper…number from 1-20…#1 – spell the word CAT.”

However, if you’re teaching a college-level philosophy class, your focus is on the higher-order levels. Assessing the student’s learning requires more than a simple test focused on recalling facts. Unlike a spelling test, a philosophy assessment might be a critique of Plato’s Allegory of the Cave. Now that’s critical thinking! (and possibly a headache from thinking too much)

It’s not always easy to determine the appropriate Bloom’s level for the learning outcome. When I work with an instructor, I find it helpful to use a simple explanation of each Bloom’s level. This allows the instructor to evaluate the content and decide if the level is appropriate.

Level 1: Remember

This level helps us recall foundational information and/or facts: names, dates, formulas, and definitions.

Level 2: Understand

Understanding means that we can explain the main ideas and concepts of a topic, and translate that into meaning by interpreting, classifying, summarizing, inferring, comparing, and explaining.

Level 3: Apply

Application allows us to recognize and use the concepts in real-world situations as well as when, where, or how to employ methods and ideas.

Level 4: Analyze

Analysis means breaking a topic or idea into components or examining a subject from different perspectives. It helps us see how the “whole” is created from the “parts.” Analysis helps reveal the connections between facts.

Level 5: Evaluate

Evaluating means making judgments about something based on criteria and standards. This requires checking and critiquing an argument or concept to form an opinion about its value. Often there is not a clear or correct answer to this type of question. Rather, it’s about making a judgment and supporting it with reasons and evidence.

Level 6: Create

Creating involves putting elements together to form a coherent or functional whole. This includes synthesizing individual elements into a new pattern or structure through planning: drawing conclusions, identifying themes, and reorganizing the parts. (In the Anderson and Krathwohl revision cited below, the original Synthesis level is folded into Create.) This is the highest and most advanced level of Bloom’s revised taxonomy.
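When I walk an instructor through the levels, the practical question is usually “which verbs belong in my objective?” A simple lookup table captures that workflow. The sketch below is my own illustration (the verb lists are common examples associated with Anderson and Krathwohl’s revised levels, not an official mapping from this post):

```python
# Illustrative sketch: map Bloom's revised levels (Anderson & Krathwohl, 2001)
# to sample verbs an instructor can use when drafting learning objectives.
BLOOMS_VERBS = {
    "remember": ["list", "define", "recall", "name"],
    "understand": ["summarize", "classify", "explain", "compare"],
    "apply": ["use", "demonstrate", "solve", "implement"],
    "analyze": ["differentiate", "organize", "attribute", "deconstruct"],
    "evaluate": ["check", "critique", "judge", "justify"],
    "create": ["design", "construct", "plan", "produce"],
}

def suggest_verbs(level: str) -> list:
    """Return sample objective verbs for a given Bloom's level."""
    return BLOOMS_VERBS[level.lower()]

print(suggest_verbs("Evaluate"))  # ['check', 'critique', 'judge', 'justify']
```

An instructor drafting a philosophy assessment, for example, would pull from the “evaluate” row (“critique Plato’s Allegory of the Cave”) rather than the “remember” row.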

References

Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., &amp; Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York, NY: Longman.


Justifying e-learning to faculty

Numerous articles, studies, and reports point to the remarkable growth of online education and the difficulty of justifying eLearning to faculty members. While the two articles I selected confirm this resistance, a quick Google search illustrates just how pervasive faculty resistance toward online education is. According to the Babson Survey Research Group report (Allen and Seaman, 2017), 6.3 million students (1 in 4) were enrolled in online undergraduate courses in 2016. Given the demand for online classes, it is imperative that higher education leadership and instructional designers employ strategies to justify eLearning and overcome faculty resistance.

According to a 2016 Wiley Education article, 58% of faculty members are skeptical of eLearning, based on the belief that online courses produce results inferior to traditional face-to-face instruction. Overcoming this fear involves not only presenting data that shows the effectiveness of online courses but also providing faculty members with information about their own institution’s online courses. General statistical information is essential, but seeing the numbers that represent their school gives instructors a personal connection to the evidence. The Babson survey confirms this, stating that “the academic leaders who had favorable opinions of online learning had some direct exposure to the medium” (Allen and Seaman, 2017). Given the information in this article, the Babson survey, and my own experience with faculty, I agree that showing faculty members that eLearning is effective in meeting learning objectives is crucial to justifying online education.

In a 2018 Inside Higher Ed article (Liberman, 2018), several online learning leaders discussed ways to justify eLearning to resistant faculty. Paul Krause, CEO of Cornell University’s online learning department, emphasized a crucial element that is often overlooked in this conversation: many college educators have either limited experience with online learning or experience only with poorly designed programs. Faculty perceptions shaped by negative information and experiences must be addressed. Exposing instructors to high-quality courses, pointing to institutional standards of excellence in online education, and demonstrating your own commitment to creating significant learning in an eLearning environment are all essential to justifying online classes to faculty.

Each article shows how crucial it is that we are able to justify eLearning. As I read each article, I noticed one element missing from the conversation: empathy. Taking an empathetic approach produces more significant results than working from your own presuppositions about those who resist eLearning. Recognizing their contributions, expertise, and value shows instructors that you respect them. Faculty are just like everyone else; change is scary. They may have strong emotional ties to traditional instruction that supersede their ability to view change logically. Their technological skills may not be up to par, and they may fear failing as online educators.

As an ID, I think that practicing what we preach is a critical aspect of justifying eLearning to resistant faculty. Course design begins with learner analysis to understand the learner—their prior experience, values, beliefs, and attitudes. It is conducted through an objective lens with the goal of designing learning that meets the learner where they are. Similarly, performing a faculty analysis provides insight into the reasons behind their resistance. It removes the biases we may have and allows us to see the individual in the light of empathy and compassion. Like any change management attempt, building rapport, establishing trust, and listening to the faculty member can bring down the walls preventing communication. Once this occurs, framing the conversation for that individual in a way that validates their experience while appealing to their passion for education has the power to create acceptance for change.

Articles
Building Trust: How to Address Faculty Concerns about Online Education. 2016. Wiley Education.
https://edservices.wiley.com/wp-content/uploads/2016/08/WES-Playbook_Building-Trust-How-to-Address-Faculty-Concerns-About-Online-Ed.pdf

Liberman, Mark. 2018. Overcoming Faculty Resistance – Or Not. Inside Higher Ed. https://www.insidehighered.com/digital-learning/article/2018/03/14/experts-offer-advice-convincing-faculty-members-teach-online-or?mc_cid=0675f4d198&mc_eid=5c54a08240

References
Allen, Elaine and Jeff Seaman. 2017. Digital Learning Compass: Distance Education Enrollment Report. Babson Survey Research Group.



Assessment Strategies

Understanding the connection between formative and summative assessments, and how formative assessments should help the learner do well on the summative assessment, is vital for an instructional designer.

In the video shown below, the author shares an easy-to-understand description of both formative and summative assessments. Through our previous coursework and my job as an ID at Samford, I am exposed to assessments regularly. But I had not considered how the two types work in tandem to determine the extent to which students are successfully meeting the course objectives.

Understanding that formative assessments should work as a checkpoint for both the learner and the instructor was a “light-bulb” moment for me. I recognized the value of feedback for the learner, but had not considered that a formative assessment offers critical information and feedback for the instructor as well. As instructors stop at the formative assessment checkpoint, they have an opportunity to evaluate their own instructional strategy and determine whether it is effective in achieving the learning objectives. From both the learner and instructor perspectives, formative assessments are essential to creating significant learning.

Before I began the IDTE program, my educational experience included mostly summative assessments. Now I realize how valuable quality formative assessments are to learning. I can point to several undergrad courses where feedback from a few formative assessments would not only have made my work a little easier but also helped me learn so much more.

I think that having a specific checkpoint where I can stop, receive expert feedback from my trusty guide, and then make adjustments based on that feedback has made my IDTE learning journey much richer and more satisfying. Plus, it has helped me embed the learning in my mind. The quest for learning requires both types of assessment for the learner to meet the objectives successfully.

Source: Gary Wright


Chunky Learning

Content chunking involves organizing information in “chunks” so that it’s easier for learners to digest. Instead of memorizing multiple concepts at once, online learners are able to analyze each concept thoroughly and absorb the content one bite at a time. Once they’ve assimilated the content, they move on to the next concept.

Chunking stems from the field of cognitive psychology. Research indicates that our working memory can hold only a finite number of items, and when it reaches full capacity, we experience cognitive overload.

However, organizing the information into chunks takes stress off the mental pathways, making it easier to remember the learning concepts. Chunking learning content allows the instructor to define an objective and then organize the content, activities, assessments, and other learning tools in segments that support the objective.

How can an instructional designer apply chunking in course design?

1. Make sure the course begins with specific goals and objectives. Well-designed objectives are the keystone of each chunk. Canvas makes this easy with its Module-based design.

2. Stay within the limits of cognitive capacity. Working-memory research suggests our brains can process only about three to five pieces of information at a time. It’s like memorizing a telephone number: our minds remember 205-251-1587 more easily than 2052511587.

3. Create a content map during the design phase. This allows you to categorize your content and break it down into relevant modules. Instructional designers create a list of the desired outcomes or goals, then list three to five related concepts for each. Each concept should tie into the learning objective. The next step is gathering all relevant assets, including eLearning activities, online assessments, and multimedia. Finally, organize the assets by how they fit into the big picture and support the learning outcomes.
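The content-map steps above can be sketched as a small data structure. This is my own illustration (the module name, objective, concepts, and assets are hypothetical examples, not from any real course), along with a helper that splits any list into working-memory-sized chunks:

```python
# Hypothetical content map for one module: objective -> 3-5 concepts -> assets.
content_map = {
    "module": "Formative Assessment Basics",  # example module name
    "objective": "Explain how formative assessments guide instruction",
    "concepts": [  # three to five related concepts per outcome
        "Definition of formative assessment",
        "Feedback loops for learners",
        "Feedback loops for instructors",
    ],
    "assets": ["video lecture", "practice quiz", "discussion prompt"],
}

def chunk(items, size=4):
    """Split a list into chunks of at most `size` items (3-5 recommended)."""
    return [items[i:i + size] for i in range(0, len(items), size)]

# Like remembering 205-251-1587 instead of 2052511587:
print(chunk([1, 2, 3, 4, 5, 6, 7], 3))  # [[1, 2, 3], [4, 5, 6], [7]]
```

The point of the structure is simply that every concept and asset hangs off one objective, so each module stays a digestible chunk rather than a “dump truck” of loose content.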

As I mentioned earlier, the Module feature is one of the great tools within Canvas. (Northwestern University’s sample course is an excellent example of chunking.) Instructors can organize content around module goals and objectives, students can locate content with ease, and the structure helps students see the connections between concepts.

In my experience as an online learner, some classes really stimulated my mind and I retained the information. I also had classes that were nothing more than memorizing information for a test. It’s no surprise that the former course design was based on chunking, and the latter was more like “dump truck design”. (Yes. I just invented that phrase.) Essentially, the instructor backed up the truck and dumped all the information on the students. No discussions. No relevant structure. Just information in large quantities.

If you want to learn more about chunking, check out the links below.

You can read more and view examples of designing with content chunking on the Nielsen Norman Group site.

George A. Miller’s Information Processing Theory
