Using the Right Bloom’s Level for Course Content

As an instructional designer and educator, I find that understanding and using Bloom’s revised taxonomy is crucial to creating significant learning experiences. An important part of the course design process is aligning Bloom’s levels with the topic. If you’re teaching spelling, the majority of learning sits in the lower levels. The intended outcome is for a student to know the facts: how to spell cat, or dog, or even taxonomy. At this level, you would use a very simple assessment tool.

“Take out your paper…number from 1-20…#1 – spell the word CAT.”

However, if you’re teaching a college-level philosophy class, your focus is on the higher-order levels. Assessing the student’s learning requires more than a simple test focused on recalling facts. Unlike a spelling test, a philosophy assessment might be a critique of Plato’s allegory of the Cave. Now that’s critical thinking! (and possibly a headache from thinking too much)

It’s not always easy to determine the appropriate Bloom’s level for the learning outcome. When I work with an instructor, I find it helpful to use a simple explanation of each Bloom’s level. This allows the instructor to evaluate the content and decide if the level is appropriate.

Level 1: Remember

This level helps us recall foundational information and/or facts: names, dates, formulas, and definitions.

Level 2: Understand

Understanding means that we can explain the main ideas and concepts of a topic, and translate that into meaning by interpreting, classifying, summarizing, inferring, comparing, and explaining.

Level 3: Apply

Application means using concepts in real-world situations and knowing when, where, and how to employ methods and ideas.

Level 4: Analyze

Analysis means breaking a topic or idea into components or examining a subject from different perspectives. It helps us see how the “whole” is created from the “parts.” Analysis helps reveal the connections between facts.

Level 5: Synthesize

Synthesizing means considering individual elements together for the purpose of drawing conclusions, identifying themes, or determining common elements. Here you want to shift from “parts” to “whole.” (Strictly speaking, the 2001 revision folds synthesis into the Create level, but I find the parts-to-whole shift worth calling out on its own.)

Level 6: Evaluate

Evaluating means making judgments about something based on criteria and standards. This requires checking and critiquing an argument or concept to form an opinion about its value. Often there is not a clear or correct answer to this type of question. Rather, it’s about making a judgment and supporting it with reasons and evidence.

Level 7: Create

Creating involves putting elements together to form a coherent or functional whole. Creating includes reorganizing elements into a new pattern or structure through planning. This is the highest and most advanced level of Bloom’s Taxonomy.*
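The level descriptions above can be captured in a simple lookup that helps sanity-check a learning objective against its intended Bloom’s level. This is only an illustrative sketch: the verb lists and the `suggest_level` helper are my own invention for this post, not part of any standard taxonomy tool, and a real objective audit would need far richer verb lists.

```python
# Hypothetical helper: map each Bloom's level to a few characteristic verbs,
# then suggest a level for an objective based on its opening verb.

BLOOM_VERBS = {
    "Remember":   ["list", "define", "recall", "name", "spell"],
    "Understand": ["explain", "summarize", "classify", "compare", "infer"],
    "Apply":      ["use", "demonstrate", "solve", "implement"],
    "Analyze":    ["differentiate", "organize", "deconstruct", "examine"],
    "Synthesize": ["combine", "integrate", "formulate"],
    "Evaluate":   ["judge", "critique", "justify", "appraise"],
    "Create":     ["design", "construct", "produce", "plan"],
}

def suggest_level(objective: str) -> str:
    """Return the first Bloom's level whose verb list matches the objective's opening verb."""
    first_word = objective.lower().split()[0]
    for level, verbs in BLOOM_VERBS.items():
        if first_word in verbs:
            return level
    return "Unknown"

print(suggest_level("Spell the word CAT"))                     # Remember
print(suggest_level("Critique Plato's allegory of the Cave"))  # Evaluate
```

Running the two examples from this post through the sketch places the spelling objective at Remember and the Plato critique at Evaluate, which matches the intuition above.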


Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., & Wittrock, M. C. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York, NY: Longman.


Justifying e-learning to faculty

Numerous articles, studies, and reports point to the remarkable growth of online education and the difficulty of justifying eLearning to faculty members. While the two articles I selected confirm this resistance, a quick Google search illustrates just how pervasive faculty resistance is toward online education. According to the Babson Survey Research Group report (Allen and Seaman, 2017), there were 6.3 million students (1 in 4 students) enrolled in online undergraduate courses in 2016. Given the demand for online classes, it is imperative that higher education leadership and instructional designers employ strategies to justify eLearning and overcome faculty resistance.

According to a 2016 Wiley Education article, 58% of faculty members are skeptical of eLearning, based on the belief that online courses produce results inferior to traditional face-to-face instruction. Overcoming this fear involves not only presenting the data that shows the effectiveness of online courses but also providing faculty members with information about their own institution’s online courses. General statistical information is essential, but seeing the numbers that represent their school gives instructors a personal connection to the evidence. The Babson survey confirms this, stating that “the academic leaders who had favorable opinions of online learning had some direct exposure to the medium” (Allen and Seaman, 2017). Given the information in this article, the Babson survey, and my own experience with faculty, I agree that showing faculty members that eLearning is effective in meeting learning objectives is crucial to justifying online education.

In a 2018 Inside Higher Ed article (Liberman, 2018), several online learning leaders discussed ways to justify eLearning to resistant faculty. Paul Krause, CEO of Cornell University’s online learning department, emphasized a crucial element that is often overlooked in this conversation: many college educators have either limited experience with online learning or experience only with poorly designed programs. Faculty perceptions shaped by negative information and experiences must be addressed. Exposing instructors to high-quality courses, institutional standards of excellence in online education, and your own commitment to creating significant learning in an eLearning environment is essential to justifying online classes to faculty.

Each article shows how crucial it is that we are able to justify eLearning. As I read each article, I noticed one element missing from the conversation: empathy. Taking an empathetic approach produces more significant results than working from your own presuppositions about those who resist eLearning. Recognizing their contributions, expertise, and value shows instructors that you respect them. Faculty are just like everyone else; change is scary. They have strong emotional ties to traditional instruction that may supersede their ability to view change logically. Their technological skills may not be up to par, and they fear failing as online educators.

As an ID, I think that practicing what we preach is a critical aspect of justifying eLearning to resistant faculty. Course design begins with learner analysis to understand the learner—their prior experience, values, beliefs, and attitudes. It is conducted through an objective lens with the goal of designing learning that meets the learner where they are. Similarly, performing a faculty analysis provides insight into the reasons behind their resistance. It removes the biases we may have and allows us to see the individual in the light of empathy and compassion. Like any change management attempt, building rapport, establishing trust, and listening to the faculty member can bring down the walls preventing communication. Once this occurs, framing the conversation for that individual in a way that validates their experience while appealing to their passion for education has the power to create acceptance for change.

Building trust: How to address faculty concerns about online education. (2016). Wiley Education.

Liberman, M. (2018). Overcoming faculty resistance – or not. Inside Higher Ed.

Allen, E., & Seaman, J. (2017). Digital learning compass: Distance education enrollment report. Babson Survey Research Group.




Assessment Strategies

Understanding the connection between formative and summative assessments, and how formative assessments should help the learner do well on the summative assessment, is vital for an instructional designer.

In the video shown below, the author shares an easy-to-understand description of both formative and summative assessments. Through previous coursework and my job as an ID at Samford, I work with assessments regularly, but I had not considered how the two types work in tandem to determine the extent to which students are successfully meeting the course objectives.

Understanding that formative assessments should work as a checkpoint for both the learner and the instructor was a “light-bulb” moment for me. I recognized the value of feedback for the learner, but had not considered that a formative assessment offers critical information and feedback for the instructor as well. As instructors stop at the formative assessment checkpoint, they have an opportunity to evaluate their own instructional strategy and determine whether it is effective in achieving the learning objectives. From both the learner’s and the instructor’s perspective, formative assessments are essential to creating significant learning.

Before I began the IDTE program, my educational experience included mostly summative assessments. Now I realize how valuable quality formative assessments are in learning. I can point to several undergrad courses where feedback from a few formative assessments would not only have made my work a little easier but also deepened my learning.

Having a specific checkpoint where I can stop, receive expert feedback from my trusty guide, then make adjustments based on that feedback has made my IDTE learning journey much richer and more satisfying. Plus, it has helped embed the learning in my mind. The quest for learning requires both types of assessment for the learner to meet the objectives successfully.

Source: Gary Wright


Chunky Learning

Content chunking involves organizing information in “chunks” so that it’s easier for learners to digest. Instead of memorizing multiple concepts, online learners are able to analyze each concept thoroughly and absorb the content, one bite at a time. Once they’ve assimilated the content, they move onto the next concept.

Chunking stems from the field of cognitive psychology. Research indicates that our working memory can hold only a finite number of items, and when it reaches full capacity, we experience cognitive overload.

However, organizing the information into chunks takes stress off the mental pathways, making it easier to remember the learning concepts. Chunking learning content allows the instructor to define an objective and then organize the content, activities, assessments, and other learning tools in segments that support the objective.

How can an instructional designer apply chunking in course design?

1. Make sure the course begins with specific goals and objectives. Well-designed objectives are the keystone of each chunk. Canvas makes this easy with its module-based design.

2. Stay within the limits of cognitive capacity. Research on working memory suggests our brains can process only about three to five pieces of information at a time. It’s like memorizing a telephone number: our minds remember 205-251-1587 more easily than 2052511587.

3. Create a content map during the design phase. This allows you to categorize your content and break it down into relevant modules. Instructional designers create a list of the desired outcomes or goals, then list three to five related concepts for each. Each concept should tie into the learning objective. The next step is gathering all relevant assets, including eLearning activities, online assessments, and multimedia. Finally, organize the content by how each piece fits into the big picture and supports the learning outcomes.
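The three steps above can be sketched as a minimal content map. This is a toy illustration, not a reference to any real Canvas feature or API: the `build_module` helper, the module names, and the three-to-five concept check are all my own inventions for the sake of the example.

```python
# Hypothetical content map: each module (chunk) is anchored to one objective
# and holds 3-5 related concepts, mirroring the working-memory limit above.

MIN_CONCEPTS, MAX_CONCEPTS = 3, 5

def build_module(objective, concepts, assets):
    """Bundle an objective with its concepts and assets, enforcing the chunk-size limit."""
    if not MIN_CONCEPTS <= len(concepts) <= MAX_CONCEPTS:
        raise ValueError(
            f"Chunk has {len(concepts)} concepts; aim for {MIN_CONCEPTS}-{MAX_CONCEPTS}"
        )
    return {"objective": objective, "concepts": concepts, "assets": assets}

course_map = [
    build_module(
        "Explain the lower levels of Bloom's revised taxonomy",
        ["Remember", "Understand", "Apply"],
        ["intro video", "level-matching quiz"],
    ),
]

# The telephone-number illustration from step 2: chunk the digits into groups.
phone = "2052511587"
chunked = "-".join([phone[:3], phone[3:6], phone[6:]])
print(chunked)  # 205-251-1587
```

The size check is the point of the sketch: a module that accumulates six or seven concepts fails loudly at design time, prompting the designer to split it into two chunks.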

As I mentioned earlier, the Module concept is one of the great features within Canvas. (Northwestern University’s sample course is an excellent example of chunking.) Instructors can organize content based on the module goals and objectives, students are able to locate content with ease, and it helps students see the connections between concepts.

In my experience as an online learner, some classes really stimulated my mind and I retained the information. I also had classes that were nothing more than memorizing information for a test. It’s no surprise that the former course design was based on chunking, and the latter was more like ‘dump truck design’. (Yes. I just invented that phrase.) Essentially, the instructor backed up the truck and dumped all the information on the students. No discussions. No relevant structure. Just information in large quantities.

If you want to learn more about chunking, check out the links below.

You can read more and view examples of designing with content chunking on the Nielsen Norman Group site.

George A. Miller’s Information Processing Theory


Generation Z

Analyzing Gen Z

Addressing generational differences in course design is an important, yet challenging task for instructional designers. In theory, design for younger learners is less complex than design for mixed-age adult learners. Much has been written comparing Baby Boomers to Millennials; however, today’s secondary schools are not filled with millennials. These students are from Generation Z (also known as post-millennials, iGeneration, Homeland generation).

Both my sons are millennials and, for the most part, exhibit the classic learner traits of their generation. However, millennials born in the last generational year (1994) graduated high school around 2012. Generation Z began in 1995 and continues through our current time. This means that the majority of secondary school students are vastly different from children of the millennial generation.

What does Gen Z look like?

1. Gen Z is less focused, with shorter attention spans. Their daily lives are filled with fast-paced distractions that compete for educational commitment and time. Smart devices, Snapchat, FaceTime, and many similar apps are now the norm for even the youngest children.

2. They place less importance on education. Gen Z believes that they are capable of learning things for themselves. They prefer learning that does not force them to conform to what they perceive as useless knowledge that does not apply to their chosen profession.

3. They are more entrepreneurial. Inspired by forward-thinking companies like Google, Tom’s Shoes, and Apple, it’s no surprise that 72% of Gen Z high school students plan to start their own business after high school (Kadakia, 2015).

4. Gen Z is the iGeneration. They were born into a highly connected, social environment. It is reported that 92% of Gen Z children have a digital footprint, and the average age of significant exposure to technology is now two years old (Spencer, 2018)!

5. They are disruptive. This is a no-brainer since they consider formal education inferior to their own desire and capacity for self-education.

6. They are true digital natives and dependent on technology. As a result, they encounter parents, teachers, and society scolding their “technology addiction”. This, in turn, makes a Gen Z’er intolerant of less technologically advanced generations.

[Gen Z infographic. Source: The Herb Kelleher Center @UTAustin]

So, what does this have to do with instructional design?

It illustrates the importance of understanding age-related differences as well as the need for accurate learner analysis in course design. Consider my own preferences as a Gen Xer: a course that consists of textbook readings, exams, and a paper is often an exercise in endurance. But if the course incorporates current information with real-world opportunities to apply the knowledge and skills, then I’m all in. I’ll devote hours and hours to researching and developing the assignment. Someone from Gen Z, however, may hate the very course design that fits my generational preference, and possibly resist or withdraw completely. Generation Z, especially the child learners, want fast-paced, technologically advanced learning. Augmented reality, gamification, mobile learning, virtual labs, social learning, and micro-learning are a few examples of design elements that appeal to Gen Z.

The bottom line is that, as an instructional designer, I cannot neglect age-related issues within learner analysis. The differences between generations are astounding and, if ignored, often result in lower engagement, satisfaction, and learning.

Crystal Kadakia’s TedX talk (below) is an excellent explanation of the differences between Millennials and Gen Z.


Oblinger, D.G., & Oblinger, J.L. (Eds.) (2005). Educating the net generation. Boulder, CO: EDUCAUSE.

Spencer, B. (2018). Digital literacy: The quest to become digitally literate.
