Thing 22: Online Teaching

On Friday, my colleagues and I shared what we learned from the Things in the Pedagogy track of the 23 Framework Things. I was assigned Thing 22: Online Teaching. I selected Option 3, though I didn’t actually complete the activity:

Post a brief comment below describing the outline for an online learning object (lesson) using the steps in the book to guide you. What part of the Framework will you focus on? Create an outcome statement, and select one of the common instructional design program activities (p.29) to assess the student’s competency.

However, I do think I’d be interested in developing something that helps students learn how to approach selecting a database. I imagine including research problem scenarios that students would need to match to an appropriate database based on the description. In the notes I posted to my colleagues (see below), I refer to this briefly, as we are working on developing content for a new GE course.
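Just to make the idea concrete for myself, here is a very rough sketch, in Python, of how the matching piece of such a learning object might be checked automatically. This is my own illustration, not anything from the module, and the scenario texts and database names are made-up placeholders:

```python
# Rough sketch of an auto-graded "match the research problem to a database"
# exercise. Scenario texts and database names are hypothetical placeholders.
ANSWER_KEY = {
    "Find peer-reviewed studies on nursing burnout": "CINAHL",
    "Locate newspaper coverage of a local election": "Newspaper Source",
    "Get background information on a psychology concept": "PsycINFO",
}

def grade(responses):
    """Return the fraction of scenarios matched to the expected database."""
    correct = sum(
        1 for scenario, database in responses.items()
        if ANSWER_KEY.get(scenario) == database
    )
    return correct / len(ANSWER_KEY)

if __name__ == "__main__":
    student_responses = {
        "Find peer-reviewed studies on nursing burnout": "CINAHL",
        "Locate newspaper coverage of a local election": "PsycINFO",
        "Get background information on a psychology concept": "PsycINFO",
    }
    print(f"Score: {grade(student_responses):.0%}")  # Score: 67%
```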

Here are my notes.

This module was presented by the steering committee of the New Literacies Alliance (NLA), a group of librarians from a variety of institutions working to design a common research instruction curriculum based on the ACRL Framework. The lessons they have created tie to particular knowledge practices and dispositions and are licensed under Creative Commons. Many appear in the ACRL Framework for Information Literacy Sandbox. If you have browsed the Sandbox, you may have seen the SoftChalk online modules, such as the Citations tutorial; many of these were designed by librarians involved with the NLA.

For this module, I read Chapter 3 and Appendix E of Creating and Sharing Online Library Instruction: A How-To-Do-It Manual for Librarians (2017), written by three NLA librarians: Joelle Pitts, Sara K. Kearns, and Heather Collins. The chapter outlines how to create learning objects using Wiggins and McTighe’s backward design curriculum planning model.

  1. Identify desired results.
    • What should students be able to do at the end of the instruction?
      • Select components of the Framework to teach.
  2. Determine assessment evidence.
    • How will we know if students have achieved the desired result?
      • Choose a Bloom’s Taxonomy level and verb.
      • Outline an activity the students will complete to demonstrate desired results.
      • Write a learning outcome.
  3. Plan learning experiences and instruction.
    • How can we support learners as they come to understand important ideas and processes?
      • Create redundant digital learning objects to support the learning outcome.
      • Create assessment activity.

Identify Desired Results

  • Learning objects should be kept to 8-15 minutes.
  • The knowledge practices or dispositions you select will need to be modified, because many of them are “too big” to cover in one object.
  • Highlight one major frame in the outcome, even though practices from different but related frames may be at play.
  • Choose a level of expertise [novice, beginner, competent, proficient, expert (Dreyfus & Dreyfus, 1980)].

Determine Assessment Evidence

  • Choose a Bloom’s Taxonomy level and verb.
    • “The higher the Bloom’s Taxonomy level, the more difficult it is to design online learning objects and activities, especially if automated grading is desired” (p. 26).
      • This makes me feel a lot better about what can be achieved for the modules we develop that are intended for instructors to assign to their students (WRI 01); these would be good for more concrete skills, such as selecting an appropriate, relevant database. It also makes me think about the SPRK courses, mostly because two of my three areas involve databases.
  • Write an outcome.
    • The student will + Bloom’s Taxonomy verb + evidence + in order to + desired results = outcome (a toy example of assembling this formula appears after this list).
    • Bloom’s Taxonomy list on p. 27
    • Learning outcome formula checklist on p. 28
    • Common types of instructional design program activities on p. 29
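To make the outcome formula above concrete, here is a toy example (mine, not the book’s) of assembling the pieces into an outcome statement, using the database-selection idea from earlier; the verb, evidence, and desired result are just placeholders:

```python
def write_outcome(bloom_verb, evidence, desired_result):
    # Formula from p. 28: "The student will" + Bloom's Taxonomy verb
    # + evidence + "in order to" + desired results = outcome.
    return f"The student will {bloom_verb} {evidence} in order to {desired_result}."

print(write_outcome(
    "compare",                                            # a Bloom's Taxonomy verb
    "database descriptions against a research scenario",  # evidence
    "select an appropriate database for their topic",     # desired result
))
# The student will compare database descriptions against a research scenario
# in order to select an appropriate database for their topic.
```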

Plan Learning Experiences

  • NLA has a storyboard template to serve as a guide for developing online learning objects (see Appendix D in the book, as this was not included in the PDFs).
    • Introduction, background info
    • Relevancy to students’ lives
    • State the problem and possible solutions
    • Lesson climax activity
    • Assessment
  • Have a peer review your learning object (see the Learning Object Rubric, Appendix E, p. 119)

Evaluating Infographics

I subscribe to communications from the Online Learning Consortium, and a couple of weeks ago, they sent out an infographic about the state of online education. Since I’m interested in online learning (I did my MLIS online, and I have taken a class on teaching online), I took a look at it, and I was surprised that the infographic indicated that 75 percent of undergraduates are age 25 or older. Right now I work at a community college library in Central California, and we have a ton of nontraditional students, but the share of our students who are age 25 or older is 35.6 percent; statewide, the share of community college students who are age 25 or older is 42.9 percent. OLC’s claim that 75 percent of undergraduates in the country are nontraditional seemed wrong to me. 75 percent?! [Although I did discover that, according to Choy (2002), if a broader definition of nontraditional is used, this figure is estimated at 73 percent.]

I seem to be helping a lot of students with fact-checking specific statistics lately. Thankfully, I can point students to resources like the National Center for Education Statistics (NCES) data, but statistics aren’t easy to look through or interpret, as my experience analyzing the infographic demonstrates.

OLC cited sources at the very bottom of the infographic, but it’s not clear which source goes with which fact. I dug into every single link to try to figure out where this 75 percent figure came from, but I was a little overwhelmed because I am not drawn to charts, lines, and numbers (data scientists and data science/statistics librarians, I bow down). I also recruited the librarians at the other campus to help me, and one of them wrote back that OLC had oversimplified the information, since the education statistics are broken down by type of college. Here’s what the National Center for Education Statistics’ Characteristics of Postsecondary Students information actually says:

In 2013, a higher percentage of full-time undergraduate students at public and private nonprofit 4-year institutions were young adults (i.e., under the age of 25) than at comparable 2-year institutions. At public and private nonprofit 4-year institutions, most of the full-time undergraduates (88 and 86 percent, respectively) were young adults. At private for-profit 4-year institutions, however, just 30 percent of full-time students were young adults (39 percent were ages 25–34, and 31 percent were age 35 and older).

Not cool, OLC.

Evaluating, analyzing, and interpreting information, whether in text, numbers, or images, is such an important skill, and not just for school; it’s a life skill. One of my good friends who teaches English shared Sheida White’s article “Seven Sets of Evidence-Based Skills for Successful Literacy Performance” (2011) from the now-defunct Adult Basic Education and Literacy Journal. In the article, which is based on her book Understanding Adult Functional Literacy: Connecting Text Features, Task Demands, and Respondent Skills (2011), she lists seven skills needed for “adolescents and adults to meet the literacy demands of education, work, citizenship, and daily life”: text search skills, inferential skills, language comprehension skills, basic reading skills, computation identification skills, computation performance skills, and application skills (p. 40). White writes:

…[S]econdary, post-secondary, and adult education programs typically do not provide explicit classroom instruction in the quantitative literacy skills needed to work with numbers embedded in prose and document texts. In fact, mathematical information is often stripped away from any surrounding authentic texts in schools to produce a cleaner measure of students’ skills in mathematics as a separate domain. This approach does not reflect the way adolescents and adults typically encounter quantitative problems in their daily lives, including workplaces. (p. 47)

This article changed the way my friend taught her courses. Like many English and communication teachers, she has an assignment in which students evaluate an advertisement for modes of persuasion, but she started adding in-class assignments where students had to break down a passage containing numbers and build their own chart from it. She also has them analyze charts and write down what they think each chart is showing. This was a hard task for some of her lower-level students. She and I dreamed of creating a learning community between English, math, and the library resources class (I have never taught it, and we were planning to offer it in Spring 2017, but I’m leaving) to work on some of these and other literacies. (See Jacobson and Mackey’s presentation from ACRL 2013 on metaliteracy and the Metaliteracy blog.)

I often think about the assignments I might give if I taught information literacy in a credit class environment. I love the idea of evaluating an infographic or looking at and interpreting a chart. So far, Project CORA has an assignment on designing infographics but not one on evaluating them, so I will do a little more digging elsewhere later. Brain Pickings recently had an article about the new book Jane Jacobs: The Last Interview and Other Conversations, in which Jacobs is quoted as saying:

If I were running a school, I’d have one standing assignment that would begin in the first grade and go on all through school, every week: that each child should bring in something said by an authority — it could be by the teacher, or something they see in print, but something that they don’t agree with — and refute it.

I think that, with some modification, a weekly statistics-checking exercise done PolitiFact-style (the site’s editor has a master’s in journalism and a master’s in library and information science) might be fun. I know the perfect infographic to start with. 😉

Choy, S. (2002). Nontraditional undergraduates. Retrieved from http://nces.ed.gov/pubs2002/2002012.pdf

Jacobson, T.E., & Mackey, T. (2013). What’s in a name? Information literacy, metaliteracy, or transliteracy? [SlideShare slides]. Retrieved from http://www.slideshare.net/tmackey/acrl-2013

National Center for Education Statistics. (2015, May). The condition of education: Characteristics of postsecondary students. Retrieved from https://nces.ed.gov/programs/coe/indicator_csb.asp

Online Learning Consortium. (2016). 2016 higher education online learning landscape. Retrieved from http://info2.onlinelearningconsortium.org/rs/897-CSM-305/images/OLC2016ONLINELEARNINGIMPERATIVEINFOGRAPHIC.pdf

Popova, M. (2016, May 4). Urbanism patron saint Jane Jacobs on our civic duty in cultivating cities that foster a creative life [Weblog]. Retrieved from https://www.brainpickings.org/2016/05/04/jane-jacobs-last-interview/

White, S. (2011). Seven sets of evidence-based skills for successful literacy performance. Adult Basic Education and Literacy Journal, 5(1), 38-48. Retrieved from http://eric.ed.gov/?id=EJ918178