You Got Research in My Writing Class: Making a Difference

From Teaching Information Literacy and Writing Studies, ed. Grace Veach (2018)


This chapter reflects upon a faculty-librarian partnership, guided by similarities between the ACRL Framework for Information Literacy (2015) and the WPA Framework for Success in Postsecondary Writing (2011), designed to teach information literacy skills alongside writing skills at a four-year university. It details a classroom-based embedded librarian model created to encourage knowledge transfer and support student research in first-year composition classes, as an alternative to one-shot library instruction. This profile of our program design is meant as a model for others interested in building a similar collaboration. Assessment data, collected via the AAC&U Information Literacy VALUE Rubric (2013) and student surveys, compare information literacy learning in the embedded program with learning from a single session with a librarian. The data show that students do benefit from the embedded program, though they tend to self-report stronger information literacy skills than faculty scorers judge them to possess.

INTRODUCTION

Information Literacy Necessitates Embedded Librarianship

Librarians have many concerns about the Framework for Information Literacy: whether it is teachable, whether it is assessable, and whether it should have fully replaced the Information Literacy Competency Standards for Higher Education. Most of the controversy has to do with the introduction of threshold concepts, the “light bulb” moments we have about information and research that alter our searching, evaluating, and citing practices (e.g., Research as Inquiry). Such moments are difficult to teach and measure, even over several classes. But these threshold concepts articulate the complex challenges with which librarians grapple: namely, that information literacy comprises many discrete skills and attitudes, and cannot be taught, wholesale, during a single-session (or “one-shot”) library class.

Information literacy, as a central value in academic life, is typically taught through the study of another content area. In order for it to make sense, there must be some common information about which to become literate; otherwise, the tools of information literacy have no application. A course on information literacy alone will not necessarily result in more savvy and responsible student research. One of the most common disciplines to partner with librarians teaching information literacy is composition studies; as students learn writing processes and skills, research strategies can be taught as part of a responsible and responsive writing process.

A perusal of commonly assigned textbooks in first-year writing classes (The Allyn and Bacon Guide to Writing [Ramage, Bean, & Johnson, 2009], Everything’s an Argument [Lunsford, Ruszkiewicz, & Walters, 2016], The Norton Field Guide to Writing [Bullock, 2013], The St. Martin’s Guide to Writing [Axelrod & Cooper, 1991], They Say/I Say [Graff, Birkenstein, & Durst, 2009], The Prentice Hall Guide for College Writers [Reid, 2011]) shows that each devotes significant portions of its text to guidance on research. All have at least one chapter devoted to some aspect of research, whether on finding sources, evaluating sources, integrating sources into an argument, avoiding plagiarism, or citation. Many have more than one such chapter, and the Norton Field Guide has nine.

Embedded librarianship, which we used in the composition courses detailed in this chapter, functions as a middle ground through which students can be introduced to the Framework. The name is borrowed from embedded journalism, in which “journalists become a part of [a] military unit, providing a perspective, ’a slice of the war’ from their vantage point” (Schulte, 2012). In the case of information literacy, an embedded librarian observes a classroom to understand the material being covered, as well as the concerns of the professor and the students. The embedded librarian can then integrate information literacy skills and threshold concepts, as appropriate, to shed new light on the material.

A Tale of Two Frameworks

In order for an embedded program to be of real help, instead of intrusive or disruptive, the goals of the librarian teaching information literacy should match the objectives outlined for the course. Luckily, the Framework for Information Literacy and the Council of Writing Program Administrators’ (WPA) Framework for Success in Postsecondary Writing have a lot in common. The Framework for Information Literacy challenges us to teach Research as Inquiry, that Authority Is Constructed and Contextual, and to frame Searching as Strategic Exploration (Association of College and Research Libraries Board, 2015). The WPA Framework urges the development of “critical thinking through writing, reading, and research,” challenging students to generate questions, conduct research, and evaluate sources. Source evaluation shows up again in the WPA Framework under “Develop rhetorical knowledge,” drawing attention to the ideas of audience, purpose, and context—information literacy concerns surrounding the research process (Council of Writing Program Administrators, 2011). As these national disciplinary conversations about composition and information literacy show, research has a strong presence in composition courses, which makes them a natural environment in which to cover information literacy concepts.

Of course, the Framework for Information Literacy aims higher than to simply impart skills for library use within composition. It “envisions information literacy as extending the arc of learning throughout students’ academic careers and as converging with other academic and social learning goals” (Association of College and Research Libraries Board, 2015). Under the threshold concept Information Creation as a Process, one of the Knowledge Practices is to “transfer knowledge of capabilities and constraints to new types of information products.” Instead of modeling simple, linear behaviors for students to emulate in the library, this model encourages librarians to focus their students on metacognition and metaliteracy: to examine how they get information in their lives, how they evaluate its credibility, and how it may be of use to them. Metacognition is a shared focus in composition studies as well. Many teacher-scholars (Gorzelsky, Driscoll, Paszek, Hayes, & Jones, 2016; Smit, 2004; Yancey, Robertson, & Taczak, 2014) similarly emphasize that students need to articulate their choices and processes if they hope to generalize their knowledge to other tasks and situations. Strengthening these understandings is useful in students’ academic, political, and personal lives.

Assessing Information Literacy: A New Challenge

The history of measuring information literacy learning in higher education is brief. The original Information Literacy Competency Standards for Higher Education (approved and introduced by the ACRL in 2000, and rescinded in 2016) were presented as a set of skills that were more easily measured. For example, measuring students’ ability to properly cite sources in a given writing style has been achieved to some extent with standardized tests, such as Kent State’s Tools for Real-time Assessment of Information Literacy Skills (TRAILS) open-access test (http://trails-9.org/index.php?page=home) and the proprietary Project SAILS test (https://www.projectsails.org/). Both standardized tests measure student information literacy levels pre- and postinstruction to produce benchmark and concluding data sets. Carrick Enterprises has since assumed control of these information literacy tests and will soon launch the Threshold Achievement Test for Information Literacy (TATIL), a standardized test to measure students’ comprehension and attainment of the information literacy threshold concepts (Radcliff, Cunningham, Hannon, Harmes, & Wiggins, 2018).

While these tests do provide methods for efficiently capturing data on student research, they also add to the burden of cost (to either the school or the students) as well as that of time management for all stakeholders: the librarian, who must encapsulate as much content as possible in an hour; the instructor, who must carve out time for research instruction; and the students, who tend to suffer from test fatigue. In Janine Lockhart’s study of online information literacy skills assessment instruments (2014), Megan Oakleaf is cited as critiquing standardized multiple-choice tests for not testing “higher-level thinking skills” and for lacking “the ability to assess students’ authenticity” in the application of information literacy skills (p. 37). It is this issue of assessing authentic learning of abstract threshold concepts that informed our decision to assess student writing from our embedded librarian program using the AAC&U’s Information Literacy VALUE Rubric. The VALUE Rubric requires scorers to assess discrete skills, such as citation, but importantly locates these skills within the context of the student’s entire paper (Association of American Colleges and Universities, 2013). In other words, it is a more contextual measure of information literacy applied in a paper than skills-based tests like TRAILS and SAILS.

Assessing students’ information literacy, much less their ability to generalize their knowledge about research, is difficult and complex. As intimidating as the Framework for Information Literacy has been to understand and apply in the classroom, assessing a curriculum based on threshold concepts has proven to be even more daunting. First, faculty members often struggle with the burden of assessing programmatic curricula, due to time constraints and perhaps some anxiety about having their curricula (and therefore their instructional capabilities) evaluated. Second, the process of quantifying students’ understanding of the more abstract concepts, such as synthesizing information from sources to support a thesis or accessing needed information, has yet to be fully developed. Indeed, two of the most significant barriers to measuring the effectiveness of any information literacy curriculum are the process-oriented nature and the abstract quality of threshold concepts. Given that only one threshold concept can realistically be taught or grasped within a one-hour period, capturing the full spectrum of information literacy learning outcomes is particularly challenging.

In response to these challenges, we developed an embedded librarian model that afforded us the opportunity to teach multiple threshold concepts over the course of three to five classes. Our design is by no means perfect; we expect it to evolve with our evaluations of our experiences and the assessment data. We share our structure and experience in this chapter to inspire similar partnerships and invite improvements of our program.

DESIGNING AN EMBEDDED LIBRARIAN PROGRAM IN FIRST-YEAR WRITING CLASSES

At our institution, a regional, comprehensive public university, the instruction librarians and the director of composition designed a new voluntary program through which first-year composition instructors could include embedded librarians in their classes. Each librarian met several times with her respective composition instructor(s) to develop a plan of action for embedding information literacy into the composition curriculum; these meetings covered student needs, syllabi, possible research assignments, and class schedules.

Rather than have librarians embedded throughout the entire semester, we concentrated librarian support on research assignments, scaffolding lessons as steps in the research process and incorporating relevant threshold concepts. These steps included shorter assignments on discrete skills, such as searching for scholarly journal articles and evaluating sources for annotated bibliographies. For evaluation we used the CRAAP test (Currency, Relevance, Authority, Accuracy, Purpose; Meriam Library, 2010), and for determining how to use information we incorporated BEAM (Background, Exhibit, Argument, Method; Bizup, 2008). The three to five embedded librarian class sessions included active learning assignments in class, as well as practical homework assignments.

The focus of the embedded librarian support differed slightly for each instructor. While all classes focused on the information literacy skills listed below, each class benefited from its own tailored lessons. In short, we designed flexibility into our embedded librarian program to respond to unique student needs and instructor priorities. For example, depending on the instructor, some students had flexibility in choosing research topics, and some did not. One class was limited to topics dealing with immigration in the United States.

We began by having the students search for background information on their topics using online encyclopedia or newspaper databases so they could become more familiar with their topics. We then moved on to practicing effective search strategies across the library’s online resources, since many underclassmen are unfamiliar with databases and assume they function like search engines. Other instruction sessions covered distinguishing between primary and secondary sources, and between popular and scholarly sources; students reviewed examples of different source types and evaluated them in large-group discussions. We also provided specific instruction in evaluating websites. Finally, we taught students to cite sources using MLA style, focusing on distinguishing between periodical and book citations and on becoming familiar enough with correct MLA citations that they did not have to rely on citation generators.
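To illustrate the periodical-versus-book distinction students practiced (the author and titles below are invented for illustration), current MLA-style entries for the two source types differ mainly in their middle elements:

Book: Doe, Jane. Writing and Research. Riverbend Press, 2016.
Journal article: Doe, Jane. “Research in the Writing Classroom.” Journal of Composition, vol. 12, no. 3, 2016, pp. 45-58.

The article entry adds the container (the journal title), volume, issue, and page range; recognizing that pattern is what allows students to build citations without a generator.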

In one first-year writing section that allowed students to choose their own research paper topics, the embedded librarian designed a lesson on brainstorming research topics. She showed the students an advertisement for a hair product and used it to inspire topic ideas, such as the portrayal of women in fashion advertisements, feminism in advertising, and women’s body image and self-esteem. Using this advertisement as a shared example for the class, students searched for articles in the databases while the embedded librarian encouraged them to notice the database subject terms and the differences between popular and scholarly sources. It is important for students not only to learn the skills and concepts in these sessions, but also to apply them to their final papers and, we hope, to other classes with similar assignments. (For an example of one librarian’s progression of embedded lessons, see the Appendix.)

Because of our emphasis on building information-literate students, not just teaching discrete skills, we used a shared vocabulary of terms meaningful to both composition and information literacy. The terminology that was emphasized differed for each librarian-instructor partnership. For example, one of the composition instructors and her embedded librarian introduced students to the term exigency. This term was used in lessons to discuss the reasons authors have for writing and their resulting biases. When choosing a source for their research, students had to evaluate it, including the author’s exigency for writing, as well as their own. Finally, in a reflection following their research paper, students were asked to write about the concept of exigency and its role in their writing. Our goal in incorporating shared terminology was to reinforce threshold concepts in both fields and encourage metacognition about one’s research and writing process.

OUR ASSESSMENT DESIGN

Our assessment of information literacy learning as a result of the embedded librarian program was modeled on both the Multi-State Collaborative to Advance Learning Outcomes Assessment (MSC) program (http://www.sheeo.org/projects/) and best practices for composition assessment. The process used in this collaborative model collects data from direct measures (scoring of student essays from composition classes), and we added an indirect measure of student information literacy learning (student surveys). The direct assessment of student writing measurably demonstrates the degree to which students transfer learned concepts of information literacy to their academic research and writing. The measurement of student learning outcomes through these methods provides better contextual evidence of information literacy than standardized tests have offered.

All student research papers were de-identified for blind assessment by a panel of first-year composition instructors and embedded librarians. We read and scored a total of 45 student papers: five randomly selected papers from each of six embedded classes and three nonembedded classes (our control group), drawn from two different semesters. We used the AAC&U Information Literacy VALUE Rubric (Association of American Colleges and Universities, 2013), but we did not score for students’ ability to “Access Needed Information” because this criterion was not clearly demonstrated within the students’ papers. The rubric focused our scores on the following aspects of a student’s paper: (1) the ability to determine the extent of information needed by defining the scope of the research question/thesis/key concepts and using types of sources that relate to the paper’s focus appropriately; (2) the ability to evaluate information and its sources critically by using a variety of relevant sources with an awareness of audience and of the bias/point of view of each source; (3) the ability to use information effectively to accomplish a specific purpose; and (4) the ability to access and use information ethically and legally through proper citation and attribution.

In addition to directly assessing the information literacy skills demonstrated in student writing from different first-year writing courses, we used a survey to collect students’ self-reported growth and metacognitive reflection on their learning. This survey was given only to students in classes with embedded librarian support, so we do not yet have comparative data from nonembedded classes. The survey used a Likert scale, with scores from 1 to 5 for each question, and asked students to rate their ability to complete a variety of information literacy activities as a result of their first-year writing course with embedded librarian support. The data from the pre-instruction version of this exercise will be assessed in the semester following our writing of this chapter and will be used to inform our instructors’ lesson plans and research assignments.

We have not yet reached a conclusion as to how much overlap of curricular language, theoretical applications, and assessment methods occurs between introductory college writing and information literacy practices. Both librarians and composition faculty were encouraged by the similarities we discovered across the disciplines and intrigued by the differences; the differences suggest that there is still more to learn from one another, and the similarities confirm a shared academic goal that makes this partnership—and possible collaborations with other academic departments—promising.

FINDINGS: DIRECT AND INDIRECT MEASURES OF STUDENTS’ INFORMATION LITERACY

Direct Assessment: Student Research Papers

The scores in all four categories of the AAC&U Information Literacy VALUE Rubric were higher, by approximately 0.3 points on average, for students in classes with embedded librarian support (see Table 20.1). Students in both embedded and nonembedded composition classes scored highest in their ability to determine the extent of information needed, and lowest in the category of accessing and using information ethically and legally. Because ethically managing sources requires both a conceptual understanding of academic values and a mastery of distinct skills (summarizing, paraphrasing, quoting, choosing signal phrases, adhering to citation styles, etc.), this collection of skills proved particularly complex and challenging for students, and the rubric sets the bar high for excellence in this category. As Table 20.1 shows, our first-year students score near the 2-range across categories of the AAC&U Information Literacy VALUE Rubric, which gives a sense of how much development is needed by the time they graduate. (When students graduate, they should be demonstrating information literacy in their writing at the 4-level of the rubric.)

TABLE 20.1 Aggregate Scores of Information Literacy in Student Writing Using the AAC&U Information Literacy VALUE Rubric

| Rubric Category | Embedded Librarian Classes | Nonembedded Librarian Classes |
| Determine the Extent of Information Needed | 2.43 | 2.13 |
| Evaluate Information and Its Sources Critically | 2.08 | 1.83 |
| Use Information Effectively to Accomplish a Specific Purpose | 2.12 | 2.07 |
| Access and Use Information Ethically and Legally | 1.86 | 1.43 |
| Average Score | 2.11 | 1.80 |

Indirect Assessment: Student Surveys

Table 20.2 shows the distinct skills that students were asked to self-assess and the average scores they reported on the surveys. The survey asked students to rate their ability in each discrete skill/concept on a 5-point Likert scale, 5 being the highest score, as a result of their composition course with embedded librarian support.

TABLE 20.2 Aggregate Scores for Student Self-Reported Information Literacy Abilities

| As a result of your composition class with embedded librarian support, how well are you able to do the following, on a scale of 1 to 5? | Average Score |
| Use the library to search for a range of popular and scholarly sources? | 4.03 |
| Understand the difference between popular and scholarly sources as we discussed them in class? | 4.51 |
| Understand the difference between databases, journals, and articles? | 4.14 |
| Evaluate the credibility of a source? | 4.16 |
| Evaluate the usefulness of a source? | 4.26 |
| Put multiple sources and your own perspective “into conversation” in your writing? | 4.04 |
| Use MLA style for in-text citations? | 4.17 |
| Create a Works Cited page using MLA style? | 4.46 |
| Use online citation tools (RefWorks, EasyBib, etc.) correctly? | 4.34 |
| Find books using the library’s classification system? | 3.20 |
| Find materials in the library as a result of the tour? | 3.18 |
| Overall, how useful were the classes held in the library for your work on your research paper? | 3.92 |
| How useful do you believe the classes held in the library will be for your future classes? | 3.89 |
| Has your confidence increased for seeking out help with future research projects? | 3.82 |

The first five survey questions prompt students to reflect on their skills related to searching for and evaluating different types of sources. Our curriculum emphasizes the difference between popular and scholarly sources, and we encourage students to use a wide range of sources responsibly and for different purposes. Students rated their understanding of the difference between types of sources higher than their ability to utilize search strategies, distinguish between databases, journals, and articles, or evaluate the credibility and usefulness of a source.

The next six survey questions ask about students’ ability to synthesize sources within their writing, document their sources, and make use of the library to locate sources. Students rated their ability to document sources, and to use tools to do so, highly. They rated their ability to use the library lower than other categories, but we believe this is due to a few factors: (1) the classification system was not emphasized throughout the sessions as other skills were; (2) students used online sources more than print sources; and (3) not all classes had a tour of the library, so some students who should have responded N/A instead gave low ratings, skewing the scores. In any case, we do believe that these basic library use skills are weak for students, at least relative to their other information literacy skills.

The final three questions ask for students’ responses on their general learning and impressions of the library support in first-year writing. Because we are interested in preparing students to be information literate across courses and we know they need to transfer their knowledge from first-year writing to other contexts, we asked how useful students believe their learning would be for the future. The scores were just under 4, indicating that students believe, though not as strongly as they might, that the library support will help them in their future endeavors as students.

Discussion

Though the AAC&U rubric scores cannot be compared apples-to-apples with the Likert survey scores (the former uses a scale of 0 to 4, while the latter uses 1 to 5, and they ask about slightly different skills), it appears that students have a higher estimation of their information literacy skills than the faculty scorers do. This may indicate that students simply don’t know what they don’t know, but it may also be their attempt to represent their significant learning over the course of the semester.
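One rough way to see the size of the gap (our own back-of-the-envelope comparison, which assumes the two scales can be aligned linearly even though they measure somewhat different skills): both scales span four points, so subtracting 1 from a Likert mean yields an approximate 0-to-4 equivalent. A typical self-rating near 4.2 would map to roughly 3.2, still well above the faculty scorers’ overall average of 2.11 for embedded classes.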

In our direct assessment of student writing, we scored students’ cognition, or the thinking involved in completing a task, which we can judge from the task itself. In this indirect assessment of students’ self-reported skills, however, we also attempted to access their metacognition, or their reflection on what they know and on the efficacy and outcomes of their choices. These definitions come from Gwen Gorzelsky and her co-authors, who are undertaking a longitudinal study of students’ ability to transfer knowledge about writing, the Writing Transfer Project. One of the key points that Gorzelsky and her team make, as do others in composition studies interested in knowledge transfer, is that students are more likely to transfer skills when they can reflect on them and articulate them. We feel that this survey is a good start toward getting students to articulate what they can do and what they know.

CONCLUSION

We see two concrete outcomes from this project. First, it helps our composition faculty see where our students are and view their information literacy learning as a journey that spans their entire lives. When we know that students are ending the course just above a 2 (the lower milestone level of the AAC&U rubric), we know that our goals for student achievement should be set around that level instead of at the 4-level. If we see information literacy as a spectrum and consider students within zones of proximal development (Vygotsky, 1978), then we can present discrete research skills not as correct or incorrect, but as movement toward becoming a more responsible information user and creator.

Second, we learned that our students’ weakest area in their writing was “Accessing and Using Information Ethically and Legally,” and that we can devote more in-class support to it. Most of our instruction on these skills comes early in the writing process, and the topic is revisited later only as a matter of correct adherence to citation conventions. Once students have the other parts of their papers constructed, we should dedicate more class time to discussing the choices writers make about how they use information.

As our embedded-librarians-in-composition program continues, we strive to improve our assessment design and our support for composition instructors. One of our most recent developments is the addition of a pretest and posttest of students’ information literacy skills, so that we can better gauge student development over the course of a semester. We have also used our assessment data as a feedback loop to inform lesson plans for embedded librarian composition classes and to build a LibGuide that addresses particularly challenging aspects of information literacy, such as “Accessing and Using Information Ethically and Legally.”

It is our hope that this chapter inspires other partnerships like this one, especially ones that use our experience to improve upon our model. We recommend that programs building a similar model of embedded librarians in composition courses consider the following: collecting open-ended survey responses from students about their learning; administering pretests, posttests, and surveys in both one-shot and embedded librarian classes; and, if possible, conducting a longitudinal study of student writing and information literacy skills to better measure students’ knowledge transfer across classes.

APPENDIX. Example: Embedded Librarian Lessons Progression Over Three Composition Class Periods

1. The Research Process/Examining Source Types

Exercise 1: Rethinking the Research Process

On board: Some say happiness is about attitude; others say that happiness depends upon outside circumstances. Write a short paper on happiness and choice, using outside sources to support your argument.

“What would you have to do first, next, last?” Common response: (1) pick a side, (2) research, (3) write.

Point out that, following the directions in this order, the conclusion is drawn at step 1, so step 2 (the research) has no purpose. Introduce research as inquiry instead, and flip steps 1 and 2.

Exercise 2: Examining Source Types

Provide a popular article to half the class and a scholarly article on the same topic to the other half; students read, paying special attention to: (1) what the article is about, (2) who the intended audience is, and (3) why the article was written (its purpose).

On board, chart differences. Introduce evaluation (CRAAP method). Distinguish between “popular” and “scholarly” sources.

2. Library Resources/Searching and Finding

After the library tour: What is a database? What is the difference between the catalog and the electronic databases?

Come up with keywords around a topic; run a search; use Boolean operators; access full text and citation information. Explain abstracts and interlibrary loan. Give students independent search time.
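(A hypothetical illustration of the keyword step, not drawn from an actual session: a topic such as “social media and teen self-esteem” might yield the keywords “social media,” teenagers, and self-esteem, combined as “social media” AND (teen* OR adolescen*) AND “self-esteem,” where OR gathers synonyms, AND narrows results, and the asterisk truncates word endings.)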

3. #subjectterms, Citations as Maps

Ask about hashtags: What happens if you click one? If you misspell one? Hashtags are unforgiving, but helpful. Relate them to subject terms.

Demo a database search; pull subject terms from records. Check references for more related sources. Give resources for checking citations, and more independent search time.

REFERENCES

Association of American Colleges and Universities. (2013). Information literacy VALUE rubric. Retrieved from www.aacu.org/value/rubrics/information-literacy

Association of College and Research Libraries Board. (2015). Framework for information literacy for higher education. Retrieved from www.ala.org/acrl/standards/ilframework

Axelrod, R. B., & Cooper, C. R. (1991). The St. Martin’s guide to writing (3rd ed.). New York, NY: St. Martin’s Press.

Bizup, J. (2008). BEAM: A rhetorical vocabulary for teaching research-based writing. Rhetoric Review, 27(1), 72–86.

Bullock, R. H. (2013). The Norton field guide to writing (3rd ed.). New York, NY: W. W. Norton.

Council of Writing Program Administrators, National Council of Teachers of English, & National Writing Project. (2011). Framework for success in postsecondary writing. Retrieved from wpacouncil.org/framework

Gorzelsky, G., Driscoll, D. L., Paszek, J., Hayes, C., & Jones, E. (2016). Cultivating constructive metacognition: A new taxonomy for writing studies. In C. Anson & J. Moore (Eds.), Critical transitions: Writing and the question of transfer (pp. 217–249). Fort Collins, CO: WAC Clearinghouse.

Graff, G., Birkenstein, C., & Durst, R. K. (2009). “They say/I say”: The moves that matter in academic writing. New York, NY: W. W. Norton.

Lockhart, J. (2014). Using item analysis to evaluate the validity and reliability of an existing online information literacy skills assessment instrument. South African Journal of Libraries & Information Science, 80(2), 36–45. https://doi.org/10.7553/80-2-1515

Lunsford, A. A., Ruszkiewicz, J. J., & Walters, K. (2016). Everything’s an argument (7th ed.). Boston, MA: Bedford.

Meriam Library. (2010). Evaluating information: Applying the CRAAP test. Retrieved from http://www.csuchico.edu/lins/handouts/eval_websites.pdf

Radcliff, C., Cunningham, A., Hannon, R. H., Harmes, C., & Wiggins, R. (2018). The test. Threshold Achievement. Retrieved from https://thresholdachievement.com/the-test

Ramage, J. D., Bean, J. C., & Johnson, J. (2009). The Allyn and Bacon guide to writing (5th ed.). New York, NY: Pearson Longman.

Reid, S. (2011). The Prentice Hall guide for college writers (9th ed.). Boston, MA: Prentice Hall.

Schulte, S. (2012). Embedded academic librarianship: A review of the literature. Evidence-Based Library and Information Practice, 7(4), 122–138. Retrieved from ejournals.library.ualberta.ca/index.php/EBLIP/article/view/17466

Smit, D. (2004). The end of composition studies. Carbondale, IL: Southern Illinois University Press.

Vygotsky, L. (1978). Interaction between learning and development. In Mind in society (pp. 79–91). Cambridge, MA: Harvard University Press.

Yancey, K. B., Robertson, L., & Taczak, K. (2014). Writing across contexts: Transfer, composition, and sites of writing. Logan, UT: Utah State University Press.