
Information Literacy

The Information Literacy Learning Community formed in Fall 2016 and has met seven times. The community is working on the following:

  • finalizing a definition of information literacy
  • reviewing rubrics to use for future assessment
  • selecting potential courses for assessment in upper-division GE Areas C4 and D5
  • organizing two Information Literacy faculty workshops led by Megan Oakleaf in January 2016

Assessment 2010-2012

Excerpts from the WASC EER Report (PDF)

The ULO Project on Lifelong Learning began in Spring 2010, when Kennedy Library conducted a survey of student information skills in consultation with the ULO Lifelong Learning Committee. Information skills are a foundational component of lifelong learning, and they contribute to other ULOs including written and oral communication.  

Method 

The survey was designed to identify student competencies by measuring performance on the Information Literacy Learning Objectives, which the library established in 2009. The survey presented students with a research scenario and asked them to respond to a series of 20 questions. Two versions were administered during a one-month period: one for lower-division and one for upper-division students. The versions differed by the order in which questions were asked and the wording of some questions. 

Invitations to participate were emailed to 1,332 lower-division and 2,905 upper-division students. In addition, an open invitation was posted on the library website, and instructors who had previously brought students for library instruction were encouraged to announce the survey to current students. Approximately 98% of the responses came from the email invitations. Without adjusting for the remaining 2%, the lower-division response rate was 28% (367 respondents) and the upper-division response rate was 20% (578 respondents). The high response rate likely resulted from the promise of cash prizes; however, not all respondents answered all questions.
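The reported rates follow directly from these counts; as a quick check of the arithmetic (Python used here purely for illustration):

```python
# Response-rate arithmetic from the counts reported above.
# All numbers are taken directly from the survey description.
invited_lower, responded_lower = 1332, 367
invited_upper, responded_upper = 2905, 578

rate_lower = responded_lower / invited_lower
rate_upper = responded_upper / invited_upper

print(f"Lower-division: {rate_lower:.0%}")  # Lower-division: 28%
print(f"Upper-division: {rate_upper:.0%}")  # Upper-division: 20%
```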

Results

Figure 1.10 (PDF) presents the mean scores in terms of percent correct for five questions for which there was a single response. A statistical analysis was conducted to determine whether the correct response to each item was related to Class Level and Instruction; the latter factor distinguished between students who had and had not received library instruction in research methods. 

In all cases, upper-division students performed better than lower-division students. For three of the five items (thesis statement/promising research question, correct identification of the citation example, and correct selection of the search term that would yield the fewest results), Class Level had a significant effect, demonstrating value added. Class Level had a marginal effect on correct selection of the search term that would yield the most results. Significant effects of Instruction were found for the thesis statement and for correct identification of the citation example. The question on the ethical use of ideas showed no significant effect of either Class Level or Instruction. Across all analyses, no significant interactions between the variables were present.
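The report does not name the specific statistical test used. As an illustration only, a chi-square test of independence is one common way to ask whether item correctness is related to a single factor such as Class Level; the sketch below uses invented counts, not the survey's actual data (those appear in Figure 1.10 and Appendix 1.1):

```python
# A minimal sketch of testing whether one factor (e.g. Class Level) is
# related to item correctness, via a chi-square test of independence.
# The counts below are hypothetical, for illustration only.

def chi_square_2x2(table):
    """Chi-square statistic for a 2x2 contingency table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, observed in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = lower/upper division, cols = correct/incorrect.
observed = [[180, 187], [400, 178]]
print(round(chi_square_2x2(observed), 1))  # prints 38.5
```

With 1 degree of freedom, a statistic above 3.84 would indicate a significant association at the 0.05 level.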

The full statistical analysis appears in Appendix 1.1 of the WASC Student Learning Report (PDF).

The results demonstrate value added across several items on the survey, indicating higher levels of information literacy at the upper-division level. In addition, promising results for the educational effectiveness of library-related instruction were found: lower-division students who attended such instruction consistently scored almost as well as upper-division students who had not. It should be noted that the outcomes measured in this scenario-based questionnaire necessarily focused on the means of finding and identifying information rather than on the more complex evaluative and synthetic skills associated with the critical-thinking aspects of information literacy.

Other Information on Assessment One

Summary of Lifelong Learning Committee Activities

2008-2009
  • Assembled a committee to define lifelong learning and develop the learning outcomes and lifelong learning rubric
Definition of Lifelong Learning

Lifelong Learning (LLL) comprises the foundational attributes that enable graduates to navigate a constantly changing world in which success is highly contingent upon the continuous intake, evaluation, and deployment of information. Lifelong learning encompasses formal [1], non-formal [2], and informal [3] learning, whether intentional or unanticipated, that occurs at any point after a student has graduated. Many attributes characterize the lifelong learner, and these are best developed during the undergraduate and graduate years.

Terms: 
[1] Formal learning is deliberate, structured, and leads to the acquisition of a formal certificate.
[2] Non-formal learning is deliberate but does not take place in a classroom setting. The emphasis is on the acquisition of a skill rather than a formal certificate.
[3] Informal learning is not deliberate and usually occurs during conversation or observation, where one picks up not facts but another person's attitudes, perceptions, or values.

Lifelong Learning Rubric

Defining Beginning, Developing, and Mastery Skills in Lifelong Learning (PDF)

2009-2010
  • Integrated the basic information and critical thinking skills (attribute 2) into GE Area A: Communication A1 & A3 lower-division courses
  • Administered online surveys to determine the effectiveness of the information and critical thinking skills taught through online and classroom instruction
  • Received feedback from faculty to assess students' understanding and use of information resources and critical thinking in their class research project(s)
  • Administered both lower- and upper-division scenario-based assessments to determine the level of student information and critical thinking skills

2010-2011
  • Administered online surveys to determine the effectiveness of the information and critical thinking skills taught through online and classroom instruction
  • Received feedback from faculty to assess students' understanding and use of information resources and critical thinking in their class research project(s)
2011
  • Information Skills Assessment Reports
