jpom + inls890121 + reading   41

Lib-Value Workshop - Bruce Kingma - YouTube
Methodological considerations from an economics perspective by Bruce Kingma, Associate Provost for Entrepreneurship and Innovation at Syracuse University

Return on Investment (ROI) - Lib-Value Workshop
June 26, 2010
Gelman Library
George Washington University
Washington DC

Lib-Value is a three-year study, funded by a grant from IMLS, with the ultimate goal of understanding how to measure value and return on investment (ROI) in all aspects of academic libraries. The workshop is a collaborative effort of ARL with the University of Tennessee, University of Illinois at Urbana-Champaign, and George Washington University.
inls890121  reading  economics  libraries  value  academic  ROI  evaluation 
march 2012 by jpom
LibQUAL+® Webcast
A workshop designed to provide potential and current participants with vital information on the LibQUAL+® service. This one-hour webcast provides practical information on administering a survey, helps participants interpret the data and its analysis, and shares best practices in using the results.

Key members of the LibQUAL+® team, Martha Kyrillidou and David Green, hosted the webcast. Our guest presenters were:
- Sandra Phoenix, Executive Director of the HBCU Library Alliance,
- Carla Stoffle, Dean, University Libraries and Center for Creative Photography, University of Arizona, and
- Chestalene Pintozzi, Director of Project Management & Assessment, University of Arizona
inls890121  reading  LibQUAL  ARL 
march 2012 by jpom
AEA: Guiding Principles for Evaluators
Evaluation is a profession composed of persons with varying interests, potentially encompassing but not limited to the evaluation of programs, products, personnel, policy, performance, proposals, technology, research, theory, and even of evaluation itself. These principles are broadly intended to cover all kinds of evaluation. For external evaluations of public programs, they nearly always apply.  However, it is impossible to write guiding principles that neatly fit every context in which evaluators work, and some evaluators will work in contexts in which following a guideline cannot be done for good reason. The Guiding Principles are not intended to constrain such evaluators when this is the case. However, such exceptions should be made for good reason (e.g., legal prohibitions against releasing information to stakeholders), and evaluators who find themselves in such contexts should consult colleagues about how to proceed.
inls890121  principles  evaluation  reading 
march 2012 by jpom
SUSHI Reports Registry - National Information Standards Organization
The SUSHI Reports Registry provides a listing of the standard report names and releases for COUNTER reports that should be used when implementing the schema. It also includes a registry of non-COUNTER reports that have been developed to work with the SUSHI protocol.
inls890121  reading  COUNTER  SUSHI  report 
february 2012 by jpom
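As an illustration of how the registry is used in practice: a SUSHI client identifies a COUNTER report by its registered report name and release when building a GetReport request. The sketch below is a hypothetical Python helper, not a reference implementation; the element and namespace names follow our reading of the NISO SUSHI (Z39.93) schema, and the requestor/customer IDs are invented, so check both against the registry and the standard before relying on them.

```python
# Hypothetical sketch of a SUSHI GetReport request body. The report
# Name/Release values ("JR1", "4") are standard COUNTER identifiers of
# the kind listed in the SUSHI Reports Registry; the namespace is our
# reading of the Z39.93 schema, and the IDs below are made up.
SUSHI_REQUEST_TEMPLATE = """\
<ReportRequest xmlns="http://www.niso.org/schemas/sushi">
  <Requestor><ID>{requestor_id}</ID></Requestor>
  <CustomerReference><ID>{customer_id}</ID></CustomerReference>
  <ReportDefinition Name="{report}" Release="{release}">
    <Filters>
      <UsageDateRange>
        <Begin>{begin}</Begin>
        <End>{end}</End>
      </UsageDateRange>
    </Filters>
  </ReportDefinition>
</ReportRequest>
"""

def build_report_request(report="JR1", release="4",
                         requestor_id="example-requestor",
                         customer_id="example-customer",
                         begin="2012-01-01", end="2012-01-31"):
    """Return the XML body for a SUSHI GetReport call (sketch only)."""
    return SUSHI_REQUEST_TEMPLATE.format(
        report=report, release=release, requestor_id=requestor_id,
        customer_id=customer_id, begin=begin, end=end)
```

The point of the registry is exactly the `Name`/`Release` pair here: both client and vendor must agree on the registered strings for a request to resolve to the right COUNTER report.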
Code of Practice for Journals and Databases, Appendix E: Auditing Requirements and Tests
The COUNTER auditing requirements are needed to ensure that the usage reports provided by vendors are in line with the COUNTER principles of credibility, consistency and compatibility. For this purpose COUNTER has defined specific audit test-scripts for each of the COUNTER required usage reports. Because most vendors will work with their own auditor, the test-scripts guarantee that every auditor follows an identical auditing procedure and measures results in the same way.
inls890121  reading  COUNTER 
february 2012 by jpom
The COUNTER Code of Practice for e-Resources: Draft Release 4
COUNTER provides an international, extendible Code of Practice for E-resources that allows the usage of online information products and services to be measured in a credible, consistent and compatible way using vendor-generated data. Release 4 is an integrated Code of Practice covering journals, databases, books, reference works and multimedia content. It replaces both Release 3 of the Code of Practice for Journals and Databases and Release 1 of the Code of Practice for Books and Reference Works. The deadline for its implementation is 31 December 2013. After this date only those vendors compliant with Release 4 will be considered to be COUNTER compliant. Release 4 contains the following new features...
inls890121  reading  COUNTER 
february 2012 by jpom
Where We’ve Been and Where We’re Going: Experts Reflect and Look Ahead / Reflecting on the Past and Future of Evaluation / The Evaluation Exchange. Read the section: On Evaluation Use: Evaluative Thinking and Process Use, by Michael Quinn Patton.
We asked six experts to reflect on their areas of expertise in evaluation and respond to two questions: (1) Looking through the lens of your unique expertise in evaluation, how is evaluation different today from what it was 10 years ago? and (2) In light of your response, how should evaluators or evaluation adapt to be better prepared for the future?
inls890121  reading  evaluation  evaluative_thinking 
february 2012 by jpom
Studying Students: The Undergraduate Research Project at the University of Rochester. Read ch. 4
Our first task was to identify one trenchant research question to guide the project. The question we developed was, What do students really do when they write their research papers? Between the assignment of a research paper and the finished, submitted product was a black box that largely concealed the processes undertaken by the student. We wanted to take a peek into that box to see what we could find. We felt that this question accurately reflected our ignorance of student work habits while providing a manageable focus for our information-gathering activities. We took a general approach, avoiding presuppositions. We wanted to begin our project by exploring students’ practices; we did not set out to prove a point. Our initial aim was to be able to describe in detail how students actually write their research papers. This would enable the library staff to develop new ways to help students meet faculty expectations for research papers and become adept researchers.
inls890121  inls490121  academic  reading 
february 2012 by jpom
Matthews, Library Assessment in Higher Education, ch. 8
The development of a library assessment plan requires a substantial commitment on the part of the library director. The director must provide strong leadership since assessment will require resources -- money for staff to be released from their regular duties, money for surveys and other data collection efforts, and so forth. Even if the staff are excited about assessment, it will be difficult, if not impossible, to develop an assessment plan and see it through to fruition without the wholehearted support of the director and other top administrative staff.
inls890121  reading  highered  libraries  academic  assessment  evaluation  university 
january 2012 by jpom
Yarbrough, et al. (2011). The program evaluation standards: A guide for evaluators and evaluation users
Yarbrough, et al. (2011). The program evaluation standards: A guide for evaluators and evaluation users. Thousand Oaks, CA: SAGE.
* Introduction
* Applying the Standards
* The Functional Table of Standards
inls890121  reading  program  evaluation  standards 
january 2012 by jpom
Program Evaluation Standards Statements « Joint Committee on Standards for Educational Evaluation
In order to gain familiarity with the conceptual and practical foundations of these standards and their applications to extended cases, the JCSEE strongly encourages all evaluators and evaluation users to read the complete book, available for purchase at http://www.sagepub.com/booksProdDesc.nav?prodId=Book230597& and referenced as follows:

Yarbrough, D. B., Shulha, L. M., Hopson, R. K., and Caruthers, F. A. (2011). The program evaluation standards: A guide for evaluators and evaluation users (3rd ed.). Thousand Oaks, CA: Sage
inls890121  program  evaluation  standards  reading  JCSEE 
january 2012 by jpom
Duke University | LINK: Assessment
In August 2008, the Duke Teaching and Learning Center “Link” opened in the Perkins Library Lower Level. The technology-enhanced classrooms and group study spaces have been particularly designed to accommodate and encourage collaborative work and project-based learning activities. This project represents the culmination of years of planning and has been influenced by several recently renovated prototype spaces. As Duke prepares to undertake a significant amount of classroom construction and renovation over the next decade, this project represents a significant opportunity for evaluation and assessment to inform the many academic space planning decisions that lie ahead. 
inls890121  reading  Duke  Link  evaluation  space 
november 2011 by jpom
Davidson, Evaluation Methodology Basics
Davidson, E. J. (2004). Evaluation Methodology Basics: The Nuts and Bolts of Sound Evaluation. Thousand Oaks, Ca: SAGE Publications. Chapters 1, 2, & 3.
inls490121  inls890121  reading  evaluation 
september 2011 by jpom
Cornell University Library value calculations
How do we quantify... the contributions the library makes in return to the university? Research libraries are not used to assigning a monetary value to the use of their collections, services and expertise, although public libraries have been moving in this direction in the past few years. Borrowing some of the methods public libraries use, RAU has calculated dollar values for some core library transactions. This is only an illustration and is by no means an exhaustive list of the ways the library contributes to the university. It cost $56,678,222 to maintain Cornell’s 20 libraries in 2008/2009. If CUL did not exist, the university would have had to pay $90,648,785 last year to secure services that are comparable to the use that the Cornell community makes of the library.
inls890121  ROI  academic  libraries  evaluation  reading 
september 2011 by jpom
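The two dollar totals in the Cornell entry imply a simple benefit-to-cost ratio. A minimal sketch of that arithmetic in Python (the figures come from the entry above; the helper name is ours, not Cornell's):

```python
def benefit_cost_ratio(benefit, cost):
    """Return gross ROI: dollars of value delivered per dollar spent."""
    return benefit / cost

# Figures from the Cornell report: 2008/09 operating cost vs. the
# estimated cost of securing comparable services without CUL.
cost = 56_678_222
replacement_value = 90_648_785

ratio = benefit_cost_ratio(replacement_value, cost)
# roughly 1.6 dollars of value returned per dollar spent
```

This is the same "replacement value" logic public libraries have used in taxpayer-ROI studies; it values the library at what comparable services would cost, not at outcomes achieved.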
A Conceptual Framework for the Holistic Measurement and Cumulative Evaluation of Library Services
This conceptual piece presents a framework to aid libraries in gaining a more thorough and holistic understanding of their users and services. Through a presentation of the history of library evaluation, a measurement matrix is developed that demonstrates the relationship between the topics and perspectives of measurement. These measurements are then combined through evaluation criteria, which different participants in the library system use in their decision-making. By implementing this framework for holistic measurement and cumulative evaluation, library evaluators can gain a more holistic knowledge of the library system and library administrators can be better informed in their decision-making.
inls890121  reading  evaluation  theory  framework  perspective 
september 2011 by jpom
Evaluability assessment: a primer. Trevisan, Michael S. & Yi Min Huang
Conducting evaluations of programs that are useful to decision makers is the hallmark of successful evaluation. Appropriate program implementation and operation are critical to this work. A strategy that can be used to determine the extent to which a program is ready for full evaluation is known as evaluability assessment. Initially developed by Wholey (1979), evaluability assessment (EA) seeks to gain information from important documents and input from stakeholders concerning the content and objectives of the program. Outcomes from EA include clear objectives, performance indicators, and options for program improvement. Wholey (1979) recommended EA as an initial step to evaluating programs, increasing the likelihood that evaluations will provide timely, relevant, and responsive evaluation findings for decision makers.
inls890121  reading  evaluation  evaluability  program 
september 2011 by jpom
Creating a Culture of Assessment: A Catalyst for Organizational Change
In the rapidly changing information environment, libraries have to demonstrate that their services have relevance, value, and impact for stakeholders and customers. To deliver effective and high quality services, libraries have to assess their performance from the customer point of view. Moving to an assessment framework will be more successful if staff and leaders understand what is involved in organizational culture change. This paper describes the new paradigm of building a culture of assessment, and places it in the framework of organizational culture change, utilizing a learning organization and systems thinking approach.
inls890121  reading  culture  assessment  libraries  services  stakeholders 
september 2011 by jpom
Program Assessment in Academic Libraries: An Introduction for Assessment Practitioners
Although academic libraries have a long tradition of program assessment, in the past the results have been more meaningful internally than externally. Recent changes in the conceptualization of libraries’ role in higher education and advances in measurement tools will likely provide answers to different questions, particularly the relationship of library services and resources to student learning and success.
inls890121  reading  program  assessment  evaluation  academic  libraries  filetype:pdf  media:document 
september 2011 by jpom
SPEC Kit 303: Library Assessment
Read: Executive Summary & Survey Questions and Responses. Survey results indicate that while a modest number of libraries in the 1980s and earlier engaged in assessment activities beyond annual ARL statistics gathering, the biggest jump in activity occurred between 1990 and 2004. The overwhelming majority of responses indicate the impetus was service driven and user centered and came from within the library itself rather than from an outside source. Respondents’ top impetus for beginning assessment activities (63 respondents or 91%) was the desire to know more about their customers. Based on responses to a question about their first assessment activities, over half began with a survey, almost all of which were user surveys.
inls890121  reading  evaluation  evaluability  program  assessment  ARL 
september 2011 by jpom
Taxpayer Return on Investment in Florida Public Libraries: Summary Report September 2004
Read Part II: Detailed Results & Study Methods. Drill down into the parts of Part 4, Methods, that interest you.
inls890121  ROI  evaluation  FL  reading 
september 2011 by jpom
A Multi-Dimensional Framework for Academic Support: Final Report
The University of Minnesota Libraries received support from the Andrew W. Mellon Foundation to develop a multi-dimensional model for assessing support for scholarship in the context of a large research campus. The project team explored discipline-specific needs for facilities, information content, services, tools, and expertise in the humanities and social sciences. The goal was to develop a model for bringing greater coherence to these distributed resources through physical and virtual means, and also a research support environment that could be modeled, prototyped, and evaluated. The study is also being used to assist the academic leadership in understanding how libraries can promote the physically boundless nature of inquiry and information use.
inls890121  reading  academic  libraries  research  services  development  design 
september 2011 by jpom
The Balanced Scorecard: Measures That Drive Performance
During a year-long research project with 12 companies at the leading edge of performance measurement, we devised a "balanced scorecard": a set of measures that gives top managers a fast but comprehensive view of the business. The balanced scorecard includes financial measures that tell the results of actions already taken. And it complements the financial measures with operational measures on customer satisfaction, internal processes, and the organization's innovation and improvement activities, the operational measures that are the drivers of future financial performance.
inls890121  reading  balanced_scorecard  evaluation  corporation  management  filetype:pdf  media:document 
september 2011 by jpom
Research Library Issues, no. 271 (Aug. 2010): Special Issue on Value in Libraries: Assessing Organizational Performance
- Library Value May Be Proven, If Not Self-Evident
- A Decade of Assessment at a Research-Extensive University Library Using LibQUAL+®
- LibQUAL+® and the “Library as Place” at the University of Glasgow
- Service Quality Assessment with LibQUAL+® in Challenging Times: LibQUAL+® at Cranfield University
- ARL Profiles: Qualitative Descriptions of Research Libraries in the Early 21st Century
- The ARL Library Scorecard Pilot: Using the Balanced Scorecard in Research Libraries
- Lib-Value: Measuring Value and Return on Investment of Academic Libraries
- The Value of Electronic Resources: Measuring the Impact of Networked Electronic Services (MINES for Libraries®) at the Ontario Council of University Libraries
inls890121  reading  ARL  libraries  value  LibQUAL 
september 2011 by jpom
Duke Libraries > Library Assessment > User Studies Initiative
Goals:
- Increase knowledge and skills of library staff about social science research methods and best practices for studying user behavior
- Provide a forum for discussing important findings from major studies of library user behavior and implications for our services
- Foster collaboration among librarians to conduct user studies
- Build a support structure and network for librarians interested in conducting user studies
- Support at least one user study that results in a report suitable for publication via the library web site and/or local event by June 2010
inls890121  reading  duke  assessment 
september 2011 by jpom
Balanced Scorecard, UVa Library
Read the following sections: Metrics for 2007-2009 Results > Longitudinal Results 2002-2009 > both Results & Pie Charts
inls890121  reading  balanced_scorecard  evaluation  academic  libraries 
september 2011 by jpom
Matthews, The Evaluation and Measurement of Library Services, Ch 2
Matthews, J. R. (2007). The Evaluation and Measurement of Library Services. Westport, CT: Libraries Unlimited.
inls890121  reading  evaluation  libraries  services  textbook 
september 2011 by jpom
Duke Libraries > Library Assessment > Web Assessment Projects
Below is a list of proposed and active web assessment projects designed to improve the experience of our patrons when using Duke's library resources and tools.
inls890121  reading  duke  assessment  web 
september 2011 by jpom
E-Metrics Instructional System
Read the following sections on the EMIS site:
- Introduction to EMIS
- Introduction to e-metrics
- E-Metric Selection
- E-Metric Data Collection
- Any others that strike your fancy
metrics  inls890121  reading  libraries  services 
september 2011 by jpom
Measuring the Quality of Library Service through LibQUAL+
In: Academic Library Research: Perspectives and Current Trends, M. Radford and P. Snelson, eds. Chicago: Association of College and Research Libraries. 253-301.
inls890121  reading  LibQUAL  filetype:pdf  media:document 
september 2011 by jpom
Utilization-Focused Evaluation, ch.2
Patton, M. Q. (2008). Utilization-Focused Evaluation. Thousand Oaks: Sage Publications. Ch. 2.
inls890121  reading  utilization-focused  evaluation 
september 2011 by jpom
Proceedings of the 2008 Library Assessment Conference
Read the Plenary Session papers: Keynote Panel and Reaction intro, Gibbons, Luce, Wilson, Rapp, & Town
inls890121  reading  assessment  filetype:pdf  media:document 
september 2011 by jpom
A content analysis on the use of methods in online user research
Purposeful data results from an expressed purpose in combination with an adequate method. Data gathering is an essential part of online user studies, and every method has its areas of application and its limitations: quantitative surveys are limited in their ability to detect causal relations; with qualitative interviews, broad generalizations are risky. In library and information science, user research is a domain in which we gather large amounts of data. But is our data really "purposeful"? As early as 1972, Frank Heidtmann (pp. 36-37) criticized our use of inadequate research techniques and argued that, regardless of their appropriateness, these techniques are applied in an inaccurate and invalid way.
inls890121  methodology  content_analysis  reading  filetype:pdf  media:document 
september 2011 by jpom
IMLS - Outcome Based Evaluation Overview
IMLS defines outcomes as benefits to people: specifically, achievements or changes in skill, knowledge, attitude, behavior, condition, or life status for program participants (“visitors will know what architecture contributes to their environments,” “participant literacy will improve”). Any project intended to create these kinds of benefits has outcome goals. Outcome-based evaluation, “OBE,” is the measurement of results. It identifies observations that can credibly demonstrate change or desirable conditions (“increased quality of work in the annual science fair,” “interest in family history,” “ability to use information effectively”). It systematically collects information about these indicators, and uses that information to show the extent to which a program achieved its goals.
inls890121  reading  outcome-based  evaluation  OBE  IMLS 
september 2011 by jpom
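In code terms, the OBE pattern IMLS describes boils down to collecting indicator observations and reporting the extent to which they meet a target. A toy Python sketch with entirely invented data (the program, indicator, scores, and target are all hypothetical, chosen only to mirror the "participant literacy will improve" example above):

```python
def outcome_attainment(observations, target):
    """Fraction of participants whose indicator score meets the target."""
    met = sum(1 for score in observations if score >= target)
    return met / len(observations)

# Invented post-program literacy scores for 8 participants; the
# program's (hypothetical) target score is 70.
post_scores = [62, 71, 85, 90, 68, 77, 73, 74]
rate = outcome_attainment(post_scores, target=70)
# rate -> 0.75, i.e. 75% of participants reached the outcome goal
```

The indicator ("literacy score") and the target are the parts OBE asks you to define up front; the systematic collection and the attainment summary are what turn goals into measured results.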
ACRL Value of Academic Libraries Report
The ACRL publication Value of Academic Libraries: A Comprehensive Research Review and Report is a review of the quantitative and qualitative literature, methodologies and best practices currently in place for demonstrating the value of academic libraries, developed for ACRL by Megan Oakleaf of the iSchool at Syracuse University. The primary objective of this comprehensive review is to provide academic librarians with a clearer understanding of what research about the performance of academic libraries already exists, where gaps in this research occur, and to identify the most promising best practices and measures correlated to performance.
inls890121  value  academic  libraries  evaluation  assessment  ACRL  reading 
september 2011 by jpom
