Monthly Archives: March 2015

ACRL 2015: New models for new roles: creating liaison organizational structures that support modern priorities

Johns Hopkins – Creating the “academic liaison dept”

  • Don’t need to tear it all down to rebuild
  • Research support, instruction, reference, and collections were the common base — one thing JMU hasn’t clearly included so far is research support
  • Used talent management dept to help review the dept and rebuild the team 
  • Re-evaluate research: many students use Wikipedia, Google, etc. to answer the questions that used to come to the reference desk; instead, questions are shifting to “how do I do my research?”
  • They hired a UX person (not a librarian, but an urban anthropologist)

Villanova: creating the academic integration team 

  • Librarians would apply to teams based on interest
  • From depts to teams 
  • Added more subject librarians
  • Challenges related to accountability and time division for team work
  • Minimum team assignment is 20%
  • Have a tech team to help create ideas, but need more assessment help

UNC Greensboro

  • Benchmark with other libraries
  • Moved to a team-based model
  • Also have functional teams (collections, instruction, reference desk, scholarly communications); the choice between these and the liaison teams can change each year
  • Accomplishments: more training for each other, improved collaboration and communication across depts, curriculum mapping for info lit
  • Have annual retreat 
  • Has a document for liaison roles and expectations 
  • Recommendations – buy-in, align with university and library goals, tell your story, engage with the community

Questions

  • Assessment: READ Scale, time spent, repeat interactions; we don’t have that documented in this way

ACRL 2015: IL instruction papers

Strategic cartography: visualizing information literacy intersections across the curriculum

  • We are no longer the center of our institutions; instead we are one piece of a very complex institution with different subcultures, and we have to move into those subcultures
  • Knowledge spans from explicit to tacit
  • Mapping done by the library provides new insight at the bird’s eye view level
  • Curriculum mapping — look at external sources (catalogs, schedules, websites), librarians annotate, then share map with faculty to reflect and discuss
  • Mapping is a consistent process and requires frequent updating
  • Course progression is less clear in the Humanities, so acknowledge this and roughly estimate the areas where students are likely to cluster

Patterns in information literacy instruction: what’s really going on in our classrooms?

  • 88% of sessions focused on finding resources, particularly articles (43% via databases, 26% Google Scholar, 30% discovery layer)
  • We tend to keep focusing on introducing resources at all levels instead of expanding into new topics, though this may reflect the evolving use of disciplinary research

The whole mix: instructional design, students, and assessment in blended learning

  • Students didn’t see relevance of evidence based medicine
  • Students picked a database to search but didn’t clarify a reason why they chose one database over another
  • Used the ADDIE model (Analysis, Design, Development, Implementation, Evaluation) to redesign the course

ACRL 2015: Promoting Data Literacy at the Grassroots: teaching and learning with data in the undergraduate classroom

Origins

  • Reference (ex: what is the GDP of France?) – locate, access, cite
  • Data mgmt – project plan, metadata, storage/sharing

So what instruction needs to occur between data reference and data management?

Pedagogical models

  • Statistical literacy – exploratory data analysis (Cobb & Moore, 1997)
  • History teaching – heuristics of reading primary sources (Wineburg, 2001)

3 lesson plans

  • Discover data through literature
  • Evaluating data sets
  • Research design

Data through literature

  • ICPSR, etc. serve as dataset sources that faculty give to students to analyze
  • File format access and discoverability are challenges to data reuse
  • Ex: in a sociological research methods course, students would find data from a citation, understand the value of secondary data analysis, see the relationship between data and literature, and use documentation to evaluate a dataset
  • Built upon existing ICPSR module http://www.icpsr.umich.edu/icpsrweb/instructors/edrl/index.jsp

Evaluating datasets

  • Health and populations course about demography:
  • Students will understand the complex web of data products, use documentation to evaluate a dataset, articulate how variables relate to research questions
  • Focus on structures that produce data (in this case, gov and orgs)
  • Who cares about this topic to collect data? Who has authority/resources to collect data?
  • What can the data show? Trend, disparity, comparison, spatial pattern
  • How is data collected?
  • Use data that can be viewed and manipulated online (vs downloading) — ex: World Bank

Operationalizing/Research Design

  • This happens after learning about SPSS
  • Students will identify potential collectors and disseminators of data, describe accessibility issues associated with data sources, and operationalize a research question in order to develop a data search strategy
  • Who collects data and can I access it?



CFI: Creating aligned assignments

  • Intentionality
    • NOT saying students will get it later, or that grading is assessment
    • IS reviewing the syllabus and discussing it together
      • mini-evaluation a third of the way through the course; review the outcomes multiple times throughout the course
      • how many of you are X majors? would you be willing to share with the class your view of what someone with a bachelor’s degree in X should know and be able to do? — this parallels the conversations/presentations about different roles
      • bookend classes with an overview of what to do at the beginning and what to do at the end
  • NILOA (National Institute for Learning Outcomes Assessment) and curriculum mapping
    • NILOA Mission: discover and disseminate effective use of assessment data to strengthen undergraduate education and support institutions in their assessment efforts
    • Provosts consider classroom-based assessment, national student surveys, and rubrics (ex: AAC&U VALUE rubrics – https://www.aacu.org/value-rubrics)
    • growth in different types of measures (employer surveys, rubrics, external performance assessment, portfolios)
    • how do we ensure alignment between assignments and a given learning outcome for a course?
    • curriculum mapping
      • generally, a two-dimensional matrix representing courses on one axis and outcomes on the other (see the sketch after this list)
      • would multiple people map the outcomes the same way?
      • national level examples:
      • what else can be mapped?
        • spatial elements: GIS Communication
        • content
        • structure
        • course-taking patterns
        • assignment timing
    • creating value is better done by helping students discover the value than by telling them the value
    • many of the components of the DQP (Degree Qualifications Profile) match other conversations in CHBS, including a more global perspective on health, integrating ethical reasoning into education, etc.
    • what are you asking students to do or demonstrate in an assignment?
    • what is the role of feedback in assignment design? what is the timing of feedback?
    • involve students in developing evaluation/grading rubrics?
    • how do we integrate and allow students to apply learning?
      • ex: after defining different roles in healthcare, ask who do they think has most interaction with EHRs, policy, etc.
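
A minimal sketch of what such a course-by-outcome matrix could look like in code; the course numbers, outcome labels, and “I/R/M” levels below are made-up placeholders, not anything presented in the session:

```python
# Hypothetical curriculum map: courses on one axis, outcomes on the other.
# "I" = introduced, "R" = reinforced, "M" = mastered/assessed (placeholder levels).

outcomes = ["Info literacy", "Ethical reasoning", "Data analysis"]

curriculum_map = {
    "HTH 100": {"Info literacy": "I", "Ethical reasoning": "I", "Data analysis": ""},
    "HTH 300": {"Info literacy": "R", "Ethical reasoning": "R", "Data analysis": "I"},
    "HTH 480": {"Info literacy": "M", "Ethical reasoning": "R", "Data analysis": "M"},
}

# Print the matrix so gaps (outcomes never reinforced or mastered) stand out.
print("Course".ljust(10) + "".join(o.ljust(20) for o in outcomes))
for course, row in curriculum_map.items():
    print(course.ljust(10) + "".join((row[o] or "-").ljust(20) for o in outcomes))
```

Even a toy grid like this makes the mapping questions above concrete: two people filling it in can compare whether they coded the same course the same way, and empty cells show where an outcome is never picked up.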

Misc ideas

  • have students read about how to work as a team/collaborate – perhaps provide case scenarios and ask who should be involved and why; this would follow the presentation regarding different professional roles in health care
  • for group work, have students document the roles they play in the group over time; this could give me a chance to track common leaders or followers and encourage others to try new roles. If students meet in groups outside of class, have them provide a summary of the meeting