2018-04-11 Workshop Study Group
See bottom for the original email outlining the scope of this meeting.
Attendees
Tyler Richards, Justin Millar, Francois Michonneau, Tim Young, Matt Collins, Plato Smith, Dan Maxwell, Natalie Ridgewell, Deb Paul (remote)
Agenda
Introductions!
Brainstorm a list of data sources that might be relevant to assessing whether student-led learning leads to improved outcomes
Original list:
- Attendance lists at DSI (dsiufl.org) student organization workshops, socials, officers, general body meetings, etc.
- Possibly attendance from other student organizations if we ask them
- Attendance at Carpentry workshops
- Pre and post Carpentry workshop surveys
- Attendance and involvement in meetups around R or other topics
- Registrations from UFII/UFBI/Library symposia and talks
- Complete academic records for students (grades, courses, majors, surveys)
- Publication and other academic productivity from faculty
- Staff career records (titles, salary, review ratings)
Additional ideas:
- Students take a pre-graduation survey about whether they have a job and what their salary is; Tim has this data
- List of on-book courses that use stats, R, Python, etc. - see Danis Valle’s Shiny app for the last few semesters
- Qualitative data - interviews, focus groups, documentation of anecdotes, research memos - these can capture feedback we get like “you just saved me a lot of time” or 1-on-1 work similar to tutoring
- New data source - Natalie suggested a unified survey of all people from all activities - a common assessment
- New data source - Natalie - interviewing companies that hire our graduates about how well the new hires are doing, Tim says this isn’t done widely
How/where do we assess the feeling of membership in a community? Assess development of a culture that UF wants to foster? Dan observes that other top institutions have cultures and communities with resources. Tyler suggests you build these by giving people space and food (literal and figurative).
What additional data capture going forward can we unify across groups?
List organizations and people that might hold this data
Identify who is interested in assembling this data and continuing to form hypotheses to test
One thing to examine is the overlap of people between groups, how large is the total population that attends all these events? What is the rate of repeated attendance? Are there clusters of attendance?
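A minimal sketch of how the overlap and repeat-attendance questions could be computed, assuming each group can export its sign-in lists as a CSV with a shared identifier column (file names and column names here are hypothetical):

    # Overlap and repeat attendance across event sign-in lists.
    # File names and column names ("ufid", "event_date") are placeholders.
    import pandas as pd

    sources = {
        "dsi": "dsi_signins.csv",
        "carpentries": "carpentry_signins.csv",
        "r_meetup": "r_meetup_signins.csv",
    }

    frames = []
    for group, path in sources.items():
        df = pd.read_csv(path, usecols=["ufid", "event_date"])
        df["group"] = group
        frames.append(df)
    signins = pd.concat(frames, ignore_index=True)

    # Total population that attends any of these events
    print("Unique attendees:", signins["ufid"].nunique())

    # Rate of repeated attendance: fraction attending more than one event
    events_per_person = signins.groupby("ufid").size()
    print("Repeat attendance rate:", (events_per_person > 1).mean())

    # Overlap between groups: how many people appear in 1, 2, or 3 groups.
    # Clusters of attendance could then be explored by pivoting on
    # person x group and examining the resulting matrix.
    groups_per_person = signins.groupby("ufid")["group"].nunique()
    print(groups_per_person.value_counts().sort_index())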
Identify who will pursue getting IRB approval to do a study on this data
Are we going to have a unified goal for the study? Possibly not, or at least we may pursue different wings of this in service of our own goals.
Other next steps
Explore a blanket IRB for students, up-front consent for future studies. This could be part of a sign-in form for people to opt in to allowing their data to be used for research.
How do we detect people who are already succeeding vs people who are ready to succeed vs people who are not? We can’t just look at average success to understand our impact.
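One way to make this concrete (a sketch, assuming attendance has been joined to academic records in one table; the column names "prior_gpa", "later_gpa", and "attended" are hypothetical): stratify students by where they stood before the workshops and compare within strata, rather than relying on one pooled average.

    # Compare outcomes within prior-performance strata instead of one
    # pooled average. Column names are hypothetical placeholders.
    import pandas as pd

    students = pd.read_csv("merged_outcomes.csv")

    # Bin students by their standing *before* any workshop participation
    students["prior_band"] = pd.qcut(
        students["prior_gpa"], 3, labels=["lower", "middle", "upper"]
    )

    # Pooled comparison (the average we should not rely on by itself)
    print(students.groupby("attended")["later_gpa"].mean())

    # Stratified comparison: attendee vs non-attendee within each band
    print(
        students.groupby(["prior_band", "attended"])["later_gpa"]
        .mean()
        .unstack()
    )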
Tim - publication is one target, but administrators are another audience, and reaching them may not require as much approval for joining data sets; reflect this in the assessment of data sets
Other points
Matt personal observation - it was difficult to take notes as ideas and observations were being made quickly by everyone. In general there is a lot we can do and a lot that people want to investigate. The limit seems to be around the concrete access to data and any restrictions around it.
Action items
- Tyler - Continue assembling a data set of DSI event participation
- Justin - Continue assembling a data set of R group participation
- Matt - Make a commentable spreadsheet of data sources and identify owners, status, restrictions on their use, etc. (see the sketch below)
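A possible starter template for that spreadsheet, with suggested (not decided) columns:

    # Starter template for the data-source inventory spreadsheet.
    # Column names are suggestions drawn from the action item above.
    import pandas as pd

    template = pd.DataFrame(columns=[
        "data_source",   # e.g. DSI sign-ins, Carpentry surveys, R meetup lists
        "owner",         # person or organization that holds the data
        "status",        # collected / needs export / not yet collected
        "restrictions",  # FERPA, IRB, consent, or sharing-agreement limits
        "notes",
    ])
    template.to_csv("data_source_inventory.csv", index=False)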
Follow-up emails
- From Geraldine: My addition: is there any way to gauge the use and effectiveness of student-led online learning such as Lynda.com (maybe to compare to the in-person activities of DSI and the Carpentry club)?
Original invitation email
I’m writing to invite you to participate in an assessment of effects of all the teaching and learning that has been happening at UF outside of traditional curricula. Such a study would provide all the groups involved with evidence of their impact. It would also I think provide a more robust and data-backed response to this recent paper that claims bootcamps and workshops have no effects:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5604013/
https://software-carpentry.org/blog/2017/12/response-null-effects.html
UF is perhaps unique in that we have three years of history of multiple types of organizations putting on multiple types of workshops and events focused on teaching skills outside the traditional curriculum. The attendees are also mostly students for whom UF has complete academic records, so we know their assessed outcomes (i.e., grades in courses) as well as post-graduation placements and surveys.
We could have access to data like:
- Attendance lists at DSI (dsiufl.org) student organization workshops, socials, officers, general body meetings, etc.
- Possibly attendance from other student organizations if we ask them
- Attendance at Carpentry workshops
- Pre and post Carpentry workshop surveys
- Attendance and involvement in meetups around R or other topics
- Registrations from UFII/UFBI/Library symposia and talks
- Complete academic records for students (grades, courses, majors, surveys)
- Publication and other academic productivity from faculty
- Staff career records (titles, salary, review ratings)
This is a lot of data that we can put together. Some possible hypotheses: does training outside the curriculum impact grades? Courses taken? Major choice? Does participating in an organization or student group impact grades, or perhaps vice versa? Is there any impact on staff or faculty outcomes? And many more…
Are you interested in participating in a call to discuss:
a) Is this a good idea and worth the non-trivial work?
b) Who can we get to collect and format this data for analysis?
c) What IRB approval are we going to need, and when do we need to get it?
d) Who will do the analysis, and what privacy training do they need?