Projects made in the class
- Ryan’s Letters (Nancy Brown)
- College Radio Archive (Elizabeth Hansen)
- Bernier Archive (Elaine Paul)
- Ancient Aztecs (Virginia Cole)
- Tagging the Caribbean (Krystal Thomas)
- I Love Your Funny Face! (Nancy Stephenson)
- Blast from the Past (Wendy Johnson)
- Teach Robots to Read (Jim Salmons & Timlynn Babitsky)
- Loray Digital Archive (Julie Davis)
- NovelMapper (Cathy DeRose)
Monday: overview, speed dating
Prompts for thinking about projects:
- How clear was the purpose of the site? How well was it reflected in the ‘call to action’ and other text?
- How easy was it to get started?
- Were the steps to complete the task clear?
- How enjoyable was the task?
- Did the reward (if any) feel appropriate?
- Looking at the site overall, does the project appear to be effective?
- What is the input content? What is the output content?
- What validation methods appear to have been used?
- Who is the probable audience and what motivates them to participate?
- How does the project let participants know they’re making a difference?
- Does the site support communication between participants?
- How was the site marketed to potential participants?
- Did the site anticipate your questions about the tasks?
The course abstract
Crowdsourcing projects often make news because they can be incredibly productive, generating millions of lines of text, identifying forgotten faces in historical photographs, and even finding new planets. Successful crowdsourcing projects can help organizations forge deeper connections with their audiences. We’ll look at the attributes of projects that successfully engage the public with their content and tasks, whether transcribing handwritten documents, correcting optical character recognition (OCR) errors in printed text, identifying animals on the Serengeti or folding proteins. Conversely, poorly designed crowdsourcing projects find it difficult to attract or retain participants.
This class will present international case studies of best-practice crowdsourcing projects to illustrate the range of tasks that can be crowdsourced, the motivations of participants and the characteristics of well-designed projects. We’ll study crowdsourcing projects from the worlds of citizen science, genealogy and free culture as well as historical and cultural heritage projects. We’ll investigate the special requirements of humanities materials and consider the impact of organizational models, user interface design and community input on the success of projects. We’ll discuss models for quality control, explore the crossovers between traditional in-house volunteer projects and internet-enabled crowdsourcing, and look at the numbers behind real-world projects.
Finally, the course will give students hands-on experience with several different crowdsourcing platforms for image annotation, classification, manuscript transcription, and OCR correction, and an opportunity to review the behind-the-scenes interfaces and data outputs. Students will learn to analyse and describe various aspects of crowdsourcing projects through a combination of mini-lectures and discussion. Students are encouraged to bring their own project ideas and some scanned material for the lab sessions. The presenters’ years of analysing and working on crowdsourcing projects will help create an intellectually rigorous and supportive environment in which students are encouraged to apply the material covered to their own contexts.
Want more? If you found this post useful, you might be interested in my book, Crowdsourcing Our Cultural Heritage.