Author Archives: Vivian Bagley
Departments often offer a large number of courses, however, which produces a very long final Excel spreadsheet that takes intensive study to understand. A list of courses, expectations, and coded entries can baffle even the most experienced reviewer. As faculty members who have served as external reviewers can attest, such documents are often convoluted and of limited usefulness.
By implementing a visualization process, we were able to modify the format of the course responses and turn pages and pages of numbers into meaningful visual representations.
For example, one purpose of a review is to understand the types of assessments used in the department. By creating a spreadsheet with only two columns, one for course names and the other for assessment methods, we were able to create the following visual representation:
Assessment Bar Graph
The history department offered 109 courses in the 2011–2012 academic year. Instructors reported 14 different types of assessment methods. The height of each bar shows the number of courses that used that type of assessment.
At a glance an external reviewer can see which assessment techniques are used most frequently.
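The counting step behind such a bar graph is straightforward to automate. A minimal sketch, using hypothetical course names and assessment labels (the real spreadsheet covered 109 courses and 14 methods):

```python
from collections import Counter

# Hypothetical rows from the two-column spreadsheet:
# (course name, assessment method).
rows = [
    ("HIST 101", "research essay"),
    ("HIST 101", "seminar participation"),
    ("HIST 210", "research essay"),
    ("HIST 210", "book review"),
    ("HIST 305", "document study"),
]

# Count how many course entries use each assessment method;
# these counts become the bar heights in the bar graph.
counts = Counter(method for _, method in rows)

for method, n in counts.most_common():
    print(f"{method}: {n}")
```

Any charting tool can then plot the resulting counts directly, which is why the two-column format needs no further conversion.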
Another form of visual representation shows the network between courses and assessment methods. Frequently used methods form a central core, while rarer assessments sit at the periphery.
Assessment Network Diagram
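The same two-column data can feed the network diagram: each spreadsheet row becomes an edge between a course node and an assessment node, and a method's degree (how many courses connect to it) determines whether it sits in the core or on the periphery. A rough sketch with hypothetical pairs:

```python
from collections import defaultdict

# Hypothetical course-to-assessment pairs; each spreadsheet row
# yields one edge in the bipartite network.
edges = [
    ("HIST 101", "research essay"),
    ("HIST 210", "research essay"),
    ("HIST 210", "book review"),
    ("HIST 305", "document study"),
]

# Degree of each assessment node: methods shared by many courses
# get large central circles, rare ones small peripheral circles.
degree = defaultdict(int)
for course, method in edges:
    degree[method] += 1

central = [m for m, d in degree.items() if d > 1]      # core of the diagram
peripheral = [m for m, d in degree.items() if d == 1]  # outer ring
```

The threshold of one shared course is an illustrative choice; a real diagram would scale circle size continuously with degree.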
Another set of data records the intended learning outcomes for a course, as reported by the instructor. These are often in sentence form, and so cannot be analyzed in the same way. The most frequently occurring terms and phrases, however, can be seen in the following image:
Learning Outcomes Word Cloud
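Because the learning outcomes arrive as sentences, the word cloud rests on a simple term-frequency count. A minimal sketch, with invented outcome sentences and an illustrative stopword list:

```python
import re
from collections import Counter

# Hypothetical learning-outcome sentences as instructors might report them.
outcomes = [
    "Students will analyse primary documents and construct arguments.",
    "Students will demonstrate the ability to analyse historical arguments.",
]

# Tokenize, drop short function words, and count: the most frequent
# terms are the ones a word cloud renders largest.
stopwords = {"the", "and", "will", "to", "of"}
words = re.findall(r"[a-z]+", " ".join(outcomes).lower())
freq = Counter(w for w in words if w not in stopwords and len(w) > 3)

print(freq.most_common(3))
```

Word-cloud generators perform essentially this count before mapping frequency to font size.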
Similarly, the activities that instructors assign can be analyzed to show which student activities are most common across the department.
Unlike a complicated spreadsheet, these images are easy to understand, and make trends within the information easy to identify. External reviewers might save hours by identifying important trends through visualizations, then going to the original sheets for specific statistics.
Although information collected during a review process can be converted into the necessary formats for these types of visualizations, the process would be more efficient if all recording sheets were designed to facilitate visualizations. In the above cases, the raw information went through three or four different conversions before it could be used as input.
The benefits offered by these types of visualizations are substantial, which a lead reviewer for the history department recognized. She chose to include visualizations in her final report, making the argument that the review process was made easier to understand through images that represent complex information. The reviewer offered the following comments:
Departments doing self-assessments and reviewers writing reports for administrators and government can use visualizations to see very clearly and easily what has gone on in a department over the preceding years. For example, for our mid-sized Ontario history department, it is very clear from the Bar Graph that the signature form of pedagogy is the seminar discussion group and that teaching leadership skills is a priority for a large number of faculty who specifically grade students on leadership.
The same graph also shows that while traditional research essays continue to be assigned (70 courses), other sorts of writing assignments surpass them: document studies (24), book reviews (26), alternative assignments (25). Also, many essays are graded in stages, in which the professors examine proposals, outlines, and bibliographies before the final writing process begins. If visualizations from previous reviews were available, the reviewer of this history department could very quickly identify trends, comment on them, and ask pertinent questions.
The Network Diagram shows that this history department has a balance between traditional assessment methods shared by the majority of faculty (represented by large circles at the centre of the diagram) and a diversity of innovative methods (represented by the smaller circles at the periphery). Again, if reviewers could see a similar diagram from 2005, an evolution could be identified and an explanation sought.
The Word Cloud shows the extent to which critical thinking skills are taught in this department. “Analyse,” “Arguments,” and “Demonstrate” are the most often repeated words in the descriptions of students’ learning activities. Parents, students, and employers who wonder about the transferable skills and career prospects that history students have on leaving university need only look at this diagram to recognize the value of a history education.
Although May has come to an end, our work at the Digital Method Blog continues. The Digital Method Workshop Series was a success and has attracted interest from a number of other people. The workshops had limited attendance, but toward the final days we hosted 4-5 dedicated faculty members and one enthusiastic staff member. Our final workshop on Text Analysis, which featured Voyant and IBM’s ManyEyes, attracted more interest than I had predicted, but the possibilities of distant reading and data visualization are widespread and can solve problems that I’d not have imagined. We were happy to hear that some difficulties might be addressed using the tools we’ve talked about.
Our attendees were consistently willing to try out the new tools and in many cases were excited to apply digital tools to their own research. We also heard reports that word of mouth was spreading the idea that digital tools could augment and improve research methods. If we decide to host another series in the future, perhaps we’ll have even more participants.
Currently, I’m working to edit the lengthy videos that captured the demonstrations and conversations from each workshop. In the next week, most of these videos will be available on the Workshops page. For anyone who missed the workshops, these videos might prove valuable because they often feature questions from our first-time users. Also, visit the Tools page for overviews of each tool.
In our final workshop on May 25th, we will be exploring a number of tools that can be used to analyze text at a large scale, a task for which computers are uniquely suited. In his book Graphs, Maps, Trees, Franco Moretti coined the term “distant reading” to describe analysis that uses a computer to quantify text in order to see trends on a global scale that are not visible through close reading. There are many examples of this type of work. Patricia Cohen at the New York Times provides a thorough introduction and overview of the concept and mentions a number of examples.
We will focus on two tools that offer a number of options for graphing, revealing, and analyzing texts. The first is Voyant Tools, which displays a text alongside statistical information about it. The other is IBM’s ManyEyes, which provides numerous visualization options to best highlight the important trends in a work. The trick with these tools, as with many others, is experimenting. Upload a text and see what results you can draw out, especially by using different types of visualization in ManyEyes.
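The kind of summary statistics these tools report is easy to see in miniature. A rough sketch of the basic counts, assuming simple letter-based tokenization (real tools such as Voyant use more sophisticated tokenizers):

```python
import re

# A toy passage standing in for an uploaded text.
text = (
    "It was the best of times, it was the worst of times, "
    "it was the age of wisdom, it was the age of foolishness"
)

tokens = re.findall(r"[a-z]+", text.lower())
total = len(tokens)       # total word count
unique = len(set(tokens)) # distinct word forms
density = unique / total  # vocabulary density

print(total, unique, round(density, 2))
```

Repeating such counts across many texts, rather than one, is the core move of distant reading: the numbers become comparable at a scale no close reader could manage.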
The results of distant reading are no substitute for close reading, but can open up the text to uncover larger themes and trends.
Friday’s workshop focuses on Scrivener, a writing tool designed to emphasize writing and leave formatting until the end of the process. For anyone interested in attending, please download and install the free trial of Scrivener so that you can begin to explore its functions and decide whether it is right for you. Feel free to invite anyone you think would benefit from these workshops.
We had our highest turnout on Wednesday and I’m hoping we repeat that success. Thank you to all our participants for your diligence and continued enthusiasm. Videos from the previous workshops will soon be available. Thank you for your patience.