Evaluating Digital Work: Suggestions

For this week’s post, we were asked to evaluate a few of the digital tools we looked at for last week’s meeting. This time we were armed with a list of questions, which I was particularly eager to have as someone with a fair amount of experience evaluating literary work and none whatsoever evaluating digital work. I was relieved to find that the questions are similar to those that must be asked of any scholarly article or book. What problem does it try to solve? What contribution does it make? How would it affect your and others’ work? The big one: Does it work? And so on. Following these guidelines, we were asked to identify one thing we’d like to change, how we’d change it, and who would benefit from the change.

That being said, here are my suggestions (humbly submitted):

  • Linguistic Atlas Projects: The site compiles a group of projects studying English dialects in the US, submitted by institutions across the nation, but I think the site would benefit from some kind of uniformity of layout and content. It’s intended to be a collection of these linguistic studies, but they’re treated very separately. Once you’ve selected one project from the side column, the other projects disappear. Some projects’ links will take you to a map in which you can select a state to review, while others take you to a basic info page. Some projects’ pages offer images, recordings, and detailed project descriptions complete with bibliographies, while others offer only maps and subject data. I imagine it’s quite difficult to put projects with different methodologies in context with one another, but a uniform format would allow users to move between the different projects without becoming disoriented.

  • Visualeyes: I was so excited by the possibilities of Visualeyes, and so anxious to figure out how to use it, that the only suggestion I’d make would be to get more basic with the tutorials. Visualeyes presents itself as a non-programmers’ tool with many possible uses, and so there are many tutorials featured, but I had difficulty finding a sort of “Visualeyes for Dummies” video that would give me an overview starting at ground zero. The “Visualeyes Project Guide” document looked like a promising written guide, but its page was unavailable. I wasn’t quite sure where to begin.

  • OldWeather: I enjoy the interactive, collaborative nature of sites like this one and What’s on the Menu. OldWeather in particular pushes for public involvement, but so much so that the bigger picture becomes unclear. The home page highlights the transcription “game” while providing just a few introductory sentences and a short intro video for the program itself. There seems to be little information available on how the data will be interpreted, what the projected outline or timeline of the work will be, how the project responds to other research being done in the field, and so on. The goals of the project should be clearly advertised and delineated on the site, which would give users a better idea of why they should participate.

Cite this post: Brooke Lestock. “Evaluating Digital Work: Suggestions”. Published September 05, 2011. https://scholarslab.lib.virginia.edu/blog/evaluating-digital-work-suggestions/.