Choosing screencast software – Screencast Tutorials, Week #1

After a successful face-to-face meeting in Santa Barbara between mentor and intern, we have a nine-week project timeline. A major goal for this summer's project is to develop a workflow and templates for making screencasts as a product, so the beginning weeks will be spent researching tools (Week 1) and style guidelines (Week 2), and prioritizing which DataONE and partner tools to develop tutorials for (Week 3), before any recording takes place (Weeks 4-6). As a landmark, I will attend the DataONE User Group meeting in early July with the intention of airing exemplar screencast tutorials for feedback, as well as for more controlled usability testing (polling preference for version A vs. version B, for example). The remaining two weeks of the project will be devoted to recording and editing videos based on those data, aiming to finish a complete set of tutorials for one tool.


Week one focused on software for screencapture, video editing, and audio editing. My initial instinct was to separate these three tasks among three dedicated pieces of software, but, as it turns out, the two pieces of software most feature-rich for screencapture are also great video editors and effective audio editors.

A comparison of the software is in the screencast software Google spreadsheet. The most important feature that separates the high-end software from the free editors and simple screen recorders is the ability to record mouse actions separately. Having these data on a separate track means that, after recording, one can hide or edit the mouse cursor in the recordings: eliminating stray movements, keeping the cursor from obscuring part of the screen, highlighting the cursor, changing the cursor image, indicating mouse button clicks visually and audibly, and more. These features allow for stylistic flexibility without the need to record many times over, so the ability to record mouse-motion data separately was my first requirement for the software; it will let me rapidly reflect insights learned at the DUG meeting in previously recorded projects. For our situation, we also need software that works on Mac OS and, at least for the summer, Windows. These two requirements narrowed the field considerably.

Camtasia Studio for Windows/Camtasia for Mac or Adobe Captivate would therefore be my recommended go-to software packages for all three steps of screencast tutorial generation: screencapture, video editing, and audio editing. If a more powerful audio editor is needed, Audacity is available for free on Windows, Linux, and Mac, though I don’t anticipate needing much more than a simple noise reduction pass and some track-splicing tools.

Camtasia for Mac is less expensive than Camtasia Studio for Windows ($75 vs. $179, educational pricing), presumably because it has a reduced feature set: Camtasia Mac and Windows version features comparison chart. Most notably, the Mac version supports neither embedded quizzes and surveys, nor presets, nor batch production; none of these is a key feature in screencast tutorial production, but they might be a consideration if one wants to use Camtasia for webinars.
From my cursory poking through the free, full-featured 30-day trial of Camtasia for Windows today, I find the user interface to be intuitive.

The biggest competition for Camtasia is the more expensive Adobe Captivate ($300 educational license). Captivate can do what Camtasia does, and it also allows the mouse cursor to be edited to follow a custom path or to slow down before clicking. Captivate only recently gained the ability to record the screen; it is software originally designed for developing interactive media, in HTML5 in particular. Thus, Captivate would be a better choice than Camtasia for developing videos with quizzes and other interactivity. (Camtasia is not an option for this when working on a Mac.)

This article on choosing a screencast tool outlines the many questions to ask oneself in the quest for the most suitable software. Some stylistic choices are also covered; I’ll be referring to them next week.


For next week, I’ll be reading primary and secondary literature on designing screencasts for an optimized user experience in order to generate a data-centric list of recommendations regarding style choices, as well as a list of contrasting styles for testing at the July DUG meeting. I’m also in contact with the Data stories intern to the extent that our projects overlap – we’re discussing the possibility of shared intro/outro clips and other ways to maintain cohesive branding.

4 Replies to “Choosing screencast software – Screencast Tutorials, Week #1”

  1. I really like your spreadsheet comparison chart. I look for that type of info every time I’m looking at a new technological product. The only other thing I find helpful to compare is the last time the software was updated.

    I’ve only used Jing (because it’s free) but Camtasia sounds reasonable and useful!

    • Actually, there is a similar chart on Wikipedia that includes latest update time, though it has less information on features:
      http://en.m.wikipedia.org/wiki/Comparison_of_screencasting_software

      In some cases, latest update time can be misleading: the Linux command-line screen recorder RecordMyDesktop has not had a major update since 2007 (its latest patch was in 2008), and yet it is still widely used because it does what it should and generally does not break.

  2. Hi Heather, this is very helpful. A couple of requests/suggestions: the link out to the comparison chart doesn’t render very well – the text is overlapping in later columns and some column headers are offset. Also, can you provide some metadata (or a key) for your codes and colors? E.g., green and orange: some cells have an X and are green, others just an X. I’m assuming that X means the feature exists and green is good, but these are assumptions :). Also, some comment on cross-platform compatibility would be useful, per the recent emails.
    Thanks
