Week 3: Evaluate Instrument for Education – Evaluation Tool Design/Development Part 1

After completing the literature review phase of my project, this week I began the second phase: designing and developing the tool for evaluating the effectiveness of training/education resources. In particular, since the tool will primarily be used to evaluate DataONE resources, it needs to accommodate training/education resources delivered in person, through an online/web environment, and via text-based documents such as presentations, handouts, and worksheets. Additionally, because my mentors and I intend to make the final tool available to the wider data management community, it also needs to be customizable and scalable to fit the needs and context of a specific evaluation situation. Further, the literature reviews indicated that a digital questionnaire/survey could be the most viable evaluation tool due to its flexibility for implementation, adaptation, and dissemination. As a result, following the best practices and guidelines I collected during the first two weeks, the first revision of the evaluation tool contains the outline of a questionnaire/survey and currently consists of the following sections (a rough sketch of how this outline might be represented follows the list):
  • Title
  • Introduction
  • Instructions
  • Evaluation Questions
    • Objectives
    • Content/Substance
    • Reasoning/Topic Knowledge/Argument
    • Organization/Structure
    • Style/Language
    • Visual Aids
    • Interactivity/Technology
    • Timing/Time Management
    • Additional Comments
    • Demographic/Personal Reaction
  • Thank You
  • Related Links/Supporting Resources
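To make the outline above a bit more concrete, here is a minimal sketch of how it could be represented programmatically. This is only an illustration in Python with hypothetical names, not part of the actual tool, which is currently an outline document rather than code.

```python
# Hypothetical sketch of the questionnaire outline as a data structure.
# Section names mirror the outline above; everything else is illustrative.
from dataclasses import dataclass, field


@dataclass
class Section:
    name: str
    subsections: list = field(default_factory=list)


QUESTIONNAIRE_OUTLINE = [
    Section("Title"),
    Section("Introduction"),
    Section("Instructions"),
    Section("Evaluation Questions", subsections=[
        "Objectives",
        "Content/Substance",
        "Reasoning/Topic Knowledge/Argument",
        "Organization/Structure",
        "Style/Language",
        "Visual Aids",
        "Interactivity/Technology",
        "Timing/Time Management",
        "Additional Comments",
        "Demographic/Personal Reaction",
    ]),
    Section("Thank You"),
    Section("Related Links/Supporting Resources"),
]

if __name__ == "__main__":
    # Print the outline so the structure can be reviewed at a glance.
    for section in QUESTIONNAIRE_OUTLINE:
        print(section.name)
        for sub in section.subsections:
            print("  -", sub)
```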
For each of these sections, I provided an explanation of its purpose and usage. In addition, for the “Evaluation Questions” section, I combined and listed all the applicable questions that I found during the literature reviews. However, since there are currently 87 questions in total, it is not practical to include all of them in the same questionnaire/survey, especially if our typical respondents would like to complete it within 5-10 minutes.
As a result, next week I will focus on prioritizing the questions by marking each one as required, recommended, or optional. The aim of the prioritization is that respondents should be able to complete the required questions within 5-10 minutes; where the situation permits, adding some of the recommended or optional questions would lengthen the questionnaire/survey but allow respondents to provide more detailed feedback. Another area I will work on next week is the question type (e.g., multiple choice, rating scale, true/false or yes/no) that each question might use; grouping questions that share the same format could help make the questionnaire/survey look more organized and efficient. Finally, I will indicate which questions might be applicable for pre-training and during-training evaluations. Having respondents answer the same or similarly worded questions at different phases of the training could help us track any change in their attitudes and behavior.
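As a rough sketch of the prioritization idea, the snippet below selects questions in priority order until an estimated time budget is reached. The question metadata, time estimates, and phase labels are hypothetical placeholders for illustration, not the real question bank or its final priorities.

```python
# Illustrative sketch of prioritizing questions to fit a 5-10 minute budget.
# All questions, priorities, types, and timings below are placeholders.
from dataclasses import dataclass

PRIORITY_ORDER = {"required": 0, "recommended": 1, "optional": 2}


@dataclass
class Question:
    text: str
    priority: str              # "required" | "recommended" | "optional"
    qtype: str                 # e.g. "multiple_choice", "rating_scale", "yes_no"
    est_seconds: int           # rough time a respondent needs to answer
    phases: tuple = ("post",)  # evaluation phases the question applies to


def select_questions(questions, budget_seconds=600):
    """Pick questions in priority order until the time budget is spent.

    Required questions are always included; recommended and optional
    questions are added only while the estimated completion time stays
    within the budget (600 s = 10 minutes by default).
    """
    ordered = sorted(questions, key=lambda q: PRIORITY_ORDER[q.priority])
    selected, used = [], 0
    for q in ordered:
        if q.priority == "required" or used + q.est_seconds <= budget_seconds:
            selected.append(q)
            used += q.est_seconds
    return selected


# Example usage with placeholder questions
bank = [
    Question("Did the training meet its stated objectives?", "required", "rating_scale", 20),
    Question("Was the pace of the session appropriate?", "recommended", "rating_scale", 20),
    Question("What would you change about the visual aids?", "optional", "open_ended", 90),
    Question("How familiar are you with data management plans?", "required", "multiple_choice", 25, ("pre", "post")),
]

for q in select_questions(bank, budget_seconds=600):
    print(f"[{q.priority}] ({q.qtype}) {q.text}")
```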
Aside from creating the first draft of the questionnaire/survey, I also performed two smaller literature reviews on: 1) the different methods available for implementing questionnaires/surveys digitally, and 2) the techniques available to DataONE for delivering data management training. These two reviews were intended to inform possible implementation options for phase 3 of the project and any future DataONE training activities. Both reviews focused on gathering experiences and lessons learned from other projects and organizations, so I mainly looked for white papers and grey literature that could be located via Google. The search results indicated three categories of tools/platforms that could be used to implement a digital questionnaire/survey: web-based tools, apps/widgets, and software packages. Also, while DataONE already uses some of the most popular training delivery methods, there are additional methods it could consider to further enhance its education/training delivery process. I have summarized the results of both reviews and will discuss them with my mentors during our next weekly meeting.
