{"id":2758,"date":"2016-06-04T00:57:55","date_gmt":"2016-06-04T00:57:55","guid":{"rendered":"https:\/\/notebooks.dataone.org\/?p=2758"},"modified":"2016-06-04T01:04:55","modified_gmt":"2016-06-04T01:04:55","slug":"week-3-evaluate-instrument-for-education-evaluation-tool-designdevelopment-part-1","status":"publish","type":"post","link":"https:\/\/notebooks.dataone.org\/education-evaluation\/week-3-evaluate-instrument-for-education-evaluation-tool-designdevelopment-part-1\/","title":{"rendered":"Week 3: Evaluate Instrument for Education – Evaluation Tool Design\/Development Part 1"},"content":{"rendered":"
After completing the literature review phase of my project, this week I began the second phase: designing and developing the tool for evaluating the effectiveness of training\/education resources. In particular, since the tool would be used primarily for evaluating resources from DataONE, it needed to accommodate the evaluation of training\/education resources delivered in person, through online\/web environments, and via text-based documents such as presentations, handouts, and worksheets. Additionally, since my mentors and I intended to make the final tool available to the wider data management community, the tool also needed to be customizable and scalable to the needs and context of each specific evaluation situation. Further, the literature review indicated that a digital questionnaire\/survey could be the most viable evaluation tool due to its flexibility for implementation, adaptation, and dissemination. As a result, following the best practices and guidelines I collected during the first two weeks, the first revision of the evaluation tool contains the outline of a questionnaire\/survey and currently consists of the following sections:<\/div>\n