Welcome to our E-portfolio

Evaluation in Educational Technology (TECH4102) is an important course. It gives us valuable experience in evaluating educational technology, choosing appropriate instruments, and discussing several issues in educational technology evaluation. Working in the lab section, we have enjoyed the work assigned by our instructor.
This blog is an e-portfolio of the assignments and research projects in this course. We hope you enjoy it!
Course instructor: Dr. Alaa Sadik
Student IDs: u065932 & u061563

Sunday, May 10, 2009

Evaluation of Online Learning

This post presents our evaluation of an online discussion forum in terms of the degree of students' participation.
Find the whole article at this link:
http://www.scribd.com/doc/15151281/EductionalFourmEvalution
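
As a rough illustration of how the degree of participation can be quantified, here is a minimal sketch in Python; the forum log format and field names are assumed for illustration, not taken from the article.

from collections import Counter

# Hypothetical forum log: one (student, post title) record per message.
posts = [
    ("student_a", "Re: Week 1 discussion"),
    ("student_b", "Week 1 discussion"),
    ("student_a", "Re: Week 2 discussion"),
    ("student_c", "Re: Week 1 discussion"),
]

# Degree of participation: number of posts contributed by each student.
posts_per_student = Counter(student for student, _ in posts)

total = sum(posts_per_student.values())
for student, count in posts_per_student.most_common():
    share = count / total * 100
    print(f"{student}: {count} posts ({share:.0f}% of all posts)")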

Thursday, May 7, 2009

Models of evaluation in educational technology

http://www.slideshare.net/u065932/proposal-for-evaluating

This link is a PowerPoint presentation of a proposal for evaluating the Omani E-portal using the action model.

Tuesday, May 5, 2009

Comparative and non-comparative evaluation in educational technology

This post presents two examples of comparative and non-comparative studies:

Comparative Study



  1. Title: Comparative Analysis of Learner Satisfaction and Learning Outcomes in Online and Face-to-Face Learning Environments

  2. Type of comparative study: learners' perceptions and performance

  3. Problem: Because of the sharp growth of online programs in recent years, there is a need for research to assess the capabilities and efficacy of online programs.

  4. Purpose of evaluation: to compare an online course with an equivalent course taught in a traditional face-to-face format.
  5. Question: What differences exist between students enrolled in online versus face-to-face learning environments in:
    * satisfaction with the learning experience?
    * student perceptions of student/instructor interaction, course structure, and course support?
    * learning outcomes (i.e., perceived content knowledge, quality of course projects, and final course grades)?
  6. Participants: Students enrolled in an instructional design course for human resource development professionals.
    * 19 students were taught in a traditional face-to-face format.
    * 19 students were taught totally online.
  7. Instruments applied:
    1) Some items from the Instructor and Course Evaluation System (ICES)
    2) The Course Interaction, Structure, and Support (CISS) instrument
    3) Course projects
    4) Final course grades
    5) A self-assessment instrument

  8. Advantages of the study:
    1. The groups were equivalent in size, taught the same course by the same instructor, delivered by the same department, and required to complete the same content, activities, and projects.
    2. All data were collected at or near the end of the semester.
    3. The researcher used several methods to ensure instrument validity:
    a. To develop the CISS, the researcher contacted the authors of the (DOLES) and (DDE) instruments to obtain copies and the necessary permission to use their instruments.
    b. Content experts reviewed the items of the instrument.
    c. The instrument was pilot tested with 68 students.
    d. Factor analysis procedures were used to establish the construct validity of the CISS instrument.

  9. Disadvantages of the study:
    o The small sample size makes it difficult to interpret the results.
    o The CISS instrument is still in its early developmental stage and has not undergone a full analysis to establish reliability and validity.

  10. Results of the study:
    1. Student satisfaction: Both groups provided positive ratings of instructor quality and course quality.
    2. Perceptions of course interaction, structure, and support: Overall, both groups of students had positive perceptions, with the face-to-face students having significantly more positive views of interaction and support.
    3. Student learning outcomes:
    § Blind review of course projects: The difference in project ratings between the two groups was not significant.
    § Course grades: The grades were, for the most part, equally distributed between the two groups.
    § Self-assessment: Significant differences were found on only five of the 29 items on the self-assessment instrument. (A sketch of this kind of two-group comparison follows the reference below.)

Note: At the end of the paper, the researchers conclude: "These results support the argument that online instruction can be designed to be as effective as traditional face-to-face instruction"

11. Reference:
Johnson, S., Aragon, S., Shaik, N., & Palma-Rivas, N. (2000). Comparative Analysis of Learner Satisfaction and Learning Outcomes in Online and Face-to-Face Learning Environments. USA.
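
To illustrate the kind of two-group comparison this study performs, here is a minimal sketch in Python contrasting hypothetical Likert-type satisfaction ratings from an online group and a face-to-face group; the data below are invented, and the study's actual analyses and values are in the paper itself.

from scipy import stats

# Hypothetical 5-point Likert satisfaction ratings (n = 19 per group,
# matching the group sizes in the study; the values are invented).
online       = [4, 5, 3, 4, 4, 5, 3, 4, 4, 3, 5, 4, 4, 3, 4, 5, 4, 3, 4]
face_to_face = [5, 4, 4, 5, 3, 4, 4, 5, 4, 4, 3, 5, 4, 4, 5, 4, 3, 4, 4]

# An independent-samples t-test asks whether the mean ratings differ
# by more than chance would explain.
t_stat, p_value = stats.ttest_ind(online, face_to_face)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# With small samples and ordinal ratings, a non-parametric alternative
# such as the Mann-Whitney U test is often preferred.
u_stat, p_value = stats.mannwhitneyu(online, face_to_face)
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")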

Non-Comparative Study

  1. Title: Virtual interactivity: design factors affecting student satisfaction and perceived learning in asynchronous online courses
  2. Purpose: This is a non-comparative study. It investigates factors affecting student satisfaction with, and perceived learning from, asynchronous online learning.
  3. Participants: Approximately 3,800 students enrolled in 264 courses offered through the SUNY Learning Network (SLN); about 1,406 students returned the survey. In addition, 1,108 students were enrolled in the 73 courses whose design features were examined.
  4. Evaluation instruments:
    This study used the online SLN Student Survey as its data collection tool. The survey consisted mostly of multiple-choice, forced-answer questions eliciting demographic information and information concerning students' satisfaction, perceived learning, and activity in the courses they were taking. In addition, it offered an open-ended item allowing respondents to add comments.
    Course Design Data: Two of the researchers separately examined each of the 73 courses and rated their content on twenty-two variables using Likert-type scaling. Ratings for each course were checked for agreement, and disagreements were resolved by consensus with reference to the courses themselves. (A sketch of such an inter-rater agreement check follows the reference below.)
  5. Advantages:
    *Used more than one instrument for collecting data.
    *Offers some explanation of why the findings turned out as they did.
  6. Disadvantages:
    *No clear information about participants.
  7. Results:
    The study shows high levels of satisfaction with, and perceived learning from, SLN courses in the Spring 1999 semester. The findings also indicate that most students believed their level of interaction with the course materials, with their instructor, and with their peers was as high as or higher than in traditional face-to-face courses.
    The study concludes that three factors contribute significantly to the success of online courses: a clear and consistent course structure, an instructor who interacts frequently and constructively with students, and a valued and dynamic discussion.
  8. Reference:

Swan, K. (2001). Virtual interactivity: design factors affecting student satisfaction and perceived learning in asynchronous online courses. Distance Education, 22(2), 306-331.
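
The inter-rater agreement check described under the instruments above can be illustrated with Cohen's kappa, a standard statistic for agreement between two raters. This is a minimal sketch with invented ratings, not the study's own analysis.

from sklearn.metrics import cohen_kappa_score

# Hypothetical ratings by two researchers of the same ten courses on one
# design variable, using a 1-5 Likert-type scale (invented data).
rater_1 = [4, 3, 5, 2, 4, 4, 3, 5, 1, 4]
rater_2 = [4, 3, 4, 2, 4, 5, 3, 5, 2, 4]

# Cohen's kappa corrects raw percent agreement for agreement expected by
# chance: values near 1 mean strong agreement, near 0 chance-level.
kappa = cohen_kappa_score(rater_1, rater_2)
print(f"Cohen's kappa = {kappa:.2f}")

# Items where the raters disagree would then be resolved by consensus,
# as the study describes.
disagreements = [i for i, (a, b) in enumerate(zip(rater_1, rater_2)) if a != b]
print("Courses needing consensus discussion:", disagreements)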

Levels and techniques of evaluation in educational technology

This post explains the evaluation levels and techniques of the study titled "Evaluation of a Virtual Lab Environment for Teaching Web-Application Development" (Dept. of CIS, Georgia State University).
  • Purpose of the study:
    To examine how one aspect of virtual computing – the virtual lab – effectively addresses many of the challenges of teaching web application development.
  • Approaches:
    The subjects for this study were drawn from two related graduate courses. One course (Principles of Web Design) had 13 respondents, while the other (Web Application Development) had 10 respondents.
    This study uses statistical methods. A survey with Likert-scale questions addresses hypotheses H1-H4, while hypotheses H5-H8 are tested with the non-parametric two-sample Kolmogorov-Smirnov test, which can compare two samples even when sample sizes are small. A regression method lets the researchers identify the factors that contribute most to the variance in the dependent variables, which are the perceived ease of use and usefulness of the virtual lab; the independent variable is the students' programming experience (in years). (A sketch of these techniques follows the reference below.)
  • Levels of Evaluation:
    The evaluation in this study was done at the program level. It found that students rated the virtual lab with a mean usefulness of 3.75 out of 5 and a mean ease of use of 3.45 out of 5.
  • Instruments:
    In this study, a questionnaire was used as the research instrument to collect data for evaluating the virtual lab. The questions in the questionnaire were derived from standard TAM (Technology Acceptance Model) questions (Meso & Liegle, 2005; Gallivan, 2001; Chircu et al., 2000; Straub et al., 1997) and further included standard demographic questions. Finally, data about the specifications of the primary computer used by each student and how – if at all – they had configured their web server were also collected.
  • Reference:
    Liegle, J., & Meso, P. (2005). Evaluation of a Virtual Lab Environment for Teaching Web-Application Development. Dept. of CIS, Georgia State University, Atlanta, GA 30319, USA.
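
To make the techniques above concrete, here is a minimal sketch of the two-sample Kolmogorov-Smirnov test and a simple regression in Python. The data are invented; only the choice of methods follows the study.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical Likert responses from the two course groups (invented,
# with sizes matching the 13 and 10 respondents in the study).
group_a = rng.integers(1, 6, size=13)  # Principles of Web Design
group_b = rng.integers(1, 6, size=10)  # Web Application Development

# The two-sample Kolmogorov-Smirnov test compares the two samples'
# distributions and remains usable with small samples like these.
ks_stat, p_value = stats.ks_2samp(group_a, group_b)
print(f"KS statistic = {ks_stat:.2f}, p = {p_value:.3f}")

# Simple linear regression: does programming experience (in years)
# predict perceived ease of use of the virtual lab? (Invented data.)
experience = np.array([0, 1, 1, 2, 3, 3, 4, 5, 6, 8], dtype=float)
ease_of_use = np.array([2, 3, 3, 3, 4, 3, 4, 4, 5, 5], dtype=float)
result = stats.linregress(experience, ease_of_use)
print(f"slope = {result.slope:.2f}, R^2 = {result.rvalue**2:.2f}")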

Evaluation strategies

Learning Object Evaluation Instrument
LORI (the Learning Object Review Instrument) is an evaluation instrument developed for evaluating online learning objects. It was developed by John Nesbit, Karen Belfer, and Tracey Leacock. It measures the following aspects of a learning object (LO); a small scoring sketch follows the list:

1. Content Quality
2. Learning Goal Alignment
3. Feedback and Adaptation
4. Motivation
5. Presentation Design
6. Interaction Usability
7. Accessibility
8. Reusability
9. Standards Compliance
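
As a small illustration of how a LORI review might be recorded and summarized, here is a sketch in Python assuming each of the nine items is rated on a 1-5 scale with a "not applicable" option, which is how LORI ratings are commonly reported; the example ratings are invented.

# The nine LORI items with invented example ratings; None marks an item
# judged "not applicable" and excluded from the average.
lori_review = {
    "Content Quality": 4,
    "Learning Goal Alignment": 5,
    "Feedback and Adaptation": 3,
    "Motivation": 4,
    "Presentation Design": 4,
    "Interaction Usability": 3,
    "Accessibility": None,
    "Reusability": 5,
    "Standards Compliance": 4,
}

rated = [score for score in lori_review.values() if score is not None]
print(f"Items rated: {len(rated)} of {len(lori_review)}")
print(f"Average rating: {sum(rated) / len(rated):.2f} out of 5")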
[Screenshot of the LORI instrument not shown.]

If you want more information on how to use it, go to this site.



Evaluation of CAI

The CBI course evaluation instrument is a survey used to evaluate a computer-based instruction (CBI) course. It consists of two sections: pedagogical aspects and technical aspects. The survey questionnaire is posted on Survey Console at this link:

http://www.surveyconsole.com/console/TakeSurvey?id=559030

Evaluation in educational technology

This post presents two evaluation studies in educational technology. The first study is about evaluation methodology in educational technology in general. The second study focuses on the evaluation of specific technology features.


First study


  1. Title: The Integrated Studies of Educational Technology: A Formative Evaluation of the E-Rate Program

  2. Purpose: This study focused on answering these two main questions:
  • To what extent is the E-Rate helping to equalize access to the types of digital technology eligible for program discounts?

  • Are schools and teachers able to use the technology that E-Rate supports? How is it being used in the classroom?

3. Instruments:

  • ISET surveys conducted during the 2000–2001 school year, including the following:
    - a survey of technology coordinators in all 50 states and the District of Columbia,
    - the Survey of District Technology Coordinators,
    - the Survey of School Principals,
    - the Survey of Classroom Teachers.
  • E-Rate administrative data covering all E-Rate applications and funded commitments through January 2000.

4. Reference:

Puma, M. J., Chaplin, D. D., Olson, K. M., & Pandjiris, A. C. (October 2002). The Integrated Studies of Educational Technology: A Formative Evaluation of the E-Rate Program. The Urban Institute, Washington, DC 20037-1207.


Second Study

  1. Title: Statistics E-learning Platforms Evaluation: Case Study
  2. Purpose: This study aims to examine the effect of using e-learning platforms in a statistics course, in terms of the study course, gender, previous experience with e-learning, understanding of statistics, study level, structured format, self-study, the flexibility and freedom in dealing with the course, the interactive environment, and e-learning problems.
  3. Evaluation of e-learning platform features:
  • Flexibility: It is important for students to be able to access the system at any point. This feature was evaluated using the question: "Do you think that e-learning platforms give you the freedom and flexibility to work with the course? For example MM*Stat or Moodle." The responses were re-classified into 4 categories: yes, partial, no, I do not know.
  • Structured format: A well-designed e-learning platform can support students' learning by helping them develop their insights without getting bogged down in the mathematics. The students were asked: "What is your opinion about the structured format of the e-learning platform which you have used?" The responses were re-classified into 4 categories: good, acceptable, bad, I do not know.
  • Self-study: E-learning platforms offer students the possibility to learn on their own. The students were asked: "Do you think that e-learning platforms improve and increase your self-study?" The responses were re-classified into 4 categories: yes, partial, no, I do not know.
  • Interactive environment: The students were asked: "What is your opinion about the interactive environment of e-learning platforms?" The responses were re-classified into 4 categories: good, acceptable, bad, I do not know. (A sketch of tabulating such re-classified responses follows the reference below.)

4. Reference:

Ahmad, T., & Härdle, W. (August 31, 2008). Statistics E-learning Platforms Evaluation: Case Study. CASE - Center for Applied Statistics and Economics, Humboldt-Universität zu Berlin, Spandauerstrasse 1, 10178 Berlin, Germany. Retrieved on 14/2/2008 from: http://www.econbiz.de/archiv1/2008/58778_statistics_elearning_platforms.pdf
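
As a small illustration of how such re-classified categorical responses can be tabulated and tested, here is a sketch in Python with invented data; the study's own analysis is in the paper.

from collections import Counter
from scipy import stats

# Hypothetical responses to the flexibility question after
# re-classification into the four categories (invented data).
responses = ["yes", "yes", "partial", "no", "yes", "partial",
             "yes", "I do not know", "yes", "partial", "no", "yes"]

categories = ["yes", "partial", "no", "I do not know"]
counts = Counter(responses)
for category in categories:
    print(f"{category}: {counts.get(category, 0)}")

# A chi-square goodness-of-fit test asks whether the four categories
# occur equally often (the default uniform expectation).
observed = [counts.get(c, 0) for c in categories]
chi2, p_value = stats.chisquare(observed)
print(f"chi-square = {chi2:.2f}, p = {p_value:.3f}")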

Note: Find a PowerPoint presentation about these two studies on SlideShare.net at this link:

http://www.slideshare.net/u065932/evalution-in-et