PRE2017 3 Groep8
Important Links
- Planning
- Google Drive folder (read only)
- Coaching Questions Group 8
Information about ideation
Problem Statement and objectives
Online learning refers to methods of learning that employ online educational technologies. The University of Phoenix was the first to employ online learning in 1989. Since then, the popularity of online learning systems has greatly increased: in the school year 2013-2014, 75% of all United States school districts offered online or blended courses for K-12 students (Connections Academy, n.d.).
Even though online learning systems are used by many students, there are still challenges regarding these learning methods. Challenges in online learning include keeping students motivated, increasing efficiency of learning and providing insights into learning points for students.
The objectives of this project are:
- To evaluate the current challenges in online learning.
- To evaluate the factors influencing the adoption of online learning systems.
- To evaluate the current approaches to online learning.
- To evaluate the effects of different learning styles on learning.
- To develop an online learning system that applies the knowledge acquired from the previous four objectives.
Who are the users?
Our research focuses on improving the quality of online education, where the focus will be on creating an online learning system that can be used by middle/high-school students. The online learning system that we propose will gather data regarding the performance of students in order to personalise their learning experience. Considering this system will be a part of an existing education system, it will have to be integrated in the existing logistics of schools. For instance, teachers should be able to review the performance of students.
What are the user requirements?
For the students it is important that the questions are at the correct level for them: if the questions are too easy they will get bored, and if the questions are too difficult they will get demotivated.
For the teachers it is important to have insights into the results of their students so that they can judge the progress of their students.
Both for teachers and students it is important that the user interface of the assistant is clear. For students it needs to be clear what they are expected to do, and feedback needs to be provided when they submit a wrong answer. For teachers it should be easy to judge the progress of their students.
The most important requirement for the management of a school is that the online learning system can be integrated into the current style of education that is provided at the school. This means that the online learning system should be highly adaptive so that it can be adjusted to the specific needs of the specific schools it is implemented in.
Approach, milestones and deliverables
The group will be divided into two parts. One part will focus on creating the report and making suggestions and recommendations for implementation in the learning system. The other part will focus on developing the teaching assistant.
The team will first create a minimal usable product, which implements all requirements of the user. After this milestone is reached, the system will be tested and suggestions for improvement will be made. These will be implemented in the second (final) phase, which forms our second milestone.
The team sets out to create three deliverables:
- A working prototype; an online learning system which enables personalised learning.
- A report detailing:
- Our findings on the application of flow theory in education, and recommendations for applying this knowledge in the prototype.
- Our findings regarding challenges and good practices in online learning systems, and recommendations for applying these findings in the prototype.
- A presentation in which the aforementioned report is discussed and the prototype is presented.
Who does what?
In general, group 8 works in a shared Google Drive folder (see important links). Our planning is specified in a separate file in the Google Drive folder. This planning document specifies, for each task, who is responsible and the timeframe in which the task needs to be completed. There are also some recurring tasks that are assigned to a group member. Please note that this wiki will be updated weekly to reflect the final state of the Google Drive folder 18 hours before the weekly meeting.
- E-mail (responsible for e-mail contact with lecturers): Wouter
- Wiki updater (responsible for updating the wiki on time): Mitchell
- Secretary (responsible for taking notes of feedback sessions): Nikki
References
- Connections Academy. (n.d.) “Infographic: Growth of K-12 Digital Learning.” Growth of K-12 Online Education Infographic, https://www.connectionsacademy.com/news/growth-of-k-12-online-education-infographic.
USE aspects
User
The system has two primary groups of users: the students and the teachers. The system can be very desirable for teachers as it gives detailed insight into the performance of students. Using this information, a teacher can more easily figure out which topics to focus on in lectures, and which students should receive more attention regarding questions about the course material. The system can also be very desirable for students, as it can help them learn how to deal with course material; e.g. how to plan their homework and make the most efficient use of their available time to learn as much as possible. Furthermore, because the system uses flow theory, education may become less boring and more engaging to students. This in turn may motivate them to strive to complete more difficult education programs. To summarize, when it comes to user aspects of the system, it can make education more personal and better adjusted to the needs of the students.
Society
As was discussed in the section on user aspects of the technology, the main benefit of the system is that high school education can become more personalized to the needs of the students. If the system succeeds in making education less boring and more engaging, more students may be able to keep up in higher forms of education. As such, the general level of education in society may increase, which is beneficial in a highly automated information society.
Enterprise
The system can have large implications for the enterprises involved in education. If the system is used by many schools, the media companies that write textbooks may very well adapt course material to be more compatible with it. Schools, as enterprises, need to adapt their infrastructure so that the system can run in tandem with their remaining digital infrastructure; as such, schools will have to invest time and money in integrating the system. Whether the system will be put to use depends on the individual school: some schools might find the costs of setting up the system too high compared to the increase in student performance that it can deliver. Third-party companies might step in and host the services that the system requires, so that it becomes easier and/or cheaper for schools to start using it, as they no longer have to host the service themselves.
SotA: Summary of literature study
We conducted a literature study in order to gain a better understanding of concepts relevant to our problem statement. We identified several topics which we deemed interesting. The summary of the literature study focuses on five distinct topics.
Impact of Literature Study
Based on the literature study that has been done, several items have been deemed important to include in our design. The impact of each of the topics which we researched is described below.
Flow Theory
Wesson and Boniwell (2007) describe seven conditions for getting into flow. The following items are deemed important to include in the design of our learning assistant:
- Balancing challenge and skill: for our system this means that the level of the questions will be adapted to the individual’s capabilities with regard to certain topics.
- Having clear goals: for our system this means that a clear goal needs to be stated that a student can work towards by working with the system. For instance, for mathematics homework a goal could be to be able to apply the concepts taught in a certain chapter of the mathematics book that they use.
- Receiving clear and immediate feedback: for our system this implies that the student should be able to read their feedback immediately after answering the question. In an ideal system this feedback would also be adapted to the answer that the student has provided.
All three of these items are conditions for people to get ‘in flow’, enabling them to focus better on tasks and perform better.
Education by Human Teachers
Moran and Malott (2004) describe two important teaching models, the Direct Instruction (DI) teaching model and the Precision Teaching (PT) model. Both of these teaching models, as well as combinations of the two, have been shown to have a positive effect on the learning process, which is why we aim to include items from both models in our system.
The DI teaching model focuses on the importance of repetition and continuous practice of topics. Based on this teaching model we propose that our system should not entirely omit questions on topics that students scored well on in the past; instead it should keep asking questions on these topics at a level that matches the student's level.
The PT teaching model allows students to reflect on their own work. Based on Precision Teaching we propose to allow students to indicate the level they think they are at regarding a specific topic. The level indicated by the student will be taken into account by the system to select the initial difficulty of questions; the difficulty will still be adapted as a result of the student's performance (a small sketch of this is given below).
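As a rough illustration (not a final design), the sketch below shows how a self-assessed level could seed the question difficulty for a topic and how observed performance could then adjust it. All type names, levels, and thresholds are placeholders of our own.

```typescript
// Sketch: seed a topic's question difficulty from the student's self-assessed level
// (Precision Teaching) and keep adjusting it from observed performance, so that
// mastered topics stay in rotation at a matching level (Direct Instruction).

type SelfAssessment = "beginner" | "intermediate" | "advanced";

interface TopicState {
  difficulty: number;      // 1 = introductory, 2 = medium, 3 = advanced
  correctStreak: number;
  incorrectStreak: number;
}

function initialTopicState(selfAssessment: SelfAssessment): TopicState {
  const difficulty = { beginner: 1, intermediate: 2, advanced: 3 }[selfAssessment];
  return { difficulty, correctStreak: 0, incorrectStreak: 0 };
}

// After each answer, nudge the difficulty so it tracks the student's actual level.
function updateTopicState(state: TopicState, answeredCorrectly: boolean): TopicState {
  const correctStreak = answeredCorrectly ? state.correctStreak + 1 : 0;
  const incorrectStreak = answeredCorrectly ? 0 : state.incorrectStreak + 1;
  if (correctStreak >= 3) {
    // Step up after a streak of correct answers and reset the streaks.
    return { difficulty: Math.min(3, state.difficulty + 1), correctStreak: 0, incorrectStreak: 0 };
  }
  if (incorrectStreak >= 2) {
    // Step down when the student is struggling.
    return { difficulty: Math.max(1, state.difficulty - 1), correctStreak: 0, incorrectStreak: 0 };
  }
  return { difficulty: state.difficulty, correctStreak, incorrectStreak };
}
```

The streak thresholds (three correct to step up, two incorrect to step down) are arbitrary starting points that we would tune during testing.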
Learning Styles
During our research into learning styles we found that much of the research did not lead to conclusive results, and that there is a lack of evidence for the effectiveness of learning style models. This is why we will not adapt our system to the learning style of students. Research has indicated, however, that the teaching style should match with the topics that one is trying to teach.
Gamification
As Nicholson (2015) describes, reward based gamification (akin to operant conditioning) can lead to short term improvements, but different game-elements should be used to facilitate long term improvement. Gamification should also not be permanent.
For our system this means that when the student starts with a course, reward-based gamification can be applied. As the student masters the easy level of homework, we need to give them more freedom and enable them to find intrinsic value in the subject material. It is at this point that we make the gamification less reward-based (sparser points, less emphasis on level, etc.) and move on to meaningful gamification (using Nicholson's RECIPE: Reflection, Exposition, Choice, Information, Play, Engagement). Eventually, when the student has found intrinsic value in the course material and is adequate at the topics presented in the course, the student will find that the gamification elements no longer add to the learning process of this course. As such we need to ease the student into homework that is no longer gamified. We could ask the student what they felt about their last homework session, and use that information, combined with their performance metrics, to find out what kind of gamification we need to apply for the next homework session.
Using the work of Huang et al., it becomes important to thoroughly define the following constructs in order to guide the design of the system:
- Understanding the target audience and context. This will change depending on the type of school that the system is designed for (for instance a high school, an elementary school, or a school for special needs students); another important factor for defining the context is the study program. All in all, the team should look thoroughly in order to understand the target audience and context of use as well as possible.
- Defining learning objectives (this will change on a course to course basis)
- Structuring the experience. Regarding learning, the experience could be structured based on the topics discussed during the course, as well as the difficulty level of subtopics.
- Identifying resources
- The application of gamification elements can follow based on Nicholson's work (2015).
From the work of Arnold (2014) we find that it is beneficial to figure out what type of gamer the user is. Not all gamer categories like the same game elements; therefore, not all types of gamification may be appreciated by the user. We could, for instance, ask the user when they create their profile what types of games they like to play and why they like to play them, and determine a gamer type based on that information.
Online Learning Systems
The research of Chickering et al. (1987) gives good insight into practices that an online learning system should follow. When applied to intelligent adaptive online learning systems, the practices could be implemented like this:
- Maintain contact between the student and the faculty. This means that the online learning system should be a part of the education process, not a replacement for face-to-face teaching. Lectures and question hours need to be maintained next to the online system. One way to enhance this functionality would be to allow students to ask questions to the teachers, or to suggest that a topic receives extra attention in the next lecture. Using these techniques (as well as feedback on assignment performance), the bar for interacting with their faculty is lowered for shy students or students who have difficulty communicating.
- In order to facilitate active learning, the system should be able to produce and handle questions of varying nature. This could be facilitated by implementing projects (larger assignments that students optionally tackle in groups) or peer critiques.
- The system can facilitate good feedback if it is able to quickly and accurately give feedback regarding the performance of the student. The system should not only indicate that the answer was wrong, but it also has to indicate where exactly issues pop up in the answers of the student. By using these performance metrics teachers can adapt the course, in which case changes to the course can also be considered feedback.
- The system facilitates time management by adaptively choosing which homework is suitable for a student. Instead of working endlessly on easy exercises and not having enough time to work on the most difficult ones, the system ensures that students spend their time well. For elementary and high school students it may be beneficial to let the system help the student in planning when they need to work on their homework.
- The performance data gathered by the system can help the education program make extra efforts to help students progress through the course material. The system can communicate expectations that the school has by setting up milestones that students have to fulfill.
- The system should communicate clear guidelines for participation to the student.
Future of e-Learning
The system should be designed in such a way that even students who do not possess outstanding computer skills can use all the relevant components of the system. Although the problem of lower computer skills is decreasing as more children use these technologies, the design of the system should be inclusive enough that outliers of this trend can also use the system seamlessly. The design of the system should also be inclusive for teachers who do not possess outstanding computer skills.
It is of utmost importance that the system is reliable. As Pollock (2018) describes, ease of access is highly regarded by students. Teachers are interested in capabilities that entail information gathering. To summarize, the UI presented to students should be easy to use, and it is of utmost importance that the system is up and running at all times (minimal downtime for maintenance, and good recovery from errors). The information gathered about the students' performance should be presented in such a way that teachers can quickly discover where students are having issues. Perhaps a way to do this is to generate summary reports about the performance of the student. When the teacher accesses the profile of the student this summary is the first thing they see. For initial versions of the system, it can prove useful to interview the teachers who will use the system in the future.
In order to ensure maximum ease of use, the system should integrate nicely into the other systems offered by the school. If the integration of the system is subpar (for instance, if it takes unnecessary steps for teachers to access information), then the system may have adverse effects on the quality of learning. It goes without saying that teachers will be less likely to want to use the system if it is hard to use.
An important feature that the system should support is the ability to quickly skim the contents of the entire course. In our system, students should be able to nudge the system so that they receive assignments on topics that they want to train again, even if they already have a sufficient score for that topic.
The system should also purely focus on improving the homework experience. Face-to-face lessons should not be replaced by the system.
Outline of system functionality
NB: this is a description of a fully featured version of the system (it shows what the system would look like if it were implemented for an entire school); some of these items do not apply to the prototype (which focuses on only one course). Certain items on this list are too difficult to build with the limited time/programming experience we have; these items are recognizable as they include (advanced) in their description. Other features that are not necessary for core system functionality are written in italics. Even though the prototype will not include all the items featured here, it is worthwhile to show what our intentions/vision for the system is. The lists below show which subcomponents are needed in the system, and what tasks each subcomponent needs to fulfill.
Server
- Maintains student profiles, which contain (see the sketch after this list):
- Username
- Password (unique to this service)
- Real name of student (so that results can be linked to the school’s administration)
- Interests (advanced)
- Courses that the student partakes in
- Performance on topics that are handled in courses, for example:
- Course: history (4th year VWO)
- Knowledge on Industrial revolution (2/10) -> use introductory homework (amount: relatively many)
- Knowledge on renaissance (10/10) -> use advanced questions (amount: relatively few)
- Knowledge on stone age (6/10) -> use medium questions (amount: numIndRevolution > n > numRenaissance)
- Homework list (based on performance measure outlined above)
- Maintains course profiles, which contain:
- The topics that the course consists of (e.g. the chapters of a math book)
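As a rough sketch of how these profiles could be represented on the server (TypeScript, with field names of our own choosing rather than a fixed schema):

```typescript
// Sketch of the server-side data model described above; all field names are illustrative.

interface TopicPerformance {
  topic: string;                  // e.g. "Industrial revolution"
  score: number;                  // 0-10, as in the examples above
  assignedDifficulty: 1 | 2 | 3;  // 1 = introductory, 2 = medium, 3 = advanced
  assignedAmount: number;         // relatively many for low scores, relatively few for high scores
}

interface StudentProfile {
  username: string;
  passwordHash: string;           // store a hash, never the plain password
  realName: string;               // links results to the school's administration
  interests?: string[];           // (advanced) used to pick themed assignment variants
  courses: string[];              // identifiers of courses the student partakes in
  performance: Record<string, TopicPerformance[]>; // per course
  homework: string[];             // assignment identifiers, derived from performance
}

interface CourseProfile {
  name: string;                   // e.g. "history (4th year VWO)"
  topics: string[];               // e.g. the chapters of a math book
}
```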
Study material
- Contains categorized assignments, ranging from
- 1. Introductory
- 2. Medium
- 3. Advanced
Client
- Web-based / app-based
- Student has username and password (to access/update profile stored on server)
- Student can input homework answers (see the sketch after this list):
- Predetermined answers, such as calculations (math, physics, chemistry, etc.) or multiple choice questions
- Flexible answers: essay assignments (language, history, etc.)
- Student receives assignment list of homework based on the topics that are taught in class. The difficulty and amount of assignments per topic depends on the student’s performance on previous homework, and comments made by the student.
- The student can let the system know if he/she finds a particular topic difficult. Whether a topic is easy is determined by the system; the student can thus override the system's decision and be given more homework on one topic, but it is not possible for the student to avoid doing homework by notifying the system that all the homework is easy.
- Considering the system targets middle/high-school students, it may be useful if the system can display useful tips regarding studying (e.g. how to make material easier to recall). The system could also include a planning service, where the student learns how to plan their homework to get everything done on time.
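To make the two kinds of answer input concrete, the sketch below shows one possible client-side representation; the shapes and names are assumptions for illustration, not the prototype's actual API.

```typescript
// Sketch of client-side answer submissions; the prototype's real data shapes may differ.

type Submission =
  | { kind: "multipleChoice"; assignmentId: string; chosenOption: number }
  | { kind: "predetermined"; assignmentId: string; answer: string }   // e.g. a calculation result
  | { kind: "essay"; assignmentId: string; text: string };            // graded by the teacher

// Predetermined answers can be checked immediately on submission, which gives
// the student the "clear and immediate feedback" called for by flow theory.
function checkPredetermined(submission: Submission, expected: string): boolean | null {
  switch (submission.kind) {
    case "multipleChoice":
      return String(submission.chosenOption) === expected;
    case "predetermined":
      return submission.answer.trim().toLowerCase() === expected.trim().toLowerCase();
    case "essay":
      return null; // flexible answers need human (or advanced NLP) grading
  }
}
```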
AI
- (advanced) If multiple versions of an assignment exist (for instance math problems with a story), then the version whose description is most in line with the interests of the student is used.
- The performance of a student is based on (see the sketch after this list):
- The difficulty of the exercise.
- The time it takes the student to complete the exercise.
- To prevent the system from making mistakes by assuming that a student has difficulty with a particular assignment when in reality they are just slow readers or unfamiliar with computers, a calibration of their typing skills will be included in the beginning.
- The number of completed exercises as compared to the total number of exercises.
- The number of hints used when attempting to solve a particular exercise.
- Adjusts the difficulty and amount of homework depending on the individual performance of the student
- The performance of the student is visible to the school administration and the teacher.
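One possible way to combine these factors into a single performance figure is sketched below; the weights and the time normalisation are placeholders that would need tuning.

```typescript
// Sketch: combine the listed factors into one 0-1 performance estimate per topic.
// The weights and the time handling are illustrative and would need tuning.

interface ExerciseResult {
  difficulty: number;      // 1 (introductory) .. 3 (advanced)
  secondsSpent: number;
  expectedSeconds: number; // corrected using the typing-speed calibration
  hintsUsed: number;
  correct: boolean;
}

function performanceScore(results: ExerciseResult[], totalExercises: number): number {
  if (results.length === 0) return 0;
  const completionRatio = results.length / totalExercises;
  const perExercise = results.map((r) => {
    if (!r.correct) return 0;
    const speedFactor = Math.min(1, r.expectedSeconds / Math.max(1, r.secondsSpent));
    const hintPenalty = Math.max(0, 1 - 0.2 * r.hintsUsed);
    const difficultyWeight = r.difficulty / 3;
    return difficultyWeight * (0.6 + 0.2 * speedFactor + 0.2 * hintPenalty);
  });
  const average = perExercise.reduce((a, b) => a + b, 0) / perExercise.length;
  return average * completionRatio; // scaled down if many exercises were skipped
}
```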
Summary
The system is an intelligent online learning system that is used to aid traditional teaching. It allows students to have a more personalized education. Students attend classes like they normally would, but the system keeps track of the student’s performance, and adjusts the difficulty and amount of homework depending on the individual performance of the student. In this way the student can more readily improve at topics that they find difficult, while knowledge on topics that they find easy is maintained. The performance of the student is visible to the school administration and the teacher. In this way the teacher can more easily find out if the class is struggling with a topic discussed in the course, and it allows the teacher to understand which students need the most help.
Progression of prototype
Client front-end implementation
https://dev.meesvandongen.nl/teachingassistant/
Feature | Completion |
---|---|
Username and password | ✔️ |
Multiple Choice answers | ✔️ |
Text Input answers | ✔️ |
Advanced text input answers (essay) | ❌ |
Homework list | ✔️ |
The student can let the system know if he/she finds a particular topic difficult | ❌ |
System can display useful tips regarding studying | ❌ |
Planning service | ❌ |
Skipping Questions | ❓ |
Topics | ❓ |
Research on existing open source systems
Open source learning software: Moodle. Has plugin system.
A discussion on the Moodle forums about our subject, CAT (computer adaptive testing): https://moodle.org/mod/forum/discuss.php?d=159682 Some papers are mentioned there which could prove useful in our design.
Research on the decision making system
We must match a user with a question. For this we must use the knowledge we (the system) have.
What we know of the user:
- Some measure for how skilled they are at certain types of questions based on previous questions answered
- How long they have been answering questions for
- At which speed they are answering questions
- How good they are at answering questions right now (relative to their estimated skill)
The last three things could be used to detect that the user is currently in a state of flow, in which case they can, for example, be asked more difficult questions.
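A minimal sketch of such a flow heuristic, with illustrative thresholds of our own:

```typescript
// Sketch of a simple flow heuristic: the user has been answering for a while,
// at a steady pace, and at or above their estimated skill level.
// All thresholds are placeholders.

interface SessionState {
  minutesAnswering: number;
  recentSecondsPerQuestion: number; // rolling average over the last few questions
  usualSecondsPerQuestion: number;  // this user's long-term average
  recentAccuracy: number;           // 0-1, over the last few questions
  expectedAccuracy: number;         // 0-1, what we expect given their estimated skill
}

function seemsInFlow(s: SessionState): boolean {
  const warmedUp = s.minutesAnswering >= 10;
  const steadyPace = s.recentSecondsPerQuestion <= 1.2 * s.usualSecondsPerQuestion;
  const performingWell = s.recentAccuracy >= s.expectedAccuracy;
  return warmedUp && steadyPace && performingWell;
}

// If seemsInFlow(...) returns true, the selector could pick a slightly harder question.
```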
What we know of each question:
- A category (set manually or automatically; we can limit our system to one category for the sake of the prototype)
- It is possible to have multiple categories per question (each individual thing a person has to know to answer a question)
- A measure of difficulty (based on other people answering the question). Having this automated has the additional benefit of telling us which questions people find most difficult. The downside is that the system has to learn the difficulties before it becomes effective, which can be partially solved by starting with a default value.
- The speed of the question being answered relative to the ratio of it being answered correctly: for example a trick question would be answered incorrectly a lot despite being answered relatively fast. This can also be used to see if maybe a user should just take more time to think or if they should have been able to answer it sooner.
For a person to have mastered the subject in question, they must be able to answer most questions correctly, so our performance measure for the users should be based on this.
Our system could work something like this:
- The score a user has for every category starts at 100
- When a user answers a question correctly/incorrectly their score goes down/up based on the difficulty score of the question.
- When the score for a category reaches 0 they no longer get asked questions in that category
The above is only the basic framework: there is no single obvious choice for how a question gets chosen and how the scores change, so the rules proposed below are options that do not all have to be adopted. A minimal sketch of the basic scoring is given directly below.
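```typescript
// Sketch of the basic per-category score: every category starts at 100, correct
// answers move it towards 0, incorrect answers move it back up, and a category
// at 0 is considered finished. Step sizes are placeholders.

interface BasicQuestion {
  id: string;
  category: string;
  difficulty: number; // e.g. fraction of users answering this question incorrectly (0-1)
}

type CategoryScores = Map<string, number>;

function recordAnswer(scores: CategoryScores, question: BasicQuestion, correct: boolean): void {
  const current = scores.get(question.category) ?? 100;
  const step = 5 + 10 * question.difficulty; // harder questions move the score more
  const next = correct ? current - step : current + step;
  scores.set(question.category, Math.max(0, next));
}

function isCategoryDone(scores: CategoryScores, category: string): boolean {
  return (scores.get(category) ?? 100) <= 0;
}
```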
The score of a question can be the percentage of times it gets answered correctly. With this we can propose the following system for choosing the questions:
- Choose a question from the category for which the user's score is the lowest, and ask multiple questions from this category in a row.
- Do not choose a question that has been answered correctly before
- Do not choose a question that has been answered incorrectly recently
- Specifically ask a question again that was answered incorrectly earlier (when the user should definitely know the answer now)
- Choose the question for which the score is closest to the score of the user
- If a question with multiple categories is answered incorrectly, ask questions that have only one of these categories to find out which category (sub-problem) is the problem; if all of these are answered correctly, ask another question with all these categories combined.
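A sketch of a selector that follows these rules; every rule remains optional, and the ordering, tie-breaking, and the meaning of "recently" below are arbitrary choices on our part.

```typescript
// Sketch of a question selector following the rules above; all names are illustrative.

interface CandidateQuestion {
  id: string;
  category: string;
  score: number; // percentage of users answering this question correctly (0-100)
}

interface Attempt {
  questionId: string;
  correct: boolean;
  answeredAt: number; // timestamp in milliseconds
}

function chooseNextQuestion(
  questions: CandidateQuestion[],
  userScoreFor: (category: string) => number, // per-category score from the sketch above
  history: Attempt[]
): CandidateQuestion | undefined {
  const recentCutoff = Date.now() - 30 * 60 * 1000; // "recently" = last 30 minutes (placeholder)
  const answeredCorrectly = new Set(history.filter(a => a.correct).map(a => a.questionId));
  const failedRecently = new Set(
    history.filter(a => !a.correct && a.answeredAt > recentCutoff).map(a => a.questionId)
  );

  const candidates = questions.filter(
    q =>
      userScoreFor(q.category) > 0 &&  // the category is not mastered yet
      !answeredCorrectly.has(q.id) &&  // do not repeat questions answered correctly before
      !failedRecently.has(q.id)        // do not immediately repeat a recent failure
  );
  if (candidates.length === 0) return undefined;

  // Focus on the category with the lowest user score, then pick the question whose
  // own score is closest to the user's score for that category.
  const focusCategory = [...new Set(candidates.map(q => q.category))]
    .sort((a, b) => userScoreFor(a) - userScoreFor(b))[0];
  return candidates
    .filter(q => q.category === focusCategory)
    .sort(
      (a, b) =>
        Math.abs(a.score - userScoreFor(focusCategory)) -
        Math.abs(b.score - userScoreFor(focusCategory))
    )[0];
}
```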
For changing the user's score:
Correct answer:
- When a question is answered correctly subtract some constant c from the score
- Subtract additional points for multiple correct answers in a row
- Subtract more points for a more difficult question being answered
- Subtract more points for a quick answer (relative to the user's average speed compared to the global average for other users for that question)
Incorrect answer:
- Add more points if the score recently went up by a lot (this question was much more difficult than the previous ones)
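A sketch of these update rules; the constant c, the streak bonus, and the difficulty/speed multipliers are placeholders, and the handling of the incorrect-answer rule follows our reading of the note above.

```typescript
// Sketch of the score-update rules above; all constants are illustrative.

interface AnswerEvent {
  correct: boolean;
  questionScore: number;        // % of users answering this question correctly (0-100)
  secondsTaken: number;
  globalAverageSeconds: number; // average time other users need for this question
  correctStreak: number;        // consecutive correct answers before this one
  recentScoreIncrease: number;  // how much the user's score went up over the last few questions
}

function updateUserScore(currentScore: number, e: AnswerEvent): number {
  const c = 5;                                           // base step (placeholder)
  const difficulty = 1 + (100 - e.questionScore) / 100;  // harder question => bigger step
  if (e.correct) {
    const streakBonus = Math.min(3, e.correctStreak);    // extra for multiple correct in a row
    const quickness = Math.min(2, e.globalAverageSeconds / Math.max(1, e.secondsTaken));
    // A fuller version would also correct quickness for this user's own average speed.
    return Math.max(0, currentScore - (c * difficulty * quickness + streakBonus));
  }
  // Incorrect answer: add points, with extra when the score recently went up by a lot.
  const extra = e.recentScoreIncrease > 2 * c ? c : 0;
  return currentScore + c * difficulty + extra;
}
```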
General notes:
- The actual numbers should be fine-tuned based on how long it should take to master questions in a category.
- When looking at the speeds for answers being given we should filter out outliers (someone leaves a question open for 1 hour while doing something else)
- Some things the system does should be visible to teachers. Things like the questions and questions categories which individual students find most difficult as well as the ones that are most difficult universally among all students are useful for teachers to know.
- Faulty questions will be answered ‘correctly’ approximately 0% of the time, which means they can be filtered out automatically.
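The outlier filtering and the automatic detection of faulty questions could look something like this (thresholds are illustrative):

```typescript
// Sketch of the two housekeeping notes above: ignore timing outliers when computing
// averages, and flag questions that are (almost) never answered correctly as faulty.

function filteredAverageSeconds(times: number[], maxSeconds = 15 * 60): number {
  // Drop obvious outliers, such as a question left open for an hour.
  const kept = times.filter((t) => t > 0 && t <= maxSeconds);
  if (kept.length === 0) return 0;
  return kept.reduce((a, b) => a + b, 0) / kept.length;
}

function looksFaulty(timesAnswered: number, timesCorrect: number): boolean {
  // A faulty question is answered "correctly" roughly 0% of the time;
  // require a minimum number of attempts before flagging it.
  return timesAnswered >= 20 && timesCorrect / timesAnswered < 0.02;
}
```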
From literature:
Search terms?
- Computer adaptive test (mentioned on the Moodle forum (2010): https://moodle.org/mod/forum/discuss.php?d=159682 )
- The forum mentions http://carla.umn.edu/assessment/CATfaq.html and http://echo.edres.org:8080/scripts/cat/catdemo.htm (Lawrence M. Rudner)
The second link describes a system that works similarly to the one described above. Their system works as follows:
- All the items that have not yet been administered are evaluated to determine which will be the best one to administer next given the currently estimated ability level
- The "best" next item is administered and the examinee responds
- A new ability estimate is computed based on the responses to all of the administered items.
- Steps 1 through 3 are repeated until a stopping criterion is met.
It describes a more complicated way of choosing questions which is backed up with statistics (which question should theoretically give us the most information about the user's skill level). This does not take into account the psychological effects of the difficulty of the questions, and it is geared towards determining the skill level of people as opposed to increasing it.
I propose we use an established system as described above to determine skill levels, but adapt it so that it also tries to train the students on their weak points and motivates them by asking the right questions at the right times. The speed at which the questions get answered is something that we can also consider taking into account (speed is not considered in the mentioned existing system). A sketch of the basic CAT loop is given below.
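The sketch uses a simple one-parameter (Rasch) model; this is a generic illustration rather than the exact method from the linked demo, and it does not yet include our proposed adaptations (training weak points, taking answer speed into account).

```typescript
// Sketch of the four CAT steps above under a Rasch model.
// Ability and item difficulty live on the same scale; all numbers are illustrative.

interface Item { id: string; difficulty: number; }
interface Response { item: Item; correct: boolean; }

// Probability of a correct answer under the Rasch model.
const pCorrect = (ability: number, difficulty: number) =>
  1 / (1 + Math.exp(-(ability - difficulty)));

// Step 3: re-estimate ability from all responses so far (coarse grid-search MLE).
function estimateAbility(responses: Response[]): number {
  if (responses.length === 0) return 0;
  let best = 0;
  let bestLogLik = -Infinity;
  for (let theta = -4; theta <= 4; theta += 0.05) {
    const logLik = responses.reduce((sum, r) => {
      const p = pCorrect(theta, r.item.difficulty);
      return sum + Math.log(r.correct ? p : 1 - p);
    }, 0);
    if (logLik > bestLogLik) { bestLogLik = logLik; best = theta; }
  }
  return best;
}

// Steps 1-2: administer the unseen item with the highest information at the current estimate.
function nextItem(items: Item[], seen: Set<string>, ability: number): Item | undefined {
  const information = (difficulty: number) => {
    const p = pCorrect(ability, difficulty);
    return p * (1 - p); // Fisher information of a Rasch item
  };
  return items
    .filter(i => !seen.has(i.id))
    .sort((a, b) => information(b.difficulty) - information(a.difficulty))[0];
}

// Step 4: repeat until a stopping criterion is met, e.g. a fixed number of items
// or when the remaining items would add little information (not shown here).
```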
Testing the prototype
In the test plan we describe how we want to test the prototype. Testing consists of two phases. First, we want to hold a focus group with teachers of a local high school in order to refine our concept. For the second phase, we want to let students of a local high school interact with the system. These plans are not yet final and are therefore subject to change. In case we are not able to test the system at a school, we will instead try to hold an experiment among students at the TU/e.