PRE2017 3 Groep8

= Testing the prototype =
In this section we describe how we want to test the prototype. Testing consists of two phases. First we want to hold a focus group with teachers of a local high school in order to refine our concept. For the second stage of testing we want to let students of a local high school interact with the system. These plans are not yet final and are therefore subject to change. In case we are not able to test the system at a school, we will instead try to hold an experiment among students at the TU/e.
=== 0LAUK0 - Focus group scheme draft ===
'''Participants: ideally 1 teacher of every course, or most math teachers.'''
'''Time: TBA'''
'''Location: TBA'''
'''Date: 19-03-2018 - 23-03-2018'''
'''Introduction'''
''[Welcome participants]''
Let us first introduce ourselves: we are students from Eindhoven University of Technology. We are currently working on the project "Robots everywhere". Within this project we attempt to do something new and interesting with the research done on robotics and artificial intelligence. Our team has chosen to research an interactive online learning system that can adapt homework to the strengths and weaknesses of students.
In this focus group we want to discuss the possibilities of an adaptive online learning system that can aid your students with their homework. This system will use some form of artificial intelligence to measure the performance of students and to decide which topics need the most attention in their homework. You, as teachers, can use these performance metrics to gain detailed insight into the performance of every student. This can also allow shy students to more easily show that they need attention on certain topics taught in your courses. The aim of this system is to motivate students to make efficient use of their time when doing homework; the system will supplement traditional lessons.
Moving on to today's planning. In this focus group you will discuss topics that are relevant to this system. We are interested in a discussion amongst our participants, to find out what you really find important, so that we can use that information to improve our design. Our role as researchers is to moderate this discussion.
Before we continue we would like you to sign these informed consent forms. Notes will be taken during this discussion, and an audio recording will be made. During analysis of this data your personal information will be removed. Everything you say here will be treated confidentially. On your request we will destroy the data after the research is completed.
We cannot stress enough that your participation in this focus group is completely voluntary. You have the right to withdraw from the focus group at any time. We will gladly answer your questions regarding the research. There may be certain questions which we cannot answer before the main discussion starts, but we will come back to them during the debriefing. To confirm that you agree to these conditions we kindly ask you to sign these forms. [hand over informed consent forms]
''[We should perhaps also look into an easy way to identify the participants for ourselves, for example with name tags.]''
'''Topics to discuss'''
* ''[to be expanded]''
* Opinions toward using such a system to enhance face-to-face teaching
* How would you like this system to work?
'''Debriefing'''
=== Test plan ===
==== Research question ====
Does the use of an adaptive learning agent affect students’ enjoyment and learning efficiency when compared to the use of a non-adaptive learning agent?
==== Hypothesis ====
The adaptive learning agent (ALA) is more pleasant for students to work with and improves the students’ learning efficiency when compared to a non-adaptive learning agent (NLA).
==== Research objective ====
To investigate the effects of using an adaptive learning agent — that is designed according to current research on flow theory, teaching models, gamification and e-learning — on teaching high-school students.
==== Method ====
A self-study period of at least 35 minutes is required.
The experiments start off with a short 5-minute test to assess the students’ initial level. This test contains more questions than the students can possibly finish in the five minutes. The tests will be scored based on how many answers the students gave and how many of those were correct. After that, the students will be given time to practice the subject matter. A third of the class will be given the ALA, another third the NLA. Both agents will be programmed with the subject matter of the most recent instruction. Neither of these groups will know to which group they belong. The remaining third will simply be given the book. After a fixed time, all students will be given another test, similar to the one they were given at the beginning. At the end, all students that used the learning agents will be given a short questionnaire about their experience, in which they can also indicate what they think the system lacks.
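The scoring described above can be made concrete with a small sketch. The data layout and the gain measure below are our own assumptions for illustration; the plan itself only states that both the number of answers given and the number correct count:

```python
from dataclasses import dataclass

@dataclass
class TestResult:
    attempted: int  # how many answers the student gave in the 5 minutes
    correct: int    # how many of those answers were correct

def accuracy(r: TestResult) -> float:
    """Fraction of given answers that were correct."""
    return r.correct / r.attempted if r.attempted else 0.0

def gain(pre: TestResult, post: TestResult) -> int:
    """Hypothetical per-student learning measure: the change in the
    number of correct answers between the pre-test and the post-test."""
    return post.correct - pre.correct
```

Comparing the average gain of the ALA, NLA and book groups would then give a first estimate of relative learning efficiency.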
==== Schedule ====
* 5-minute instruction
* 5-minute test on the subject matter
* Practicing (total study time minus 20 minutes)
* 5-minute test on the subject matter
* 5-minute reflection
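Since the four fixed slots take 20 minutes in total, the practice time follows directly from the length of the study period. A minimal sketch of that arithmetic:

```python
def schedule(total_minutes: int) -> dict:
    """Split a self-study period per the schedule above: four fixed
    5-minute slots, with the remainder available for practice."""
    if total_minutes < 35:
        raise ValueError("the test plan requires at least 35 minutes")
    return {
        "instruction": 5,
        "pre_test": 5,
        "practice": total_minutes - 20,
        "post_test": 5,
        "reflection": 5,
    }
```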
When testing is done using an online questionnaire, the participants will be separated into two groups, one with the ALA and one with the NLA. The schedule remains the same, except that the instruction will simply be written text, and no fixed time interval has to be taken into account. The same goes for the reflection. Participants ‘are done when they are done’.
Revision as of 21:45, 3 March 2018
Important Links
- Planning
- Google Drive folder (read only)
- Coaching Questions Group 8
Information about ideation
Problem Statement and objectives
Online learning refers to methods of learning that employ online educational technologies. The University of Phoenix was the first to employ online learning in 1989. Since that time, the popularity of online learning systems has greatly increased. In the school year 2013-2014, 75% of all United States districts offered online or blended courses for K-12 students (Connections Academy, n.d.).
Even though online learning systems are used by many students, there are still challenges regarding these learning methods. Challenges in online learning include keeping students motivated, increasing efficiency of learning and providing insights into learning points for students.
The objectives of this project are:
- To evaluate the current challenges in online learning.
- To evaluate the factors influencing the adoption of online learning systems.
- To evaluate the current approaches to online learning.
- To evaluate the effects of different learning styles on learning.
- To develop an online learning system that applies the knowledge acquired from the previous four objectives.
Who are the users?
Our research focuses on improving the quality of online education, with the focus on creating an online learning system that can be used by middle/high-school students. The online learning system that we propose will gather data regarding the performance of students in order to personalise their learning experience. Considering this system will be part of an existing education system, it will have to be integrated into the existing logistics of schools. For instance, teachers should be able to review the performance of students.
What are the user requirements?
For the students it is important that the questions are at the right level for them. If the questions are too easy they will get bored, and if the questions are too difficult they will get demotivated.
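One way to meet this requirement is a simple feedback rule that nudges the question difficulty up or down based on recent answers. The thresholds and the difficulty scale below are invented for illustration; the actual agent may use a more sophisticated model:

```python
def next_difficulty(current: int, recent: list, lo: int = 1, hi: int = 10) -> int:
    """Step difficulty up when the student gets most recent questions
    right (risk of boredom), down when most are wrong (risk of
    demotivation), and keep it otherwise. `recent` holds booleans:
    True for a correct answer."""
    if not recent:
        return current
    rate = sum(recent) / len(recent)
    if rate > 0.8:
        return min(hi, current + 1)   # too easy: raise difficulty
    if rate < 0.4:
        return max(lo, current - 1)   # too hard: lower difficulty
    return current                    # challenge and skill in balance
```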
For the teachers it is important to have insights into the results of their students so that they can judge the progress of their students.
Both for teachers and students it is important that the user interface of the assistant is clear. For students it needs to be clear what they are expected to do, and feedback needs to be provided when they submit a wrong answer. For teachers it should be easy to judge the progress of their students.
The most important requirement for the management of a school is that the online learning system can be integrated into the current style of education that is provided at the school. This means that the online learning system should be highly adaptive so that it can be adjusted to the specific needs of the specific schools it is implemented in.
Approach, milestones and deliverables
The group will be divided into two parts. One part will focus on creating the report and making suggestions and recommendations for implementation in the learning system. The other part will focus on developing the teaching assistant.
The team will first create a minimal usable product, which implements all requirements of the user. After this milestone is reached, the system will be tested and suggestions for improvement will be made. These will be implemented in the second (final) phase, which forms our second milestone.
The team sets out to create three deliverables:
- A working prototype; an online learning system which enables personalised learning.
- A report detailing:
- Our findings on the application of flow theory in education, and recommendations for applying this knowledge in the prototype.
- Our findings regarding challenges and good practices in online learning systems, and recommendations for applying these findings in the prototype.
- A presentation in which the aforementioned report is discussed and the prototype is presented.
Who does what?
In general group 8 works in a shared Google Drive folder (see important links). Our planning is specified in a separate file in the Google Drive folder. This planning document specifies for each task who is responsible and the timeframe in which the task needs to be completed. There are also some recurring tasks that are assigned to a group member. Please note that this wiki will be updated weekly to reflect the final state of the Google Drive folder 18 hours before the weekly meeting.
- E-mail (responsible for e-mail contact with lecturers): Wouter
- Wiki updater (responsible for updating the wiki on time): Mitchell
- Secretary (responsible for taking notes of feedback sessions): Nikki
References
- Connections Academy. (n.d.) “Infographic: Growth of K-12 Digital Learning.” Growth of K-12 Online Education Infographic, https://www.connectionsacademy.com/news/growth-of-k-12-online-education-infographic.
SotA: Summary of literature study
We conducted a literature study in order to gain a better understanding of concepts relevant to our problem statement. We identified several topics which we deemed interesting. This summary of the literature study focuses on five distinct topics.
Flow theory and education
Karen Wesson & Ilona Boniwell (2007) describe flow in the following way: being ‘in flow’ or ‘in the zone’ enables individuals to focus on tasks more fully and to maximise performance. They describe conditions that should be met to get people into flow, as well as ways to meet these in the context of coaching. Their list of conditions is as follows.
- Having clear goals
- Balancing challenge and skill
- Importance placed on doing well in an activity
- Maintaining goal congruence
- Receiving clear and immediate feedback
- Increasing autonomy
- Increasing absorption
Sheehan & Katz (2012) apply flow theory to physical education. They mention Csikszentmihalyi’s (1975) eight elements which they describe as follows:
- Balance between the difficulty of an activity and an individual’s proficiency. Is there an achievable perceived challenge?
- Apparent goals. Is there a clear objective that distinguishes pertinent from immaterial information?
- Immediate feedback. Is there personalized feedback being received in a timely manner?
- The harmony of action and awareness. Is there awareness of what’s happening without thinking about the need for this awareness?
- Focused concentration. Is the person able to concentrate on a limited stimulus field?
- Decreased self-consciousness. Is there awareness of internal processes and less emphasis on one’s self-image (while maintaining a sense of their physical reality)?
- Perception of Control. Is the person capable of adequately achieving the prescribed task and less concerned about perfection?
- Decreased awareness of time. Is there a feeling that the importance of time is diminished (losing track of time)?
Rodríguez-Ardura and Meseguer-Artola (2017) analysed the way flow relates to other constructs, such as challenge and control, in the context of e-learning, using a questionnaire distributed among the students of an established pure-online university. This analysis could teach us both what is needed to achieve flow and what the benefits of achieving flow are.
Ghani & Deshpande (1994) propose a model to examine optimal flow in human-computer interaction; this could tell us which aspects are most important for reaching optimal flow. The paper also studies the impact of task scope, which is the motivating potential of a job. It is noteworthy that this paper dates from a time when human-computer interaction was relatively new.
Education by human teachers
A large-scale initiative to develop more efficient teaching methods in the USA called Follow Through resulted in the development of the Direct Instruction (DI) teaching model, as well as a monitoring method called Precision Teaching (PT) (Moran & Malott, 2004).
DI aims to improve learning outcomes by increasing clarity in the learning process (Kinder & Carnine, 1991). On top of that, DI’s strongest focus is on repetition and continuous practice. At the beginning of the program, students are tested to determine their current skill level, and are then placed in groups of students with the same skill level. Only 10% of each lesson is new material; the rest is repetition and practice of previous study material (National Institute for Direct Instruction, 2018).
In PT, the teacher has more of a guiding role and the students reflect on themselves more (Lindsley, 1992). The learning curve is monitored using a so-called Standard Celeration Chart (SCC), which allows comparison of the learning curves of different students for the same task, or of different tasks for the same student. This is thanks to the standard format of the SCC, in addition to the logarithmic scale, which allows many numbers to be compared in their own order of magnitude (FHSS Information Architect, 2017). Both methods have given promising results, also in the long term, after students have followed regular teaching for several years (Becker & Gersten, 1982)(Gersten, Keating & Becker, 1988). The combination of the two has also been shown to have a positive effect on the learning process (Binder & Watkins, 1990).
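The celeration read off an SCC is a multiplicative change in response rate per week, which is why the chart uses a logarithmic scale: a constant multiplicative trend appears as a straight line. A sketch of the computation (the function and its interface are our own illustration, not taken from the PT literature):

```python
def weekly_celeration(first_rate: float, last_rate: float, weeks: int) -> float:
    """Multiplicative change in response rate (e.g. correct answers per
    minute) per week. A value of 2.0 means the rate doubles each week,
    conventionally written "x2" on a Standard Celeration Chart."""
    if weeks < 2 or first_rate <= 0:
        raise ValueError("need at least two weekly measurements and a positive rate")
    return (last_rate / first_rate) ** (1 / (weeks - 1))
```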
Learning Styles
Learning styles aim to account for different ways of learning that individuals employ. According to these theories, people can be classified according to the way in which they learn, their ‘learning style’. Different learning style models have been identified over the years; Coffield, Moseley, Hall & Ecclestone (2004) identified a total of 71 different learning style models, and determined five families of learning styles:
- Learning styles that are constitutionally based.
- Learning styles that reflect cognitive structure.
- Learning styles as a component of personality types.
- Learning styles as flexible learning preferences.
- Models that move on from learning styles towards learning approaches, strategies and orientations.
Moving from top to bottom along this list, we start with theories that hold that learning styles are fixed, and move towards models of dynamic learning styles that take personal and environmental factors into account.
While much research has been done regarding learning styles, and many schools implement them, much of the research has not led to conclusive results. There is also a considerable amount of criticism of applying learning styles in education. Many researchers have found that there is a lack of evidence for the effectiveness of learning style models (Lilienfeld, Lynn, Ruscio & Beyerstein, 2011)(Rohrer & Pashler, 2012), or that their effectiveness is a self-fulfilling prophecy (Gurung & Prieto, 2009). Glenn (2009) states that instead of adapting the style to the students, the style should be matched with the content. He states that some concepts are best learned through hands-on work, while others are best taught through lectures or discussions.
Future of E-learning
Among teachers, it is expected that cost will have the largest impact on education. Furthermore, teachers regard student retention to be the biggest problem in online courses as opposed to face-to-face teaching (Allen, 2015). Lower computer skills are also found to relate to lower learning outcomes, although the impact of this problem is decreasing as more and more children have access to, and thus gain experience using, a computer or computerised device (Welsh, 2003). Students find ease of access and use to be the most important beneficial aspect of e-learning, and teachers find the gathering of data on the students’ progress most useful. Both want the system to run smoothly and stably (Pollock, 2018). The necessity of solving this latter problem is also backed by research finding that instability is detrimental to the effectiveness of an e-learning course (Welsh, 2003).
Another big problem in the reception of e-learning is that it often does not support peer-to-peer networking (Wang, 2010; Welsh, 2003). An e-learning system is often difficult to integrate and requires thorough planning and discussion between all parties involved before it can be successfully introduced. If this is not done carefully, the e-learning system can actually have adverse effects (Delgado-Almonte, 2010). This also makes the infrastructure, ease of use, and the ability to change the system when needed very important (Welsh, 2003).
E-learning should also provide learners with an incentive to complete the course. However, at the same time the course should allow learners to quickly skip through the information in a course in order to find and (re-)learn only one particular aspect of it. Lastly, the ease of e-learning may give management the impression that e-learning is all that is required for a student to learn the subject matter. E-learning, however, should not completely replace all studying, but should always be backed by face-to-face lessons (Welsh, 2003).
Gamification
Deterding et al. (2011) aim to investigate the origins of gamification and how it relates to serious games, pervasive games, alternate reality games and playful design. The paper suggests that gamified applications provide insight into new gameful phenomena that complement playful phenomena. The agreed-upon definition states that gamification entails the use of game design elements in non-game contexts. Gamification is a new term for an older phenomenon; several precursors and parallels exist. Already in the early 1980s (Deterding et al., 2011, p. 2), research was performed in HCI to redress routine work using game elements.
Hamari et al. (2014) performed a literature study of peer-reviewed empirical studies on gamification. Their aim was to create a framework for examining the effects of gamification using definitions of gamification and motivational affordances. The paper gives insight into the experiments performed in the peer-reviewed studies. Hamari and Huotari stress that gamification should invoke the same psychological experiences that games invoke. Deterding, on the other hand, argues that affordances in gamified systems should be the same ones that are used in games. The studies that were included in the literature review used any of the following motivational affordances: points, leaderboards, achievements/badges, levels, story/theme, clear goals, feedback, rewards, progress, and challenge. The majority of studies focused on education/learning, intra-organizational systems, and work, but there were also studies on commerce, health/exercise, sharing, sustainable consumption, innovation/ideation, and data gathering. The paper concludes that gamification does appear to work, but that there are caveats. Quantitative research concluded that positive effects only existed in part of the considered relationships between gamification and the studied outcomes. Qualitative research showed that there may be underlying confounding factors that influence the effectiveness of gamification. The authors also state that more rigorous methodologies ought to be used in further research. The suggestions they give may be of use for our project in 0LAUK0.
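Several of these motivational affordances are mechanically simple. As an illustration, a points-to-levels mapping; the thresholds are invented for this sketch, not taken from any of the reviewed studies:

```python
def level_for(points: int, thresholds=(0, 100, 250, 500)) -> int:
    """A student's level is the number of point thresholds they have
    reached, so levels start at 1 and rise with accumulated points."""
    return sum(1 for t in thresholds if points >= t)
```

In our system, points could be awarded for completed exercises, with level-ups providing the feedback and progress affordances listed above.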
In Deterding (2012), various views on gamification are presented by people involved in industries where gamification is relevant. Judd Antin, a social psychologist in the Internet Experiences research group at Yahoo! Research, remarks that gamification is a positive trend in that it signals a shift away from pecuniary and instrumental rewards. When done right, gamification can make use of powerful social psychological processes, such as self-efficacy, group identification and social approval, to aid long-term performance. Unfortunately, many modern applications of gamification lack the ability to account for differences in individuals and contexts. Elizabeth Lawley, professor of interactive games and media and founder and director of the Lab for Social Computing at Rochester Institute of Technology, also argues that modern applications of gamification reduce well-designed games to their simplest components. These implementations may fail to engage players, but they might also damage existing interest or engagement with the service or product. She worked on “Just Press Play”, an achievement system for students in interactive games and media at the Rochester Institute of Technology. This system may be relevant for our study for 0LAUK0. Rajat Paharia, founder and chief product officer of Bunchball, describes in his section how his company designs gamified systems. He stresses the importance of context, and that for gamification to work, the goal that is gamified needs to have a core intrinsic value.
Lawley (n.d.) reflects on issues that the first version of “Just Press Play” suffered from (see also Deterding, 2012). Just Press Play is a gamified system designed at Rochester Institute of Technology, meant to help new students find their way around campus and to get them out of their comfort zone to partake in the university’s activities. Just Press Play is an achievement system; the original version used achievements based on internal system triggers (e.g. completing the tutorial), administratively assigned achievements (e.g. a certain percentage of the class manages to finish a difficult course), user-submitted content (e.g. photos of things around the campus), collectible cards with a special code on them, and RFID keychains that can be used to receive credit for attending events. Due to technical issues the collectible cards and RFID keychains did not work out properly. In the second version of Just Press Play, RFID tags were replaced with QR code stickers that students can place on (for instance) their campus card or phone, which they can scan at events. Collectible cards are printed offsite and distributed to students after they unlock an achievement. Since the original cards were very popular, there are plans to make a card game using these cards. Privacy aspects and the stability of the system were improved, and the categorization of the achievements was modified; there are now achievement quadrants (create, learn, socialize, explore).
Nicholson (2015) describes six concepts (Reflection, Exposition, Choice, Information, Play, and Engagement) to help designers implement gamification in a meaningful way. Gamification can help users find personal connections, thereby motivating engagement. Nicholson argues that reward-based gamification (akin to operant conditioning) can lead to short-term improvements, but other game-based elements should be used to facilitate long-term change. He also argues that gamification should not be permanent. Reward-based gamification can be used to ease a user into a certain task, meaningful gamification can be used to strengthen the behavior, but eventually the user will get bored of the gamified system. As such gamification should be designed to ease the user into the real world context of the task.
In Huang et al. (2013), gamification in education is discussed. It is stated that gamification is a specific application of “nudging”. A five-step process is discussed that can help in making a gamified system:
- Understanding the target audience and context
- Defining learning objectives
- Structuring the experience
- Identifying resources
- Applying gamification elements
In understanding the target audience and context it is also important to take into account the length of the learning program, where the program is conducted (classroom/at home), and whether students work in groups (and how large these groups are). There are several common pain points in education that need to be considered:
- Focus: younger students are more easily distracted.
- Skills: students may be demotivated to try when the task is too difficult or when they lack the skills or knowledge required to complete it.
- Physical, mental and emotional factors: fatigue, hunger, or emotions are factors that can affect a student’s learning abilities or other pain points.
- Motivation: young adults and adolescents commonly lack motivation.
- Pride: adults may believe they already know what is being taught; they may also choose to study material that is well above their skill/knowledge level. This issue may also occur when the instructor is younger than the students.
- Learning environment and nature of the course: this pain point consists of properties of the course, such as class size and structure of the program.
The paper discusses in an example how a math class can be structured such that gamification can be applied to it; this is very relevant to our project in 0LAUK0. Furthermore, a distinction is made between push and complete, where complete entails understanding the concepts in each stage, and push entails the motivation to go to the next stage. Lastly, Huang et al. (2013) also categorize game mechanics into self-elements and social elements:
Self-elements (complete stage) | Social elements (push stage)
---|---
Points | Leaderboards
Levels | Virtual goods
Trophies/badges | Interactive cooperation
Virtual goods | Storyline
Storyline | -
Time restrictions | -
Aesthetics | -
Arnold (2014) discusses in his paper, among other things, Bartle’s four basic categories of gamers, and how these categories are (mis)used in gamification. When making a gamified system it is important to note that not all gamer categories like the same game elements (e.g. socializers do not care for leaderboards).
- Socializers: more interested in having relations with the other players than playing the game.
- Achievers: competitive and enjoy beating challenges.
- Killers: provoke and cause drama in the scope of the virtual world.
- Explorers: like to explore the geography of the world as well as the mechanics of the game.
Online learning systems
Seven Principles For Good Practice in Undergraduate Education
[Abstract] The Seven Principles for Good Practice in Undergraduate Education grew out of a review of 50 years of research on the way teachers teach and students learn (Chickering and Gamson, 1987, p. 1) and a conference that brought together a distinguished group of researchers and commentators on higher education. The primary goal of the Principles’ authors was to identify practices, policies, and institutional conditions that would result in a powerful and enduring undergraduate education. (Sorcinelli, 1991, p. 13)
In Chickering and Gamson (1987), the Seven Principles of Good Practice in education are laid out. These practices help students learn more effectively. They are:
- Contact between student and faculty. “Faculty concern helps students get through rough times and keep on working. Knowing a few faculty members well enhances students’ intellectual commitment and encourages them to think about their own values and future plans.” Discussion groups are a valuable tool for this.
- Cooperation among students. “Working with others often increases involvement in learning.” This can be accomplished using peer tutoring, group work - possibly in a project setting - or seminars.
- Active learning. Active learning is a method in which the student learns by working with the course material. “Students do not learn much just by sitting in classes listening to teachers, memorizing prepackaged assignments, and spitting out answers.” Examples here are exercises, discussions, (team) projects, peer critiques and internships.
- Good feedback. Assess what the student knows, and more importantly what they don’t know. Give timely feedback, so that the student can incorporate it. The feedback needs to be frequent. Students should also learn to assess themselves.
- Time management. “Time plus energy equals learning. There is no substitute for time on task. Learning to use one’s time well is critical for students and professionals alike.” Make sure students spend time on a task and that students use their time efficiently. Tools: Mastery learning, contract learning, computer assisted instruction.
- High Expectations. “Expecting students to perform well becomes a self-fulfilling prophecy when teachers and institutions hold high expectations of themselves and make extra efforts.” Communicate the expectations, create programs out of the curriculum.
- Diverse Talents and Ways of Learning. “Students need the opportunity to show their talents and learn in ways that work for them.” Develop multiple ways for students to learn and work.
Principles for Good Practice in Undergraduate Education: Effective Online Course Design to Assist Students’ Success
[Abstract] The purpose of this study was to apply the Seven Principles for Good Practice in Undergraduate Education (Chickering & Gamson, 1991) to online course design to enhance students’ success in an online course. A survey was created to determine students’ perception of strategies and skills they perceived as important to complete an online course. The survey was created based on behavioral learning, cognitive learning, and social learning frameworks. The responses of the 179 students in this study in an undergraduate Computer Applications in Business course at a large southeastern university were categorized by the Seven Principles. Results of the survey showed the course design strategies and what students valued matched well with the Seven Principles. Implications of the study provide evidence that good course design embeds the seven principles to ensure students are successful in the online learning environment. (Crews et al., 2015)
An online design that takes these seven principles into account can be perceived as a good system by the students using it. The literature review is useful; it lists the disadvantages of online course design as noted by Clark (2003):
- discussions that are not connected in time and seem disjointed;
- lack of clear guidelines for participation;
- lack of engagement in an asynchronous environment;
- difficulty in collaborative online projects; and
- lack of communication with the instructor and other students.
These points should be taken into account when designing an online learning system. Salmon (2002) and Huang (2002) say online systems should focus on:
- access
- motivation
- socialization
- information exchange
- knowledge construction
- interactive learning
- collaborative learning
- facilitating learning
- authentic learning
- student centered learning
Implementing the Seven Principles: Technology as Lever
[Abstract] In March 1987, the AAHE Bulletin first published “Seven Principles for Good Practice in Undergraduate Education.” With support from Lilly Endowment, that document was followed by a Seven Principles Faculty Inventory and an Institutional Inventory (Johnson Foundation, 1989) and by a Student Inventory (1990). The Principles, created by Art Chickering and Zelda Gamson with help from higher education colleagues, AAHE, and the Education Commission of the States, with support from the Johnson Foundation, distilled findings from decades of research on the undergraduate experience. Since the Seven Principles of Good Practice were created in 1987, new communication and information technologies have become major resources for teaching and learning in higher education. If the power of the new technologies is to be fully realized, they should be employed in ways consistent with the Seven Principles. Such technologies are tools with multiple capabilities; it is misleading to make assertions like “Microcomputers will empower students” because that is only one way in which computers might be used. Any given instructional strategy can be supported by a number of contrasting technologies (old and new), just as any given technology might support different instructional strategies. But for any given instructional strategy, some technologies are better than others: Better to turn a screw with a screwdriver than a hammer; a dime may also do the trick, but a screwdriver is usually better. This essay, then, describes some of the most cost-effective and appropriate ways to use computers, video, and telecommunications technologies to advance the Seven Principles. (Chickering et al., 1996)
- Good Practice Encourages Contacts Between Students and Faculty. Technology can be very useful here. Digital questions can be graded more quickly than, for example, physical homework. Students who are shy or otherwise unable to communicate with the teacher face to face can do so more easily and safely through online communication. Language barriers are also lower when people have more time to interpret the questions.
- Cooperation. The same story applies here: communication between students is improved.
- Active learning. The internet offers great opportunities for researching topics. Computer software can be used to encourage active learning through software-based homework. Simulations can cover what is not feasible or otherwise cumbersome in real life; physics simulations are an example of this.
- Feedback: “Computers also have a growing role in recording and analyzing personal and professional performances. Teachers can use technology to provide critical observations for an apprentice; for example, video to help a novice teacher, actor, or athlete critique his or her own performance.” Next to this, computers can be used to store past performances, which teachers can later use to evaluate growth.
- Time on task: working from home can save students time otherwise spent commuting. Technology can be used to document time on task and possibly communicate this back to the student.
- High Expectations: “Many faculty report that students feel stimulated by knowing their finished work will be “published” on the World Wide Web. With technology, criteria for evaluating products and performances can be more clearly articulated by the teacher, or generated collaboratively with students. General criteria can be illustrated with samples of excellent, average, mediocre, and faulty performance. These samples can be shared and modified easily. They provide a basis for peer evaluation, so learning teams can help everyone succeed. ”
- Diverse talents and ways of learning. Give students who can handle it freedom. Give those who can’t extra attention. Students with similar learning styles, or who need each other for learning can be brought together.
The article also mentions that simply using technology is not enough. It must be in line with the seven principles. Technology should motivate the student, i.e. with materials that are problem oriented, relevant to real world problems or interactive.
Computer-Supported Collaborative Learning in Higher Education: An Introduction
[Abstract] The rapidly increasing use of computers in education, and in particular the migration of many university courses to web-based delivery, has caused a resurgence of interest among educators in non-traditional methods of course design and delivery. This chapter provides an introduction to the field of computer-supported collaborative learning (CSCL). First, some of the major benefits are listed. Then, some of the common problems are described, and solutions are either given or pointed to in the literature. Finally, pointers are given to some of the more recent research in this area. (Roberts, 2005)
References
- Allen, I. E., & Seaman, J. (2015). Grade Level: Tracking Online Education in the United States. Babson Survey Research Group.
- Arnold, B. J. (2014). Gamification in education. ASBBS Proceedings, 21(1), 32. Retrieved from https://search.proquest.com/docview/1519057772?pq-origsite=gscholar
- Becker, W.C., & Gersten, R. (1982). A Follow-up of Follow Through: The Later Effects of the Direct Instruction Model on Children in Fifth and Sixth Grades. American Educational Research Journal, 19(1), 75-92.
- Binder, C., & Watkins, C. L. (1990). Precision teaching and direct instruction: Measurably superior instructional technology in schools. Performance Improvement Quarterly, 3(4), 74-96.
- Chickering, A. W., & Gamson, Z. F. (1987, March). Seven principles for good practice in undergraduate education. American Association for Higher Education Bulletin, 39(7), 3–6. Retrieved from http://www.lonestar.edu/multimedia/SevenPrinciples.pdf
- Chickering, A. W., & Ehrmann, S. C. (1996). Implementing the Seven Principles: Technology as Lever. American Association for Higher Education Bulletin, 49, 3-6.
- Coffield, F., Moseley, D., Hall, E., & Ecclestone, K. (2004). Learning styles and pedagogy in post-16 learning: a systematic and critical review. LSRC reference, Learning & Skills Research Centre.
- Crews, T. B., Wilkinson, K., & Neill, J. K. (2015). Principles for good practice in undergraduate education: Effective online course design to assist students' success. Journal of Online Learning and Teaching, 11(1), 87. Retrieved from http://virtualchalkdust.com/wp-content/uploads/2016/02/Crews_0315.pdf
- Delgado-Almonte, M., Andreu, H. B., & Pedraja-Rejas, L. (2010). Information technologies in higher education: Lessons learned in industrial engineering. Educational Technology & Society, 13(4), 140-154.
- Deterding, S. (2012). Gamification: designing for motivation. interactions, 19(4), 14-17. Retrieved from https://www.researchgate.net/profile/Sebastian_Deterding/publication/244486331_Gamification_Designing_for_motivation/links/0a85e53a049814673c000000.pdf
- Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011, September). From game design elements to gamefulness: defining gamification. In Proceedings of the 15th international academic MindTrek conference: Envisioning future media environments (pp. 9-15). ACM. Retrieved from https://dl.acm.org/citation.cfm?id=2181040
- FHSS Information Architect, (2017, July) Precision Teaching: Concept Definition and Guiding Principles, Retrieved from https://psych.athabascau.ca/open/lindsley/concept.php
- Gersten, R., Keating, T., & Becker, W. (1988). The continued impact of the Direct Instruction model: Longitudinal studies of Follow Through students. Education and Treatment of Children, 318-327.
- Ghani, J. A., & Deshpande, S. P. (1994). Task characteristics and the experience of optimal flow in human—computer interaction. The Journal of psychology, 128(4), 381-391.
- Glenn, D. (2009, December 15). Matching Teaching Style to Learning Style May Not Help Students. Retrieved February 21, 2018, from https://www.chronicle.com/article/Matching-Teaching-Style-to/49497
- Gurung, R.A.R, & Prieto, L.R. (2009). Learning styles as self-fulfilling prophecies. In Getting Culture: Incorporating Diversity Across the Curriculum (pp. 45-81). Stylus.
- Hamari, J., Koivisto, J., & Sarsa, H. (2014, January). Does gamification work?--a literature review of empirical studies on gamification. In System Sciences (HICSS), 2014 47th Hawaii International Conference on (pp. 3025-3034). IEEE. Retrieved from http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=6758978
- Huang, W. H. Y., & Soman, D. (2013, December). A Practitioner's Guide to Gamification of Education. Retrieved from https://inside.rotman.utoronto.ca/behaviouraleconomicsinaction/files/2013/09/GuideGamificationEducationDec2013.pdf
- Kinder, D., & Carnine, D. (1991). Direct Instruction: What It Is and What It Is Becoming. Journal of Behavioral Education, 1(2), 193-213.
- Lawley, E. L., & Phelps, A. (n.d.). “You Know You’re Going to Fail, Right?”: Learning From Design Flaws in Just Press Play at RIT. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.707.2212&rep=rep1&type=pdf
- Lilienfeld, S.O., Lynn, S.J., Ruscio, J., & Beyerstein, B.L. (2011). Myth #18 Students Learn Best When Teaching Styles Are Matched to Their Learning Styles. In 50 Great Myths of Popular Psychology (pp. 92-99). Wiley-Blackwell.
- Lindsley, O. R. (1992). Precision teaching: Discoveries and effects. Journal of Applied Behavior Analysis, 25(1), 51.
- Moran, D.J. & Malott, R.W. (2004), Evidence-Based Educational Methods. Academic Press
- National Institute for Direct Instruction. (2018, February 16). Basic Philosophy of Direct Instruction (DI). Retrieved from https://www.nifdi.org/what-is-di/basic-philosophy
- Nicholson, S. (2015). A recipe for meaningful gamification. In Gamification in education and business (pp. 1-20). Springer, Cham. Retrieved from http://scottnicholson.com/pubs/recipepreprint.pdf [pdf is preprint version with custom citation]
- Pollock, B., & Al-Bataineh, A. (2018). Benefits of current educational technology: A comparison of student and teacher preparations in a rural Illinois school district. The Online Journal of Distance Education and e-Learning, 6(1), 17.
- Roberts, T. (2005). Computer-supported collaborative learning in higher education: An introduction. In Computer-Supported Collaborative Learning in Higher Education (ch. 1). doi:10.4018/978-1-59140-408-8.ch001
- Rodríguez-Ardura, I. and Meseguer-Artola, A. (2017), Flow in e-learning: What drives it and why it matters. Br J Educ Technol, 48: 899–915. doi:10.1111/bjet.12480
- Rohrer, D., & Pashler, H. (2012). Learning styles: Where’s the evidence? Medical Education, 46, 34-35
- Sorcinelli, M. D. (1991), Research findings on the seven principles. New Directions for Teaching and Learning, 1991: 13–25. doi: 10.1002/tl.37219914704
- Sheehan, D. P., & Katz, L. (2012). The practical and theoretical implications of flow theory and intrinsic motivation in designing and implementing exergaming in the school environment. Loading... The Journal of the Canadian Game Studies Association, 6(9).
- Wang, M., Ran, W., Liao, J., & Yang, S. J. (2010). A performance-oriented approach to e-learning in the workplace. Journal of Educational Technology & Society, 13(4), 167.
- Welsh, E. T., Wanberg, C. R., Brown, K. G., & Simmering, M. J. (2003). E‐learning: emerging uses, empirical results and future directions. international Journal of Training and Development, 7(4), 245-258.
- Wesson, K., & Boniwell, I. (2007). Flow theory–its application to coaching psychology. International Coaching Psychology Review, 2(1), 33-43.
Impact of Literature Study
Based on the literature study that has been done, several items have been deemed important to include in our design. The impact of each of the topics which we researched is described below.
Flow Theory
Wesson and Boniwell (2007) describe seven conditions for getting in the flow. The following items are deemed important to include in the design of our learning assistant:
- Balancing challenge and skill: for our system this means that the level of the questions will be adapted to the individual’s capabilities with regard to certain topics.
- Having clear goals: for our system this means that a clear goal needs to be stated that a student can work towards by working with the system. For instance, for mathematics homework a goal could be to be able to apply the concepts taught in a certain chapter of the mathematics book that they use.
- Receiving clear and immediate feedback: for our system this implies that the student should be able to read their feedback immediately after answering the question. In an ideal system this feedback would also be adapted to the answer that the students has provided.
All three of these items are conditions for people to get ‘in flow’, enabling them to focus better on tasks and perform better.
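As a minimal sketch of the first condition (balancing challenge and skill), the system could nudge the difficulty of the next question toward the student's current level. The 0-1 scales, function name and step size are illustrative assumptions:

```python
def next_difficulty(skill: float, last_correct: bool, step: float = 0.1) -> float:
    """Pick the next question difficulty so that challenge tracks skill.

    `skill` and the returned difficulty are on an assumed 0-1 scale:
    raise the difficulty slightly after a correct answer, lower it
    slightly after an incorrect one.
    """
    target = skill + (step if last_correct else -step)
    return min(1.0, max(0.0, target))
```

In the full system, `skill` would be derived per topic from the performance metrics described later in this document.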
Education by Human Teachers
Moran and Malott (2004) describe two important teaching models, the Direct Instruction (DI) teaching model and the Precision Teaching (PT) model. Both of these teaching models, as well as combinations of the two, have been shown to have a positive effect on the learning process, which is why we aim to include items from both models in our system.
The DI teaching model focuses on the importance of repetition and continuous practice of topics. Based on this teaching model we propose that our system should not entirely omit questions on topics that students scored well on in the past; instead it should ask questions on these topics that match the student's level.
The PT teaching model allows students to reflect on their own work. Based on precision teaching we propose to allow students to indicate the level they think they are at regarding a specific topic. The level indicated by the student will be taken into account by the system to select the initial difficulty of questions; the difficulty will still be adapted based on the student's performance.
Learning Styles
During our research into learning styles we found that much of the research did not lead to conclusive results, and that there is a lack of evidence for the effectiveness of learning style models. This is why we will not adapt our system to the learning style of students. Research has indicated, however, that the teaching style should match with the topics that one is trying to teach.
Gamification
As Nicholson (2015) describes, reward based gamification (akin to operant conditioning) can lead to short term improvements, but different game-elements should be used to facilitate long term improvement. Gamification should also not be permanent.
For our system this means that when the student starts with a course, reward-based gamification can be applied. As the student masters the easy level of homework we need to give them more freedom and enable them to find intrinsic value in the subject material. At this point we make the gamification less reward-based (sparser points, less emphasis on levels, etc.) and move on to meaningful gamification (using Nicholson's RECIPE elements: Play, Exposition, Choice, Information, Engagement, Reflection). Eventually, when the student has found intrinsic value in the course material and is adequate at the topics presented in the course, they will find that the gamification elements no longer add to the learning process. As such we need to ease the student into homework that is no longer gamified. We could ask the student how they felt about their last homework session, and use that information, combined with their performance metrics, to determine what kind of gamification to apply in the next homework session.
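This phased approach could be sketched as a simple mode selector. The thresholds and mode names below are illustrative assumptions for our system, not values taken from Nicholson's work:

```python
def gamification_mode(mastery: float, found_intrinsic_value: bool) -> str:
    """Choose a gamification style for the next homework session.

    `mastery` is the student's estimated mastery in [0, 1]; the
    threshold values here are placeholders to be tuned.
    """
    if found_intrinsic_value and mastery > 0.8:
        return "none"          # ease the student out of gamification
    if mastery > 0.4:
        return "meaningful"    # Nicholson's RECIPE elements
    return "reward-based"      # points, levels, badges
```

The student's answer to "how did the last session feel?" could feed into the `found_intrinsic_value` flag.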
Using the work of Huang and Soman (2013), it becomes important to thoroughly define the following constructs in order to guide the design of the system:
- Understanding the target audience and context. This will change depending on the type of school that the system is designed for (for instance high school, elementary school, or a school for special-needs students); another important factor for defining the context is the study program. All in all, the team should look thoroughly in order to understand the target audience and context of use as well as possible.
- Defining learning objectives (this will change on a course to course basis)
- Structuring the experience. Regarding learning, the experience could be structured based on the topics discussed during the course, as well as the difficulty level of subtopics.
- Identifying resources
- The application of gamification elements can follow based on Nicholson's work (2015).
From the work of Arnold (2014) we find that it is beneficial to figure out what type of gamer the user is. Not all gamer categories like the same game elements, therefore not all types of gamification may be appreciated by the user. We could for instance ask the user when they make their profile what types of games they like to play and why they like to play them, and determine a gamer type based on that information.
Online Learning Systems
The research of Chickering and Gamson (1987) gives good insight into practices that an online learning system should follow. When applied to intelligent adaptive online learning systems, the practices could be implemented as follows:
- Maintain contact between the student and the faculty. This means that the online learning system should be a part of the education process, not a replacement for face-to-face teaching. Lectures and question hours need to be maintained next to the online system. One way to enhance this functionality would be to allow students to ask questions to the teachers, or make a suggestion that a topic receives extra attention in the next lecture. Using these techniques (as well as feedback on assignment performance) the bar is lowered for shy students or students that have difficulty in communicating to interact with their faculty.
- In order to facilitate active learning, the system should be able to produce and handle questions of varying nature. This could be facilitated by implementing projects (larger assignments that students optionally tackle in groups) or peer critiques.
- The system can facilitate good feedback if it is able to quickly and accurately give feedback regarding the performance of the student. The system should not only indicate that the answer was wrong, but it also has to indicate where exactly issues pop up in the answers of the student. By using these performance metrics teachers can adapt the course, in which case changes to the course can also be considered feedback.
- The system facilitates time management by adaptively choosing which homework is suitable for a student. Instead of working endlessly on easy exercises and not having enough time to work on the most difficult ones, the system ensures that students spend their time well. For elementary and high school students it may be beneficial to let the system help the student in planning when they need to work on their homework.
- The performance data gathered by the system can help the education program make extra efforts to help students progress through the course material. The system can communicate expectations that the school has by setting up milestones that students have to fulfill.
- The system should communicate clear guidelines for participation to the student.
Future of e-Learning
The system should be designed in such a way that even students who do not possess outstanding computer skills can use all the relevant components of the system. Although the problem of lower computer skills is decreasing as more children use these technologies, the design of the system should be inclusive enough that outliers of this trend can also use the system seamlessly. The design of the system should also be inclusive for teachers who do not possess outstanding computer skills.
It is of utmost importance that the system is reliable. As Pollock and Al-Bataineh (2018) describe, ease of access is highly regarded by students. Teachers are interested in capabilities that entail information gathering. To summarize, the UI presented to students should be easy to use, and it is of utmost importance that the system is up and running at all times (minimal downtime for maintenance, and good recovery from errors). The information gathered about the students' performance should be presented in such a way that teachers can quickly discover where students are having issues. Perhaps a way to do this is to generate summary reports about the performance of the student. When the teacher accesses the profile of the student this summary is the first thing they see. For initial versions of the system, it can prove useful to interview the teachers who will use the system in the future.
In order to ascertain maximum ease of use, the system should integrate nicely into the other systems offered by the school. If the integration of the system is subpar (for instance, it takes unnecessary steps for teachers to access information) then the system may have adverse effects on the quality of learning. It goes without saying that teachers will be less likely to want to use the system if it is hard to use.
An important feature that the system should support entails the ability to quickly skim the contents of the entire course. In our system, students should be able to nudge the system so that they can receive assignments based on topics that the student wants to train again, even if the student already has a sufficient score for this topic.
The system should also purely focus on improving the homework experience. Face-to-face lessons should not be replaced by the system.
USE aspects
User
The system has two primary groups of users: the students and the teachers. This system can be very desirable for the teacher as it can give detailed insight into the performance of students. Using this information the teacher can figure out more easily what topics he should focus on more in lectures, and which students should receive more attention regarding questions about the course material. The system can also be very desirable for students, as it can help with learning how to deal with course material; e.g. how they should plan their homework and make the most efficient use of their available time to learn as much as possible. Furthermore, because the system uses flow theory education may become less boring and more engaging to students. This in turn may motivate them to strive to complete more difficult education programs. To summarize, when it comes to user aspects of the system, it can make education more personal and better adjusted to the needs of the students.
Society
As was discussed in the section on user aspects of the technology, the main benefit of the system is that high school education can become more personalized to the needs of the students. If the system succeeds in making education less boring and more engaging, more students may be able to keep up in higher forms of education. As such the general level of education in society may increase, which is beneficial in a highly automated information society.
Enterprise
The system can have large implications for enterprises involved in education. If the system is used by many schools, the media companies that write textbooks may well adapt course material to be more compatible with it. Schools, as enterprises, need to adapt their infrastructure so that the system can run in tandem with the school's remaining digital infrastructure; as such, schools will have to invest time and money in integrating the system. Whether the system will be put to use will vary from school to school: some schools might find the cost of setting up the system too high compared to the increase in student performance it can deliver. Third-party companies might step in and host the services that the system requires, so that it becomes easier and/or cheaper for schools to start using it, as they no longer have to host the service themselves.
Outline of system functionality
NB: this is a description of a fully featured version of the system (it shows how the system would look like if it were to be implemented for an entire school), some of these items do not apply to the prototype (which focuses only on one course). Certain items on this list are too difficult to make in the limited time/programming experience we have, these items are recognizable as they include (advanced) in their description. Other features that are not necessary for core system functionality are written in italics. Even though the prototype will not include all the items featured here, it might be worthwhile to show what our intentions/vision for the system is. The lists below show which subcomponents are needed in the system, and what tasks each subcomponent needs to fulfill.
Server
- Maintains student profiles, which contain:
- Username
- Password (unique to this service)
- Real name of student (so that results can be linked to the school’s administration)
- Interests (advanced)
- Courses that the student partakes in
- Performance on topics that are handled in courses, for example:
- Course: history (4th year VWO)
- Knowledge on Industrial revolution (2/10) -> use introductory homework (amount: relatively many)
- Knowledge on renaissance (10/10) -> use advanced questions (amount: relatively few)
- Knowledge on stone age (6/10) -> use medium questions (amount: numIndRevolution > n > numRenaissance)
- Homework list (based on performance measure outlined above)
- Maintains course profiles, which contain:
- The topics that the course consists of (e.g. the chapters of a math book)
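A minimal sketch of these server-side profiles as data structures might look as follows. The field names and level thresholds are assumptions derived from the history example above, not a finalized schema:

```python
from dataclasses import dataclass, field

@dataclass
class TopicPerformance:
    """Performance on one topic, scored 0-10 as in the history example."""
    topic: str
    score: int

    @property
    def level(self) -> str:
        # thresholds are illustrative: 2/10 -> introductory,
        # 6/10 -> medium, 10/10 -> advanced, as in the example above
        if self.score <= 3:
            return "introductory"
        if self.score <= 7:
            return "medium"
        return "advanced"

@dataclass
class StudentProfile:
    username: str
    real_name: str                                     # links to school administration
    courses: list = field(default_factory=list)        # course identifiers
    performance: dict = field(default_factory=dict)    # topic -> TopicPerformance

@dataclass
class CourseProfile:
    name: str
    topics: list = field(default_factory=list)         # e.g. chapters of a math book
```

The homework list can then be derived from each `TopicPerformance.level`, with relatively many assignments for introductory topics and few for advanced ones.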
Study material
- Contains categorized assignments, ranging from
- 1. Introductory
- 2. Medium
- 3. Advanced
Client
- Web-based / app-based
- Student has username and password (to access/update profile stored on server)
- Student can input homework answers:
- Predetermined answers, such as calculations (math, physics, chemistry, etc.) or multiple choice questions
- Flexible answers: essay assignments (language, history, etc.)
- Student receives assignment list of homework based on the topics that are taught in class. The difficulty and amount of assignments per topic depends on the student’s performance on previous homework, and comments made by the student.
- The student can let the system know if he/she finds a particular topic difficult. Whether a topic is easy is determined by the system; the student can thus override the system's decision and be given more homework on one topic, but it is not possible for the student to avoid homework by notifying the system that all the homework is easy.
- Considering the system targets middle/high-school students, it may be useful if the system can display useful tips regarding studying (e.g. how to make material easier to recall). The system could also include a planning service, where the student learns how to plan their homework to get everything done on time.
AI
- (advanced) If multiple versions of an assignment exist (for instance math problems with a story), then the assignment description is used that is most in line with the interests of the student.
- The performance of a student is based on:
- The difficulty of the exercise.
- The time it takes the student to complete the exercise.
- To prevent the system from mistakenly assuming that a student has difficulty with a particular assignment when in reality they are just a slow reader or unfamiliar with computers, a calibration of their typing skills will be included at the beginning.
- The number of completed exercises as compared to the total number of exercises.
- The number of hints used when attempting to solve a particular exercise.
- Adjusts the difficulty and amount of homework depending on the individual performance of the student
- The performance of the student is visible to the school administration and the teacher.
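The four performance signals listed above could be combined into a single score, for instance as sketched below. The weights and the hint penalty are illustrative assumptions that would need tuning:

```python
def performance_score(difficulty: float, time_taken: float, expected_time: float,
                      completed: int, total: int, hints_used: int) -> float:
    """Combine exercise difficulty, completion time, completion ratio and
    hint usage into one score in [0, 1]. Weights are placeholders."""
    completion = completed / total if total else 0.0
    # faster than expected caps at 1.0; slower answers lower the score
    speed = min(1.0, expected_time / time_taken) if time_taken > 0 else 0.0
    hint_penalty = 0.1 * hints_used
    raw = 0.4 * completion + 0.3 * speed + 0.3 * difficulty - hint_penalty
    return min(1.0, max(0.0, raw))
```

The typing-skill calibration mentioned above would feed into `expected_time`, so that slow typists are not misread as struggling students.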
Summary
The system is an intelligent online learning system that is used to aid traditional teaching. It allows students to have a more personalized education. Students attend classes like they normally would, but the system keeps track of the student’s performance, and adjusts the difficulty and amount of homework depending on the individual performance of the student. In this way the student can more readily improve at topics that they find difficult, while knowledge on topics that they find easy is maintained. The performance of the student is visible to the school administration and the teacher. In this way the teacher can more easily find out if the class is struggling with a topic discussed in the course, and it allows the teacher to understand which students need the most help.
Progression of prototype
Client front-end implementation
https://dev.meesvandongen.nl/teachingassistant/
Feature | Completion |
---|---|
Username and password | ✔️ |
Multiple choice answers | ✔️ |
Text input answers | ✔️ |
Advanced text input answers (essay) | ❌ |
Homework list | ✔️ |
The student can let the system know if he/she finds a particular topic difficult | ❌ |
System can display useful tips regarding studying | ❌ |
Planning service | ❌ |
Skipping questions | ❓ |
Topics | ❓ |
Research on existing open source systems
Moodle is a well-known piece of open-source learning software, and it has a plugin system.
A discussion on the Moodle forums about our subject, computerized adaptive testing (CAT): https://moodle.org/mod/forum/discuss.php?d=159682 Some papers are mentioned there which could prove useful in our design.
Research on the decision making system
We must match a user with a question. For this we must use the knowledge we (the system) have.
What we know of the user:
- Some measure for how skilled they are at certain types of questions based on previous questions answered
- How long they have been answering questions for
- At which speed they are answering questions
- How good they are at answering questions right now (relative to their estimated skill)
The last three items could be used to detect that the user is currently in a state of flow, in which case they can for example be asked more difficult questions.
What we know of each question:
- A category (set manually or automatically) (We can limit our system to 1 category for the sake of the prototype)
- It is possible to have multiple categories per question (each individual thing a person has to know to answer a question)
- A measure of difficulty (based on other people answering the question). Having this automated has the additional benefit of telling us which questions people find most difficult. The downside is that the system has to learn the difficulties before it becomes effective, which can be partially solved by starting with a default value.
- The speed at which the question is answered relative to the ratio of it being answered correctly: for example, a trick question would often be answered incorrectly despite being answered relatively fast. This can also be used to see whether a user should take more time to think or whether they should have been able to answer sooner.
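The difficulty measure with a default starting value could be kept as a running estimate that blends a prior with observed answers. The prior-weighting scheme below is an assumption, one simple way to avoid the cold-start problem mentioned above:

```python
def update_difficulty(prior_difficulty: float, prior_weight: int,
                      answers: list) -> float:
    """Estimate a question's difficulty from its answer history.

    `answers` holds 1 for a correct answer, 0 for an incorrect one;
    difficulty is the estimated fraction of incorrect answers. The
    default prior counts as `prior_weight` pseudo-observations.
    """
    incorrect = len(answers) - sum(answers)
    return (prior_difficulty * prior_weight + incorrect) / (prior_weight + len(answers))
```

With no data the estimate equals the default value, and as answers accumulate the observed rate dominates.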
For a person to have mastered the subject in question they must be able to answer most questions correctly, so our performance measure for users should be based on this.
Our system could work something like this:
- The score a user has for every category starts at 100
- When a user answers a question correctly/incorrectly their score goes down/up based on the difficulty score of the question.
- When the score for a category reaches 0 they no longer get asked questions in that category
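The three basic rules above can be sketched directly; the function names and the choice to use the raw difficulty as the step size are our own placeholders:

```python
START_SCORE = 100

def apply_answer(scores, category, difficulty, correct):
    """Basic rule set sketched above: every category starts at 100, a correct
    answer lowers the category score by the question's difficulty, an
    incorrect answer raises it; the score is clamped at 0 (mastered)."""
    s = scores.get(category, START_SCORE)
    s = s - difficulty if correct else s + difficulty
    scores[category] = max(0, s)
    return scores[category]

def mastered(scores, category):
    """A category at 0 is no longer asked."""
    return scores.get(category, START_SCORE) <= 0
```

With a difficulty step of 5, a user would need roughly twenty net correct answers to finish a category, which is the kind of constant that would need fine-tuning.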
The above is only the basics. There is no single obvious choice for how a question gets chosen and how the scores change, so the rules proposed below are all options and do not all have to be adopted.
The score of a question can be the percentage of times it is answered correctly. With this we can propose the following system for choosing questions:
- Choose a question from the category for which the user's score is the lowest, and ask multiple questions from this category in a row.
- Do not choose a question that has been answered correctly before
- Do not choose a question that has been answered incorrectly recently
- Specifically ask a question again that was answered incorrectly earlier (when the user should definitely know the answer now)
- Choose the question whose score is closest to the score of the user
- If a question with multiple categories is answered incorrectly, ask questions that each cover only one of these categories to find out which category (sub-problem) is the problem; if all of these are answered correctly, ask another question combining all of these categories.
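A few of these optional rules combined into one selection function, as a sketch (the text lists them as alternatives, so this particular combination and the fallback behaviour are our own choices):

```python
def choose_question(questions, user_scores, answered_correctly):
    """Sketch combining three of the proposed rules:
    - focus on the category with the lowest user score,
    - never repeat a question already answered correctly,
    - among the rest, pick the question whose score is closest to the user's.
    `questions` is a list of dicts: {"id", "category", "score"}."""
    candidates = [q for q in questions if q["id"] not in answered_correctly]
    if not candidates:
        return None
    cat = min(user_scores, key=user_scores.get)
    # Fall back to all remaining questions if the category is exhausted
    # (our own choice; the text does not specify this case).
    in_cat = [q for q in candidates if q["category"] == cat] or candidates
    target = user_scores[cat]
    return min(in_cat, key=lambda q: abs(q["score"] - target))
```

Asking several questions from the chosen category in a row would simply mean calling this with the same `user_scores` until the category score changes enough.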
For changing the user's score:
Correct answer:
- When a question is answered correctly, subtract some constant c from the score
- Subtract additional points for multiple correct answers in a row
- Subtract more points for a more difficult question being answered
- Subtract more points for a quick answer (relative to the user's average speed compared to the global average of other users for that question)
Incorrect answer:
- Add more points if the score recently went up by a lot (this question was much more difficult than the previous ones)
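The proposed score changes for correct and incorrect answers could look like the following; every constant and weight here is a placeholder to be fine-tuned, as the notes below say:

```python
def score_delta(correct, difficulty, streak, speed_ratio, recent_rise=0, c=5.0):
    """Sketch of the proposed score changes (all constants are placeholders):
    a correct answer subtracts the base constant c, plus extra for a streak of
    correct answers, a harder question, and a fast answer; an incorrect answer
    adds points, more if the score recently rose a lot (a sudden difficulty
    jump). `speed_ratio` < 1 means faster than the global average for this
    question. Returns the signed change to the category score."""
    if correct:
        delta = c
        delta += streak                      # bonus per consecutive correct answer
        delta += difficulty / 10             # harder question -> larger subtraction
        if speed_ratio < 1:
            delta += (1 - speed_ratio) * c   # quick-answer bonus
        return -delta
    return c + recent_rise / 2               # incorrect: score goes back up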
General notes:
- The actual numbers should be fine-tuned based on how long it should take to master questions in a category.
- When looking at answer speeds we should filter out outliers (e.g. someone leaving a question open for an hour while doing something else)
- Some things the system does should be visible to teachers. Things like the questions and questions categories which individual students find most difficult as well as the ones that are most difficult universally among all students are useful for teachers to know.
- Faulty questions will be answered 'correctly' approximately 0% of the time, which means they can be filtered out automatically.
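The last two notes, outlier filtering and automatic detection of faulty questions, are easy to sketch; the cutoff and thresholds below are our own assumptions:

```python
def filter_outliers(times, cutoff=3.0):
    """Drop answer times more than `cutoff` times the median answer time
    (e.g. a question left open for an hour while doing something else).
    The cutoff is a placeholder constant."""
    if not times:
        return []
    ordered = sorted(times)
    median = ordered[len(ordered) // 2]
    return [t for t in times if t <= cutoff * median]

def looks_faulty(correct_count, total, threshold=0.05, min_answers=20):
    """Flag a question as possibly faulty when, after enough answers,
    almost nobody gets it 'right' (thresholds are our own assumptions)."""
    return total >= min_answers and correct_count / total < threshold
```

Both checks would run as background maintenance rather than during question selection.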
From literature:
Search terms?
- Computer adaptive test (mentioned on the Moodle forum (2010) https://moodle.org/mod/forum/discuss.php?d=159682 )
- The forum mentions http://carla.umn.edu/assessment/CATfaq.html and http://echo.edres.org:8080/scripts/cat/catdemo.htm (Lawrence M. Rudner)
The second link describes a system similar to the one described above. Their system works as follows:
1. All the items that have not yet been administered are evaluated to determine which will be the best one to administer next, given the currently estimated ability level.
2. The "best" next item is administered and the examinee responds.
3. A new ability estimate is computed based on the responses to all of the administered items.
4. Steps 1 through 3 are repeated until a stopping criterion is met.
It describes a more sophisticated, statistically backed way of choosing questions (which question should theoretically give us the most information about the user's skill level). It does not take into account the psychological effects of question difficulty, and it is geared towards determining people's skill level rather than increasing it.
I propose we use an established system as described above to determine skill levels, but adapt it so that it also trains students on their weak points and motivates them by asking the right questions at the right times. The speed at which questions are answered is something we could also take into account (the existing system does not consider speed).
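The four-step CAT loop above can be sketched with a toy one-parameter (Rasch) model, where the most informative item is simply the one whose difficulty is closest to the current ability estimate. This is our own simplification for illustration, not the cited system's actual implementation:

```python
import math

def rasch_p(ability, difficulty):
    """1PL (Rasch) probability of a correct response."""
    return 1.0 / (1.0 + math.exp(difficulty - ability))

def run_cat(item_difficulties, answer, max_items=10, step=0.5):
    """Toy version of the four-step CAT loop: repeatedly pick the unused item
    whose difficulty is closest to the current ability estimate (where the
    1PL information is highest), administer it via `answer(item)`, and nudge
    the estimate up or down. A fixed item budget is the stopping criterion;
    a real system would use a proper maximum-likelihood update instead of a
    fixed step."""
    ability = 0.0
    remaining = list(item_difficulties)
    for _ in range(min(max_items, len(remaining))):
        item = min(remaining, key=lambda d: abs(d - ability))
        remaining.remove(item)
        correct = answer(item)
        ability += step if correct else -step
    return ability
```

An adaptation toward training rather than measurement would replace the "most informative item" choice with the weakness-driven selection rules proposed earlier.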
= Testing the prototype =
In this section we describe how we want to test the prototype. Testing consists of two phases. First we want to hold a focus group with teachers of a local high school in order to refine our concept. For the second stage of testing we want to let students of a local high school interact with the system. These plans are not yet final and, as such, are subject to change. In case we are not able to test the system at a school, we will instead try to hold an experiment among students at the TU/e.
=== 0LAUK0 - Focus group scheme draft ===
'''Participants: ideally 1 teacher of every course, or most math teachers.'''

'''Time: TBA'''

'''Location: TBA'''

'''Date: 19-03-2018 - 23-03-2018'''

'''Introduction'''

''[Welcome participants]''
Let us first introduce ourselves: we are students from Eindhoven University of Technology. We are currently working on the project "Robots everywhere". Within this project we attempt to do something new and interesting with the research done on robotics and artificial intelligence. Our team has chosen to research an interactive online learning system that can adapt homework to the strengths and weaknesses of students.
In this focus group we want to discuss the possibilities of an adaptive online learning system that can aid your students in making their homework. This system will use some form of artificial intelligence to measure the performance of students and to decide which topics need the most attention in their homework. You, as teachers, can use these performance metrics to gain detailed insight into the performance of every student. This can also allow shy students to more easily show that they need attention on certain topics taught in your courses. The aim of this system is to motivate students to make efficient use of their time when doing homework; the system will supplement traditional lessons.
Moving on to today's planning: in this focus group you will discuss topics that are relevant to this system. We are interested in a discussion amongst our participants, to find out what you really find important, so we can use that information to improve our design. Our role as researchers is to moderate this discussion.
Before we continue we would like you to sign these informed consent forms. Notes will be made of this discussion, as well as an audio recording. During analysis of this data your personal information will be removed. Everything you say here will be treated confidentially. At your request we will destroy the data after the research is completed.
We cannot stress enough that your participation in this focus group is completely voluntary. You have the right to withdraw from the focus group at any time. We will gladly answer your questions regarding the research. There may be certain questions which we cannot answer before the main discussion starts, but we will come back to them during the debriefing. To confirm that you agree to these conditions we kindly ask you to sign these forms. [hand over informed consent forms]
[We should perhaps also look at a way to easily identify the participants for ourselves, for example with name tags.]
'''Topics to discuss'''
- [to be expanded further]
- Opinions toward using such a system to enhance face-to-face teaching
- How would you like this system to work?
'''Debriefing'''

=== Test plan ===

'''Research question'''
Does the use of an adaptive learning agent affect students’ enjoyment and learning efficiency when compared to the use of a non-adaptive learning agent?
'''Hypothesis'''
The adaptive learning agent (ALA) is more pleasant for students to work with and improves the students' learning efficiency when compared to a non-adaptive learning agent (NLA).
'''Research objective'''
To investigate the effects of using an adaptive learning agent — that is designed according to current research on flow theory, teaching models, gamification and e-learning — on teaching high-school students.
'''Method'''
Required is a self-study period of at least 35 minutes. The experiment starts with a short 5-minute test to assess the students' initial level. This test contains more questions than the students can possibly finish in the five minutes. The tests will be scored based on how many answers the students gave and how many of those were correct. After that, the students will be given time to practice the subject matter. A third of the class will be given the ALA and another third the NLA; both agents will be programmed with the subject matter relating to the last instruction, and neither group will know to which group they belong. The remaining third will simply be given the book. After a fixed time, all students will be given another test, similar to the one they were given at the beginning. At the end, all students that used the learning agents will be given a short questionnaire about their experience, in which they can also indicate what they think the system lacks.
'''Schedule'''
- 5-minute instruction
- 5-minute test on the subject matter
- Practicing (total study time minus 20 minutes)
- 5-minute test on the subject matter
- 5-minute reflection
When testing is done using an online questionnaire, the participants will be separated into two groups, one with the ALA and one with the NLA. The schedule remains the same, except that the instruction will simply be written in text and no fixed time interval has to be taken into account. The same goes for the reflection: participants 'are done when they are done'.