
OER in schools

Despite great initiatives like the DigiLit Leicester project on these shores and the K12 OER Collaborative in the US, the OER community does not pay enough attention to the K-12/school scene. TJ Bliss writes: “If we want OER to become the default, we need people to use OER and to know that they are using OER.” In my experience, lack of OER awareness goes largely unchallenged among schoolteachers, who more than anyone should be supported in championing open education.

A quick search for studies of OER use in schools returns only a handful of publications and reports. Our open-access paper, due out in the next few months, will hopefully help fill this gap. In the meantime, the infographic below presents a frequency analysis of data collected through surveys conducted by the OER Research Hub (OERRH) up to December 2014, a sample of 657 K-12/school educators from across the globe. Open Education Week is nearly over and we are seeing it out with a bang!

Download pdf here.

[Infographic: Data on the use of OER by school educators]


Cleaning our way to a monster dataset

In February 2013 the newly assembled OERRH team completed the humongous task of creating a bank of survey questions, one of the main research instruments for collecting data around the project’s eleven hypotheses. Bear one thing in mind: at the time, each of us was working with a different collaboration (OpenStax, Saylor Academy, the Flipped Learning Network, OpenLearn, TESS-India, CCCOER, etc.). Initially, each collaboration was allocated a different hypothesis, which also meant a different pick of questions from the survey bank and a different version of the survey. I’ll give you a couple of examples: our collaboration with the Flipped Learning Network originally focused on teachers’ reflective practices, so flipped educators never answered questions on the financial savings of using OER; students using OpenLearn were not asked about the impact of OER on teaching practices; informal learners did not get questions relating to formal studies; and so on. In addition, collaborations had a stake in the research and input into the design of their survey: questions were discussed further, tweaked, piloted and tweaked again ahead of launching. All in all, we put together 18 different questionnaires.

The idea was always to merge all the data into one massive file (what I called the MONSTER) that would allow us to undertake comparative analysis. What follows is the official record of how I laboriously coded, recoded, corrected, deleted and cursed (a bit) through the OERRH surveys in order to arrive at a squeaky clean dataset.

SurveyMonkey and SPSS don’t talk to each other that well

Every researcher knows that there are errors and inaccuracies that need to be ironed out before you commit yourself to analysing quantitative data. We are all human, right? On this occasion, though, I’m gonna blame the software for the first complication that came my way: when exporting data from SurveyMonkey as an SPSS file, your variable labels and values will get confused. Let me explain: say you want to find out about OER repositories, so you create a list in SurveyMonkey and ask respondents to tick options from it to answer the question ‘Which OER repositories or educational sites have you used?’. If you expect the list items to appear as variable labels in SPSS, they won’t. Instead, the software will repeat your question in the Label box and use the name of the repository in the Values box, with a value of 1.

[Screenshot: SPSS Variable View, with the survey question repeated in the Label column and the repository name stored as a value label]

As it happens, the wonderful OER researcher Leigh-Anne Perryman had a solution in her bottomless bag of tricks: the question design in SurveyMonkey had to be amended so that future respondents could tick either ‘yes’ or ‘no’ for each of the repositories on the list. To sort out the damage with any data already collected, the name of each repository had to be manually input into the Label box, and the variable given values of 1=yes and 2=no. Tedious, but easy to fix.

[Screenshot: SPSS Variable View after the fix, with repository names in the Label column and values of 1=yes and 2=no]
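
For anyone scripting the same repair outside SPSS, here is a minimal sketch in Python/pandas. The column names are hypothetical stand-ins for the export; the logic simply mirrors the manual fix of moving the repository names into the variable labels:

```python
import pandas as pd

# Hypothetical raw export: SurveyMonkey writes a 1 where a box was
# ticked and leaves the cell blank otherwise, with the repository
# name buried in the value label rather than the variable label
raw = pd.DataFrame({
    "q10_1": [1, None, 1],
    "q10_2": [None, 1, None],
})

# The scripted equivalent of retyping each name into the Label box
LABELS = {"q10_1": "OpenLearn", "q10_2": "Curriki"}
clean = raw.rename(columns=LABELS)
print(clean.columns.tolist())  # ['OpenLearn', 'Curriki']
```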

Editing the survey questions to include a yes/no answer also remedied another software mishap: SurveyMonkey does not differentiate a blank answer from a ‘no’ answer when you download results as an SPSS file. On this occasion, the required fix wasn’t quick. I closely inspected the data case by case: if a respondent did not choose any of the options in a particular question, I treated each option as a ‘missing’ value; if the respondent ticked at least one option, the blank answers were recoded as ‘no’ values.
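
Scripted rather than done by hand, that rule might look like the sketch below: respondents who skipped a question entirely stay missing, while blanks left by respondents who ticked something become explicit ‘no’ values. The 1/2 coding follows the yes/no scheme above; the column names are hypothetical:

```python
import pandas as pd

def recode_blanks(df: pd.DataFrame, option_cols: list) -> pd.DataFrame:
    """Blank = missing if the whole question was skipped, 'no' otherwise."""
    out = df.copy()
    answered = out[option_cols].notna().any(axis=1)  # ticked >= 1 option
    for col in option_cols:
        ticked = out[col].notna()
        out.loc[answered & ticked, col] = 1   # yes
        out.loc[answered & ~ticked, col] = 2  # no
        # rows with no ticks at all stay NaN, which SPSS reads as missing
    return out
```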

Another curious instance of having to recode data was spotted by Beck as the two of us marvelled at having responses from a total of 180 different countries: I can’t recall whether this was a default list in SurveyMonkey, but for some reason Great Britain and the United Kingdom were given as separate choices. Obviously, these had to be combined into one.
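
A merge like that is a one-line replace in pandas; the ‘country’ column name here is, again, hypothetical:

```python
import pandas as pd

df = pd.DataFrame({"country": ["Great Britain", "United Kingdom", "Spain"]})
df["country"] = df["country"].replace({"Great Britain": "United Kingdom"})
print(df["country"].value_counts())  # the two labels collapse into one
```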

Correcting human errors

I put my hand up: the OERRH surveys aren’t exactly short and sweet. As a result, and this is my own take on the matter, the data suffered. In some cases, respondents provided the demographic information but did not answer anything else; they were deleted from the final dataset. The same fate met those who selected all the options in a question despite these being mutually exclusive; I find it hard to believe that someone is studying in school and getting a degree while doing a postgrad at the same time, don’t you?
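
Both deletions are easy to express as filters. A sketch, assuming hypothetical groupings of the survey’s columns:

```python
import pandas as pd

def drop_bad_cases(df, demographic_cols, exclusive_cols):
    # 1. Remove respondents who filled in nothing beyond demographics
    substantive = [c for c in df.columns if c not in demographic_cols]
    df = df.dropna(subset=substantive, how="all")

    # 2. Remove respondents who ticked every option of a mutually
    #    exclusive question (school, degree and postgrad all at once)
    contradictory = df[exclusive_cols].notna().all(axis=1)
    return df[~contradictory]
```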

I’ve decided that for some respondents it must have been easier to provide an answer in the comments box than to read through all the available options; what other explanation can you find for a teacher who answers the question ‘What subject do you teach?’ by writing ‘Engineering’ in the ‘Other’ field instead of ticking it from the 17 items at their disposal? Duly noted and corrected.

In other cases, respondents would leave ‘MOOCs’ unticked when asked about the types of OER they use, but then add as an open comment that they studied with Coursera or edX. These had to be corrected as well.

Although written in English, the OERRH surveys were distributed worldwide, and it is difficult to anticipate where people might find the language a barrier. Here is an example: we used the word ‘unwaged’ to inquire about employment status; several respondents left the option unmarked but indicated “Unemployed” or “No job” in the comments field. Again, these cases were corrected accordingly.
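
The pattern behind all three corrections is the same: scan the free-text fields for phrases that imply a coded answer, then fill that answer in. A sketch of the idea, with entirely hypothetical column names and phrase lists, and on the understanding that every flagged case still gets checked by eye:

```python
import pandas as pd

# Hypothetical (comment column, coded column) pairs and the phrases
# that imply a tick, distilled from the corrections described above
COMMENT_FIXES = {
    ("subject_other", "subject_engineering"): ["engineering"],
    ("oer_types_other", "oer_type_moocs"): ["coursera", "edx"],
    ("employment_other", "employment_unwaged"): ["unemployed", "no job"],
}

def apply_comment_fixes(df: pd.DataFrame) -> pd.DataFrame:
    out = df.copy()
    for (comment_col, coded_col), phrases in COMMENT_FIXES.items():
        text = out[comment_col].fillna("").str.lower()
        for phrase in phrases:
            hit = text.str.contains(phrase, regex=False)
            # only fill in answers the respondent left blank
            out.loc[hit & out[coded_col].isna(), coded_col] = 1
    return out
```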

Merging data

Cleaning data is always painstaking work, especially when you are handling thousands of cases, but let’s face it, it is also mostly uncomplicated. What could have been avoided, or at least attenuated, was the trouble I found myself in when merging the data from the eighteen OERRH surveys. As the days went by, the monster dataset grew fatter and fatter, but my love for my colleagues (and myself) grew thinner and thinner. Why? It is true that each of the individual surveys had to be customised per collaboration, but we researchers were a tad undisciplined: there were unnecessary changes to the order in which options were presented, items were added and subtracted, and wording was altered without consultation. All this made data merging more time-consuming, cumbersome and fiddly than it should have been.
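
For what it’s worth, once every survey’s drifting question names have been mapped back to a shared scheme, the merge itself can be scripted along these lines; the file names and rename maps below are hypothetical stand-ins for the real eighteen:

```python
import pandas as pd

# One rename map per survey, reconciling questions whose wording,
# order or options drifted between questionnaires
RENAMES = {
    "flipped_learning.csv": {"q12_quizzes": "oer_type_quiz"},
    "openlearn.csv": {"q9_quiz": "oer_type_quiz"},
    # ... one entry per survey
}

frames = []
for filename, renames in RENAMES.items():
    frame = pd.read_csv(filename).rename(columns=renames)
    frame["source_survey"] = filename  # keep provenance for comparisons
    frames.append(frame)

# Questions a survey never asked simply come through as missing values
monster = pd.concat(frames, ignore_index=True, sort=False)
```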

All’s well that ends well, though. We have a clean dataset that comprises 6,390 responses and is already producing very interesting results. Here is one of the lessons learnt: if you are dealing with multiple researchers and multiple datasets, nominate a data master: one to rule them all and bind them, although not in the darkness. Darkness is bad, open is good.

Flipped Learning and OER: Survey Results

Yesterday the Flipped Learning Network (FLN) announced a formal definition of ‘flipped learning’, a timely reminder for me to share the results of the survey that the OERRH project conducted with flipped educators to find out about their use of open educational resources (OER). I blogged about our relationship with the FLN and how this research piece came to be when we launched an infographic featuring some of these data; here comes a recap, and more.

Sample

109 U.S. teachers practising flipped learning in their K12 classrooms completed the questionnaire. A majority (68.8%, n=75) have over ten years of teaching experience but have been using the flipped model for two years or less (91.7%, n=100). Subject-wise, Science (37.3%, n=40) and Math (32.7%, n=35) top the poll, followed by Social Studies (20.6%, n=22) and World Languages (10.3%, n=11). More teachers are flipping their classrooms in the higher grades: 67.8% (n=74) in grades 9-12 as opposed to 8.2% (n=9) in grades 2-5. Most teach in suburban schools (63.2%, n=67), while rural and urban communities figure less often, at 20.7% (n=22) and 16% (n=17) respectively. 61.5% (n=67) of respondents are based in districts where the percentage of students on free or reduced lunch is below 50%. This profile is consistent with that reported in June 2012 following a previous survey of 450 flipped educators, on which we based our demographic questions (see Improve student learning and teacher satisfaction in one Flip of the Classroom). In other details, we learn that a huge majority of these teachers access the internet at home using a broadband connection (98.2%, n=107); they present their work at staff development events (74.8%, n=80) but are not in the habit of publishing their teaching presentations publicly online (11.2%, n=12).

To use or to adapt, is that the question?

Not really: the percentage of teachers who say they adapt resources to fit their needs in a flipped classroom (82.5%, n=90) is much higher than the percentage who report using resources as they find them (17.4%, n=19). The pattern repeats across grade levels (see chart below). Note that only 2 teachers in K2 answered the survey.

[Chart: use vs adaptation of resources, by grade level]

Types of resources

It does not come as a surprise that 95.3% of respondents select videos when asked to indicate the types of free online resources they use in the flipped classroom; after all, moving instruction outside the group learning space and into the individual learning space, the essence of a flipped class, is most commonly done by recording videos and asking kids to watch them as homework. The percentage of teachers who report using images is even higher (96.2%, n=102), but other online resources slip down the ranks: 56.6% (n=60) use interactive games, 51.9% (n=55) use tutorials and 46.2% (n=49) quizzes. What is interesting is that when compared with the types of resources flipped educators create, images drop to 33.3% (n=30) while videos keep their high position (83.3%, n=75). This suggests that teachers are following the advice of flipped class pioneer Jon Bergmann that making your own videos helps establish a relationship with students. What I find puzzling is that, in an age when it is so easy to snap and share, these teachers are mainly consumers and not producers of images. Another interesting point was noted by Kari Arfstrom, Executive Director of the FLN, during her visit last September: the difference between teachers who create and who use quizzes is slightly skewed towards production, 57.8% (n=52) against 46.2% (n=49), perhaps an indication that they prefer to set their own tests, as they have a better overview of how students have responded to the content taught.

[Chart: types of resources used vs created]

Creating OER?

43.3% (n=42) of teachers in our sample report that they create resources and share them publicly online; however, the number who say they share their resources publicly online under a Creative Commons license drops significantly, to 5.1% (n=5). In connection with this, 47.2% (n=46) say that they are familiar with the CC logo and its meaning, which still leaves an important proportion of respondents declaring that they either have not seen the CC logo, or have seen it but do not know what it means. As for open licensing, 70.1% (n=68) consider it important or very important when using resources in their teaching.

Repositories and sharing

The chart below shows the most common repositories accessed by flipped educators in our sample when looking for free online resources. YouTube, YouTube EDU and YouTube for Schools (93.4%, n=99), together with TED Talks (66.9%, n=71) and iTunes U (54.7%, n=58), unsurprisingly rank amongst the most popular. Slightly unexpected, in my opinion, is how little known the major repositories of resources for the K12 sector are; note, for instance, CK-12 (16.9%, n=18) or Curriki (7.5%, n=8).

[Chart: repositories and educational sites accessed by flipped educators]

What do teachers do when they access these repositories? The most common behaviour is downloading resources (81.4%, n=70). 38.3% (n=33) say that they have uploaded a resource, 30.2% (n=26) that they have added comments on the quality of a resource, and 15.1% (n=13) that they have added comments suggesting ways of using a resource. The most popular ways of sharing are via email (89.5%, n=94) and in person (67.6%, n=71).

Purposes of using OER

The chart below shows responses to the question ‘For which of the following purposes have you used free online resources in the context of flipped learning?’

[Chart: purposes of using free online resources in flipped learning]

Challenges to using OER

When asked about the challenges they most often face when using free online resources, 70.4% (n=69) of respondents in our sample indicate that they do not have enough time to look for suitable material; for 65.3% (n=64) it is finding resources of sufficiently high quality, and for 59.1% (n=58) finding suitable resources in their subject area. Equally interesting is what they consider the smallest challenges: not knowing how to use the resources in the classroom (7.1%, n=7); lacking institutional support for their use of free online resources (13.2%, n=13); and resources not being aligned with professional standards or regulation (14.2%, n=14). If anyone doubted the technical abilities of teachers, only 15.3% (n=15) think that not being skilled enough to edit resources to suit their own context is a barrier to using OER. And a personal favourite of mine: a skimpy 29.5% (n=29) declare that not knowing whether they have permission to use or modify resources would actually stop them from using them. If we infer from this that roughly 70% know when they are allowed to adapt a resource (which is consistent with the number of teachers who say they care about open licensing), how come the percentage of those aware of Creative Commons licenses is much smaller?

Impact on teaching practices

93.8% (n=91) of K12 teachers in our sample agree or strongly agree that, as a result of using OER in the flipped class, they use a broader range of teaching and learning methods. Indeed, I would argue that this is exactly what the flipped model facilitates: what is the best use of my classroom time? Not lecturing at students for forty minutes, but opening up the space to different, more engaging and participative ways of learning. 89.7% (n=87) agree or strongly agree that they make use of a wider range of multimedia, and 88.6% (n=86) that they reflect more on the way they teach. The areas where K12 teachers say their use of OER has had the least impact on their teaching practice are these:

  • I make more use of culturally diverse resources (51.4%, n=49)
  • I have more up-to-date knowledge of my subject area (69%, n=67)
  • I more frequently compare my own teaching with others (70.1%, n=68)
  • I collaborate more with colleagues (70.1%, n=68)

Impact on students

91% (n=81) of teachers agree or strongly agree that their use of OER allows them to better accommodate the needs of diverse learners, a thought that resonates with the benefits of flipping the class: teachers have more one-to-one time with students, and students are often allowed to progress at their own pace. It has previously been reported (see for instance A Case Study: Flipped Learning Model Increases Student Engagement and Performance) that flipping the classroom makes for happier students; flipping with OER does not seem to deviate from this: 85.2% (n=75) of K12 teachers agree or strongly agree that their use of OER in the flipped class increases learners’ satisfaction with the learning experience. In third spot comes a statement that invites us to think of teaching as forging personalities: 81.8% (n=71) agree or strongly agree that their OER use helps learners develop increased independence and self-reliance. At the other end of the spectrum, only 42% (n=37) agree or strongly agree that using OER has any bearing on whether students at risk of withdrawing actually continue with their studies; 52.8% (n=47) that it leads to learners becoming interested in a wider range of subjects than before; and 59.5% (n=53) that it increases their enthusiasm for future study.

Beatus ille

Photo by alles banane CC BY-NC-ND 2.0

A quick Google search on the benefits of OER for students will easily deliver a number of hits calling upon, for instance, their power to encourage more independent and flexible learning, and to facilitate exploration of materials ahead of enrolment, allowing learners to choose more wisely and also be better prepared. The JISC OER infoKit adds, amongst others, freedom of access and the international dimension that comes from being able to apply knowledge beyond the confines of one course. One would think that OER use comes as a ray of sunshine, but to what extent do OER increase student satisfaction?

As is often the case, our research to date derives mostly from asking teachers about their beliefs on the impact of OER use on the students’ learning experience rather than asking learners themselves. Even then, we have found that educators and learners don’t agree with each other: while the former are generally convinced of the goodness of OER, the latter are less inclined to declare themselves entirely satisfied with open practices.

On the impact of OER on student satisfaction, data extracted from surveys conducted with two of our collaborations (OpenLearn and the Flipped Learning Network) make this discrepancy apparent. For example, 63% of educators using OpenLearn (n=31) agreed that open educational resources improve student satisfaction, an opinion shared by 85% of K12 teachers engaged in flipped learning (n=75). However, just 47% (n=54) of formal learners indicated that their satisfaction with the learning experience was boosted by their use of OpenLearn resources.

When we talk about OER in relation to student performance, the story repeats itself. If we consider improved performance in terms of an increase in grades, only 14% (n=16) of surveyed students indicated that they had achieved higher marks as a result of using OpenLearn. Educators, on the other hand, took a more optimistic stance: 44% (n=21) agreed that using OpenLearn leads to better student grades, and 63% of K12 teachers (n=55) agreed that using free online resources in the flipped classroom contributes to higher test scores.

Our surveys also included questions to canvass non-grade-related aspects of performance, such as students’ participation in class discussions, their involvement with lesson content, and so on. The results paint a similar picture of divergence, as the chart below shows.

[Chart: impact of OER use on non-grade-related aspects of performance]

Perhaps stronger evidence on the impact of OER use on student performance and satisfaction comes from those research studies that have been able to implement comparison points. According to the Bridge to Success Final Report, pass rates (A-C grades) among students taking the Succeed with Math (SWiM) course increased from 50.6% to 68.6%. To validate this finding, the pass rates of a similar sample of students taking English or Reading (ENG/RDG) coded courses were also collected. In this case there was little difference in scores between the concurrent (69.5%) and following (70.3%) cohorts, suggesting that it is reasonable to consider students’ involvement in the SWiM course as contributing to their improved performance in the subject.
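
Put as back-of-the-envelope arithmetic, the logic of the comparison is a difference in gains; a sketch using only the pass rates quoted above (the report’s underlying sample sizes would be needed for a proper significance test):

```python
# Gains in percentage points, from the pass rates quoted above
swim_gain = 68.6 - 50.6      # OER (SWiM) cohort: 18.0 points
control_gain = 70.3 - 69.5   # comparison (ENG/RDG) cohort: 0.8 points
print(f"differential gain: {swim_gain - control_gain:.1f} points")  # 17.2
```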

The Math Department at Byron High School in Minnesota tells another happy story. Pushed by financial constraints, math teachers committed to creating a textbook-free curriculum by 2010 as they adopted the flipped classroom model: a Moodle course served as the spine for each classroom, and in it teachers embedded YouTube videos for students to watch as homework. Students not only welcomed the lighter weight in their backpacks, but also gave the approach the thumbs-up when it came to exam time: math mastery danced from 29.9% in 2006 to 73.8% in 2011, and ACT scores from an average of 21.2 (on a scale of 36) in 2006 to 24.5 in 2011 (Fulton, 2012). One caveat needs to be raised, in my opinion: teachers’ involvement in flipped learning techniques is as likely to account for the improved learning as their use of OER. To throw a spanner in the works, further evidence on the substitution of traditional textbooks with open textbooks in the K12 science classroom does not corroborate an increase in students’ test scores (Wiley et al., 2013).

Although more research is needed to strengthen these findings, on the subject of open textbooks the achievements of OpenStax lend resounding support to the link between OER and satisfaction: by the end of June 2013, OpenStax textbooks had been downloaded over 120K times, just over 17 million unique visitors had accessed the materials, and 200 institutions had decided to formally adopt OpenStax materials, saving students over $3 million (OpenStax July Newsletter).

In the words of Horace, beatus ille qui exercet OER: happy is the one who works their OER.