
OER at #iNACOL14

 

The 9th (and to date largest) iNACOL Blended and Online Symposium has now concluded. Joining the nearly 3,000 attendees at the massive Palm Springs Convention Center, I made my way from Pueblo to Sierra via San Jacinto and Catalina, ice cream in one hand and lemonade in the other, navigating the talks in the OER participant track –and believe me, with over 200 concurrent sessions packed into two and a half days, I very much welcomed a path to follow.

It wasn’t by coincidence that in her welcome address iNACOL CEO Susan Patrick identified open education and OER as one of the top ten trends driving the future of education: iNACOL are key contributors to the development of OER through policy –see for instance OER State Policy in K-12 Education: Benefits, Strategies, and Recommendations for Open Access, Open Sharing and OER and Collaborative Content Development.

At this year’s conference Karl Nelson, Director of the Digital Learning Department for the Washington State Office of Superintendent of Public Instruction (OSPI), referred to current legislation to foreground his talk on evaluating OER: the state of Washington’s “recent adoption of common core K-12 standards provides an opportunity to develop a library of high-quality, openly licensed K-12 courseware that is aligned with these standards”. The familiar ‘it may be free, but is it any good?’ case initiated a review process of OER in Math and English Language Arts (ELA) to help educators select high-quality resources, provide information for materials adoption and identify gaps in alignment with Common Core State Standards (CCSS). Not small OER, mind you, but full courses that districts could adopt rather than spend money on a textbook.

The evaluation rubric combines five existing review instruments. For breadth: Common Core alignment; publishers’ criteria, an overview of curricular materials (i.e. entire courses) that integrates content and practice; and reviewers’ comments –‘Would you use this material in your classroom?’, ‘What is the ideal scenario for this resource?’, etc. For depth: the EQuIP rubric, which is unit-focused and measures overall quality against the CCSS; and a subset of the Achieve OER rubric, designed to evaluate the quality of digital materials.

The outcome of the review is not only an important library of K-12 open resources, but also a methodology for districts to replicate as they adopt OER. Kudos to both efforts but my slight gripe with spreading this ‘how-to’ is that, at least on first impressions, it’s a fairly complicated task even for a dedicated and trained team of educators/reviewers.

“Teachers think they don’t have the stuff to make Common Core work”, said Karl; this gap is about to be filled by the K-12 OER Collaborative, a state-led project supported by Creative Commons, Lumen Learning and others. Nelson was nearly as tight-lipped as his co-presenters, Jennifer Wolfe from The Learning Accelerator and Layla Bonnot from the Council of Chief State School Officers (CCSSO), or at least just enough to build up the excitement about an official RFP likely to be announced during OpenEd next week: the call to create openly-licensed, high-quality, Common Core-aligned comprehensive modules for K12 Math and ELA will be open to all content developers. Interested? Watch this space.

The slides for my own presentation ‘Teaching and Learning with OER: What’s the Impact in a K12 (Online) classroom?’ are available here.


Cleaning our way to a monster dataset

In February of 2013 the newly put together OERRH team completed the humongous task of creating a bank of survey questions which would be one of the main research instruments to collect data around the project’s eleven hypotheses. Bear one thing in mind: at the time, each of us was working with a different collaboration –OpenStax, Saylor Academy, the Flipped Learning Network, OpenLearn, TESS-India, CCCOER, etc.; initially, each collaboration was allocated a different hypothesis, which also meant a different pick of questions from the survey bank and a different version of the survey. I’ll give you a couple of examples: our collaboration with the Flipped Learning Network originally focused on teachers’ reflective practices, so flipped educators never answered questions on the financial savings of using OER; students using OpenLearn were not asked about the impact of OER on teaching practices; informal learners did not have questions that related to formal studies, and so on. In addition, collaborations had a stake in the research and input in the design of their survey: questions were discussed further, tweaked, piloted and tweaked again ahead of launching. All in all, we put together 18 different questionnaires. The idea was always there to merge all data into one massive file (what I called the MONSTER) that would allow us to undertake comparative analysis. What follows is the official record of how I laboriously coded, recoded, corrected, deleted and cursed (a bit) through the OERRHub surveys in order to have a squeaky clean dataset.

SurveyMonkey and SPSS don’t talk to each other that well

Every researcher knows that there are errors and inaccuracies that need to be ironed out before you commit yourself to analysing quantitative data. We are all human, right? On this occasion, for the first complication that came my way, I’m gonna blame the software: when exporting data from SurveyMonkey as an SPSS file, your variable labels and values will get confused. Let me explain: say you want to find out about OER repositories, so you create a list in SurveyMonkey and ask respondents to tick options from it to answer the question ‘Which OER repositories or educational sites have you used?’. If you expect the list to appear as variable labels in SPSS, it won’t. Instead, the software will repeat your question in the Label box and use the name of the repository in the Values box with a value of 1.

SPSS1

As it happens, the wonderful OER researcher Leigh-Anne Perryman had a solution in her bottomless bag of tricks: the question design in SurveyMonkey had to be amended for future respondents to have the option to tick either ‘yes’ or ‘no’ for each of the repositories on the list. To sort out the damage with any data already collected, what needed to be done was to manually input the name of the repository in the Label box, and give the variable a value of 1=yes and 2=no. Tedious but easy to fix.
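For anyone curious what that manual swap amounts to, here is a minimal sketch in Python (not the actual SPSS workflow, and the field names are my own invention): the exported variable carries the question text as its label and the repository name as the label of value 1, so the two are swapped and an explicit yes/no value scheme added.

```python
# Hypothetical sketch of the metadata fix described above.
# 'var' mimics one exported variable: the question text sits in the label,
# the repository name sits in the label of value 1.
def fix_exported_variable(var):
    """Move the repository name into the label and add yes/no value labels."""
    repository = var["values"][1]
    return {"label": repository, "values": {1: "yes", 2: "no"}}

exported = {
    "label": "Which OER repositories or educational sites have you used?",
    "values": {1: "OpenLearn"},
}
print(fix_exported_variable(exported))
# {'label': 'OpenLearn', 'values': {1: 'yes', 2: 'no'}}
```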

SPSS2

Editing the survey questions to include a yes/no answer also served to remedy another software mishap: the fact that SurveyMonkey does not differentiate a blank answer from a ‘no’ answer when downloading results as an SPSS file. On this occasion, the required fix wasn’t quick. I closely inspected the data case by case: if the respondent did not choose any of the options in a particular question, I treated each option as a ‘missing’ value; if the respondent ticked at least one option, the blank answers were recoded into a ‘no’ value.
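That case-by-case rule can be sketched in a few lines of Python. This is only an illustration: the option names and the numeric codes (1=yes, 2=no, -99=missing) are assumptions, not the project’s actual SPSS coding scheme.

```python
# Assumed codes for this sketch; real SPSS missing-value codes may differ.
YES, NO, MISSING = 1, 2, -99

def recode_multiselect(row):
    """row maps each option of one multi-select question to 1 (ticked) or None (blank)."""
    if all(v is None for v in row.values()):
        # Respondent skipped the question entirely: every option is missing.
        return {option: MISSING for option in row}
    # At least one tick: remaining blanks are genuine 'no' answers.
    return {option: YES if v == 1 else NO for option, v in row.items()}

print(recode_multiselect({"videos": 1, "quizzes": None}))
# {'videos': 1, 'quizzes': 2}
print(recode_multiselect({"videos": None, "quizzes": None}))
# {'videos': -99, 'quizzes': -99}
```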

Another curious instance of having to recode data was spotted by Beck as the two of us marvelled over having responses from a total of 180 different countries in the world: I can’t recall whether this was a default list in SurveyMonkey but for some reason Great Britain and the United Kingdom were given as separate choices. Obviously, these had to be combined into one.
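A recode like that is a one-line mapping; the sketch below is my own illustration of the idea rather than the actual fix applied to the dataset.

```python
# Hypothetical lookup table collapsing duplicate country labels into one category.
COUNTRY_MERGES = {"Great Britain": "United Kingdom"}

def normalise_country(value):
    """Return the canonical country label, leaving everything else untouched."""
    return COUNTRY_MERGES.get(value, value)

responses = ["Great Britain", "Spain", "United Kingdom"]
print([normalise_country(c) for c in responses])
# ['United Kingdom', 'Spain', 'United Kingdom']
```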

Correcting human errors

I put my hand up: the OERRH surveys aren’t exactly short and sweet. As a result, and this is my own take on the matter, the data suffered. In some cases, respondents provided the demographic information but did not answer anything else; they were deleted from the final dataset. The same fate met those who selected all options in one question despite these being mutually exclusive –I find it hard to believe that someone is studying in school and getting a degree while doing a postgrad at the same time, don’t you?
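The two deletion rules are easy to express as a filter. The sketch below is illustrative only: the field names and the mutually exclusive options are placeholders I made up, not the survey’s actual variables.

```python
# Hypothetical demographic fields; anything else counts as substantive content.
DEMOGRAPHICS = {"age", "gender", "country"}

def keep_case(row, exclusive_options):
    """Keep a respondent only if they answered something beyond demographics
    and did not tick every option of a mutually exclusive question."""
    answered_substance = any(
        v is not None for k, v in row.items() if k not in DEMOGRAPHICS
    )
    ticked_everything = all(row.get(opt) == 1 for opt in exclusive_options)
    return answered_substance and not ticked_everything

exclusive = ["in_school", "undergraduate", "postgraduate"]
demographics_only = {"age": 34, "gender": "f", "country": "US",
                     "in_school": None, "undergraduate": None, "postgraduate": None}
print(keep_case(demographics_only, exclusive))  # False: nothing beyond demographics
```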

I’ve decided that for some respondents it must have been easier to provide an answer in the comments box than to read through all the available options; what other explanation can you find for a teacher who answers the question ‘What subject do you teach?’ by writing ‘Engineering’ in the ‘Other’ field instead of ticking that from the 17 items at his disposal? Duly noted and corrected.

In other cases, for instance, respondents would leave unticked ‘MOOCs’ when asked about what type of OER they use, but then add as an open comment that they studied with Coursera or EdX. These had to be corrected as well.

Although written in English, the OERRHub surveys were distributed world-wide: it is difficult to anticipate where people might find the language a barrier, but here is an example: we used the word ‘unwaged’ to inquire about employment status; several respondents left the option unmarked, but indicated “Unemployed” or “No job” in the comments field. Again, these cases were corrected accordingly.

Merging data

Cleaning data is always painstaking work, especially when you are handling thousands of cases, but let’s face it, it is also mostly uncomplicated. What could have been, if not avoided, at least attenuated was the trouble I found myself in when merging the data from the eighteen OERRHub surveys. As days went by, the monster dataset grew fatter and fatter, but my love for my colleagues (and myself) grew thinner and thinner. Why? It is true that each of the individual surveys had to be customised per collaboration, but we researchers were a tad undisciplined: there were unnecessary changes to the order in which options were presented, items were added and subtracted, and wording was altered without consultation. All this made data merging more time-consuming, cumbersome and fiddly than it should have been.
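Conceptually, the merge comes down to mapping every survey’s local variable names onto one master codebook before stacking rows. The sketch below is my own reconstruction of that idea, with invented variable names, not the project’s real codebooks.

```python
def merge_surveys(surveys):
    """surveys: list of (rows, codebook) pairs, where each codebook maps a
    survey's local variable names onto the master dataset's names."""
    merged = []
    for rows, codebook in surveys:
        for row in rows:
            # Rename each variable to its master name; unknown names pass through.
            merged.append({codebook.get(name, name): value
                           for name, value in row.items()})
    return merged

# Two surveys asking the same question under different local names.
flipped = ([{"q1_role": "teacher"}], {"q1_role": "role"})
openlearn = ([{"respondent_role": "learner"}], {"respondent_role": "role"})
print(merge_surveys([flipped, openlearn]))
# [{'role': 'teacher'}, {'role': 'learner'}]
```

With a shared codebook agreed up front, the renaming step (and most of the cursing) disappears, which is really the lesson of this section.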

All’s well that ends well, though. We have a clean dataset that comprises 6,390 responses and is already producing very interesting results. Here is one of the lessons learnt: if you are dealing with multiple researchers and multiple datasets, nominate a data master: one to rule them all and bind them, although not in the darkness. Darkness is bad, open is good.

Evening OERthlings, greetings from Mars!

I’m in Mars, Pennsylvania, Route 228, population 1,699. Most notable Martian? A certain Gino Crognale, of renown in the make-up world for his artistry in The Walking Dead. FlipCon14, the 7th Annual Flipped Learning Conference, is about to start and it’s not zombies descending on these 0.5 square miles of Butler County, but hordes of teachers with one thing on their minds: the flip!

This year conference organisers have divided presentations into different strands, so it’ll be difficult to come away not having found what you travelled here for, whether you are an experienced flipped educator, new to flipped learning, a sponsor showcasing your products or, like me, a humble researcher in the land of the free. Apart from plenaries and plenty of opportunities for networking, there are in total six slots of concurrent sessions where on-site attendees will have to choose from up to twelve different talks each. My interest will stay with the research strand during the two days, so here’s what I’m looking forward to:

Katherine McKnight, Principal Director of Research for the Center for Educator Learning and Effectiveness at Pearson, and Jessica Yarbro, George Mason University, will review research on teaching and learning with video.

Irma Brasseur-Hock and Meghan Arthur, University of Kansas Center for Research on Learning, will share the research findings of a project involving a survey of 142 members of the Flipped Learning Network and focus group interviews. What exactly was their research question? I’m eager to find out.

Kari Arfstrom and Taylor Pettis will compare the results from the most recent survey of flipped educators to that of two years ago in terms of the who, what, why, where and when of flipped learning.

Chris Luker, Chemistry Teacher from Highland Local Schools, will chair a round-table discussion with the aim of “collaboratively organizing research ideas that will continue to support the flipped learning model”. Ah, the need for research! Can’t beat it.

And Lucy Kulbago, John Carroll University, will examine student attitudes toward science and conceptual gains in a flipped undergraduate introductory physics course for life science majors.

There is plenty to tempt me away from this, my initial choice, but I shall hope to stick to my guns and feed back in a couple of days.

Now where are those zomb… I mean flipped educators?

Photo: Pennsylvania Route 228 CC BY B. de los Arcos

 

I flip, you flip, she flips…

I’m off to Belfast tomorrow for a Staff Development Languages Day at The Open University’s elegant headquarters on Victoria Street. I usually enjoy these gatherings very much; there are few of us languages tutors in Ireland compared to the other regions and nations, and Saturday will be a much welcomed chance to catch up face-to-face –some of us have been tutoring with the OU for 15 years already! It’ll be an action-packed day: Jing, wikis and HEA membership apart from the usual updates on the nitty-gritty of life as an AL. My slot will be about OER and flipped learning; one thought often occupies my mind these days and it is how the flipped class can be better if teachers flip with OER, so that’s the question I’ll be putting to my colleagues: ‘what does OER use bring to the flipped (language) classroom?’ I was asked to send a handout to be photocopied and handed out (obviously!), but I think sharing here is better –it saves a couple of trees from the chop to counterbalance my embarrassingly high CO2 footprint.

See my slides below: an overview of the OER Research Hub Project, the OER Impact Map and flipped learning as prelude to, I hope, an open discussion on the benefits (or not) of OER-ing.

Flipped Learning and OER: Survey Results

Yesterday the Flipped Learning Network (FLN) announced a formal definition of ‘flipped learning’, a timely reminder for me to share the results of the survey that the OERRHub Project conducted with flipped educators to find out about their use of open educational resources (OER). I blogged about our relationship with the FLN and how this research piece came to be at the launch of an infographic featuring some of these data; here comes a recap and more.

Sample

109 U.S. teachers practising flipped learning in their K12 classrooms completed the questionnaire. A majority (68.8%, n=75) have over ten years of teaching experience but have been using the flipped model for two years or less (91.7%, n=100). Subject-wise Science (37.3%, n=40) and Math (32.7%, n=35) top the poll, followed by Social Studies (20.6%, n=22) and World Languages (10.3%, n=11). More teachers are flipping their classrooms in the higher grades –67.8% (n=74) in K9-12 as opposed to 8.2% (n=9) in K2-5. Most teach in suburban schools (63.2%, n=67), while rural and urban communities figure less often, 20.7% (n=22) and 16% (n=17) respectively. 61.5% (n=67) of respondents are based in districts where the percentage of students on free or reduced lunch is below 50%. This profile is consistent with that reported in June 2012 following a previous survey of 450 flipped educators, on which we based our demographic questions (see Improve student learning and teacher satisfaction in one Flip of the Classroom). In other details, we learn that a huge majority of these teachers have accessed the internet at home using a broadband connection (98.2%, n=107); they present their work at staff development events (74.8%, n=80) but are not in the habit of publishing their teaching presentations publicly online (11.2%, n=12).

To use or to adapt, is that the question?

Not really: the percentage of teachers who say they adapt resources to fit their needs in a flipped classroom (82.5%, n=90) is much higher than that of those who report using resources as they find them (17.4%, n=19). The pattern is repeated across grade levels (see chart below). Note that only 2 teachers in K2 answered the survey.

USEorADAPT

Types of resources

It does not come as a surprise that 95.3% of respondents select videos when asked to indicate the types of free online resources they use in the flipped classroom; after all, moving instruction outside the group learning space into the individual learning space, the essence of a flipped class, is most commonly done by means of recording videos and asking kids to watch them as homework. The percentage of teachers who report they use images is even higher (96.2%, n=102) but other online resources slip down the ranks: 56.6% (n=60) use interactive games, 51.9% (n=55) use tutorials and 46.2% (n=49) quizzes. What is interesting is that when compared with the types of resources flipped educators create, images drop to 33.3% (n=30) while videos keep their high position (83.3%, n=75).  This clearly suggests that teachers are following the advice from flipped class pioneer Jon Bergmann that making your own videos helps teachers establish a relationship with students. What I find puzzling is that in an age when it is so easy to snap and share, these teachers are mainly consumers and not producers of images.  Another interesting point was noted by Kari Arfstrom, Executive Director of the FLN, during her visit last September: the difference between teachers who create and use quizzes is slightly skewed towards production, 57.8% (n=52) against 46.2% (n=49), perhaps an indication that they prefer to set their own tests, as they have a better overview of how students have responded to the content taught.

Types Used vs Adapted

Creating OER?

43.3% (n=42) of teachers in our sample report that they create resources and share them publicly online; however, the number of teachers who say that they create resources and share them publicly online under a Creative Commons license drops significantly to 5.1% (n=5). In connection with this, 47.2% (n=46) say that they are familiar with the CC logo and its meaning, which still leaves an important proportion of respondents declaring that they either have not seen the CC logo, or have seen it but do not know what it means. As for open licensing, 70.1% (n=68) consider it important or very important when using resources in their teaching.

Repositories and sharing

The chart below shows the most common repositories accessed by flipped educators in our sample when looking for free online resources. YouTube, YouTubeEdu and YouTubeSchool (93.4%, n=99), together with TED talks (66.9%, n=71) and iTunesU (54.7%, n=58) unsurprisingly rank amongst the most popular. Slightly unexpected in my opinion is how little knowledge there is of major repositories of resources for the K12 sector –note, for instance, CK12 (16.9%, n=18) or Curriki (7.5%, n=8).

Repositories

What do teachers do when they access these repositories? The most common behaviour is downloading resources, 81.4% (n=70). Then 38.3% (n=33) say that they have uploaded a resource, 30.2% (n=26) that they have added comments regarding the quality of a resource, and 15.1% (n=13) that they have added comments suggesting ways of using a resource. The most popular way of sharing is via email (89.5%, n=94) and in person (67.6%, n=71).

Purposes of using OER

The chart below shows responses to the question For which of the following purposes have you used free online resources in the context of flipped learning?

Purposes

Challenges to using OER

When asked about the challenges that they most often face when using free online resources, 70.4% (n=69) of respondents in our sample indicate that they do not have enough time to look for suitable material; for 65.3% (n=64) it is finding resources of sufficiently high quality, and for 59.1% (n=58) finding suitable resources in their subject area. Equally interesting is what they consider the smallest challenges: not knowing how to use the resources in the classroom (7.1%, n=7); lacking institutional support for their use of free online resources (13.2%, n=13); and resources not being aligned with professional standards or regulation (14.2%, n=14). If anyone doubted the technical abilities of teachers, only 15.3% (n=15) think that not being skilled enough to edit resources to suit their own context is a barrier to using OER. And a personal favourite of mine: a skimpy 29.5% (n=29) declare that not knowing whether they have permission to use or modify resources would actually stop them from using them. If we infer then that roughly 70% know when they are allowed to adapt a resource (which is consistent with the number of teachers who say they care about open licensing), how come the percentage of those aware of Creative Commons licenses is much smaller?

Impact on teaching practices

93.8% (n=91) of K12 teachers in our sample agree or strongly agree that as a result of using OER in the flipped class they use a broader range of teaching and learning methods –indeed I would argue that this is exactly what the flipped model facilitates: what is the best use of my classroom time? Not lecturing at students for forty minutes but opening up the space to different, more engaging and participative ways of learning. 89.7% (n=87) agree or strongly agree that they make use of a wider range of multimedia, and 88.6% (n=86) that they reflect more on the way that they teach. The areas where K12 teachers say their use of OER has had the least impact on their teaching practice are as follows:

  • I make more use of culturally diverse resources (51.4%, n=49)
  • I have more up-to-date knowledge of my subject area (69%, n=67)
  • I more frequently compare my own teaching with others (70.1%, n=68)
  • I collaborate more with colleagues (70.1%, n=68)

Impact on students

91% (n=81) of teachers agree or strongly agree that their use of OER allows them to better accommodate the needs of diverse learners, a thought that resonates with the benefits of flipping the class: teachers have more one-to-one time with students, and students are often allowed to progress at their own pace. It has been previously recorded (see for instance A Case Study: Flipped Learning Model Increases Student Engagement and Performance) that flipping the classroom makes for happier students; flipping with OER does not seem to deviate from this: 85.2% (n=75) of K12 teachers agree or strongly agree that their use of OER in the flipped class increases learners’ satisfaction with the learning experience. In third spot comes a statement that invites us to think of teaching as forging personalities: 81.8% (n=71) agree or strongly agree that their OER use helps develop learners’ increased independence and self-reliance. At the other end of the spectrum, only 42% (n=37) agree or strongly agree that using OER has any bearing on students at risk of withdrawing actually continuing with their studies; 52.8% (n=47) that it leads to learners becoming interested in a wider range of subjects than before; and 59.5% (n=53) that it increases their enthusiasm for future study.

I teach, therefore I reflect (and change)

During the few weeks ahead of OpenEd13, in preparation for my talk, I spent time interviewing K12 teachers in the US about their use of open educational resources (OER) in the classroom. As part of my work with the OER Research Hub Project I’m researching the hypothesis that OER use leads educators to critically reflect on their teaching practice. What follows is a thread to weave my slides below.

My personal view has always been that teachers reflect all the time, OER-ing or not: we (and I throw myself in the mix) may not keep a diary, or pause and think hard about how the class/tutorial/seminar went, but in the thick of it we know what’s working, what’s not, what needs to be fixed: adapt, colour, reshuffle, attack from a different angle or dump. We are constantly on the lookout for ideas to teach better, to engage students better, to help them learn better. Take, for example, flipped educators: the OERRHub survey in the spring of this year shows that a majority of respondents have over ten years of teaching experience but have been flipping the classroom for less than two. What moves an experienced educator to try something as bonkers as shifting direct instruction from the group learning space to the individual learning space and leave herself with forty minutes waiting to be filled with bags of creativity? It’s not because she hasn’t been doing a good job so far, but because word is out that flipping the classroom works. It is through reflection that we become agents of change. My conversations with teachers from the project’s two K12 collaborations –Vital Signs and the Flipped Learning Network– revolve around one question: How has your use of OER changed the way you think about teaching? In a sliver of stories of change (or not) that I have yet to analyse, I give you the voices of an English teacher resisting the all-knowing Oz of her past; a Math teacher who basks in bringing multiple perspectives into the classroom; a Statistics teacher who requires his students to co-create the curriculum because it belongs to the world; and a Math and Social Studies teacher who uses a science program because it makes learning real for her kids. My hunch is still there: OER use doesn’t necessarily make better teachers; it’s just that the door to resources is wider than it used to be.

The Lobster Connection

Lobster Car Reg

Friday, May 3rd, 2013. I’m in Scarborough Middle School, Maine, US. A banner across the entrance hall reads ‘You Are Now Entering THE KINDNESS ZONE’, a caution for bullies to take a break, I’m told. I’m here with Sarah Morriseau from the Gulf of Maine Research Institute to visit Mrs. B’s science class. The kids have been working with Vital Signs for the past week and today we are going on a field trip in search of the hemlock tree’s least welcome resident (at least in these latitudes), the Woolly Adelgid. On the whiteboard Mrs. B has written a note on team jobs: one photographer, two species specialists and one data manager. Kids buzz around while she reminds everyone to be respectful of nature: ‘We are not destroying anything’. And out we go, past the school’s sports grounds, beyond the pond and into the forest, armed with species identity cards and datasheets. Forty minutes won’t count for much if your mind is not on the job. Kids quickly scatter looking for hemlock trees first, then tiny nymphs. We found it! Are you sure? Check against the ID card. Where’s your evidence? The mobile phone comes out. Take a photo. You hold it. That’s great.

Vital Signs ID Card

On Monday I’m back in the classroom. On the way the taxi driver has assured me that the next president will certainly be a latino; that’ll be interesting, I think. Another glorious day weather-wise, pity we are staying indoors. Today the reminder on the whiteboard is for kids to check their datasheets: No blanks! warns Mrs. B. Team Wolverine’s data manager seems to struggle to fill the space about what happened when they were collecting data. The blank in question: ‘I am happy because…’ Are you not happy that you found the species?, I ask. ‘Well, yes but I’m sad too…’ comes his reply. ‘At least now you know where it is!’, me, always the arch-positive. ‘Oh yes!’ The pencil rushes. I wonder have I interfered with human history…

This is Vital Signs in action. It’s hard to imagine a more enjoyable research trip, really. The OER Research Hub project is collaborating with the Gulf of Maine Research Institute, in particular looking at Vital Signs, a citizen science program for middle school students in Maine. The project is funded by Hewlett and its content released under a Creative Commons Attribution (CC-BY) 3.0 License. The idea is genius. Scientists, keen on mapping the extent of invasive species in Maine, propose a mission on the Vital Signs website, i.e. Where is the hemlock woolly adelgid? Where isn’t it? Teachers organise kids to collect data that are then published online and verified by a species expert. Sarah Kirn, Vital Signs Program Manager, explains it better: “The emphasis is not on finding a species but on the evidence that you are collecting to back up your claim; (…) in science, just as important as what you don’t see is what you do see, and that’s not an intuitive point for kids. So (…) you get one of these species cards, go outside, you look to see whether you’ve found it or not, you make a claim, I did find it or I didn’t find it, and you back it up with evidence. You collect all that information and upload it to the website. You have to do a peer quality check first: students working in teams trade computers, each team looks at each other’s work (…) What’s the quality of the statements that you made? Have you been very thorough and precise about your language and in describing things? Did you take the photographs that capture the most important characteristics of the species that you were looking at? (…) In the evidence and the critique there’s a lot of opportunity to work on critical thinking and reasoning.” Vital Signs is about learning science by doing science. ‘My kids are real scientists and this is real science that we are doing’, says Mrs. P. in Dover-Foxcroft – Se Do Mo Cha Middle School. And she adds: ‘I don’t care if they can identify a dragonfly. What I care about is that they can follow the process trying to identify what bug it is that we found that day (…) Ten years from now it doesn’t matter if they can still identify a dragonfly, but it does matter that they remember the scientific experience and how positive it was for them’.

You can listen to an edited version of my interview with Sarah Kirn here.

On my first day in Portland, I visit the Cohen Center for Interactive Learning at GMRI, where 5th and 6th grade students come for a fantastic hands-on science LabVenture. In one of the tanks the blue lobster has molted, which apparently makes it more vulnerable to other tank inhabitants with great appetites. Judging by what’s on the menu in most restaurants in town, better in than out, I’d say.