Thursday, June 21, 2012

iPads in Education: How and why? Or why not?

Since the iPad was introduced in 2010, it - along with other tablets - has been increasing in popularity both inside and outside of classrooms. To many it is seen as a very exciting educational tool that opens the classroom to a whole new set of opportunities.

I myself have owned an Android tablet and recently switched to an iPad 2. The major impetus for my switch was the popularity of the iPad in education. I wanted to be familiar with the device that is changing the landscape in many schools.

I know that schools and school divisions are buying iPads, and I want to know:
  • what is the goal behind buying iPads for classroom use?
  • how are iPads being used in classrooms?
  • is there any data to support their use?
  • are they being used in higher education settings?
To explore these questions, I found a report put together by Alberta Education. It can be accessed here: http://education.alberta.ca/media/6684652/ipad%20report%20-%20final%20version%202012-03-20.pdf

What is the goal behind buying iPads for classroom use?
  • to provide all students with the opportunity to succeed - helps to differentiate
  • to increase student engagement
  • to meet every student's needs every day


How are iPads being used in classrooms?
  • beneficial for dyslexic students because font size can be increased
  • writing through story apps rather than just pencil and paper writing to engage more students in the process
  • used as an assistive technology - positive because it is the same technology used by the other students
  • promote self-efficacy through customization features
  • promote risk-taking - e.g. students who are often reserved are willing to share their creations on the iPad
  • bridging the literacy gap - "disabilities disappear" with the iPad in the students' hands because it offers a variety of media rather than just written
  • Sign 4 Me app to facilitate communication between a deaf student and his classmates
  • translate work into other representations
  • language acquisition
  • assessment with a faster feedback loop and more differentiation - as and for learning


Is there any data to support their use?
NOTE: The Alberta report acknowledged that it had no actual quantitative data showing that iPads increase student outcomes or scores
  • this article http://stateimpact.npr.org/indiana/2012/02/24/do-ipads-really-boost-test-scores/ cites two studies on iPad use in classrooms
    • kindergarten students in Auburn, Maine who used iPads scored better on every literacy test
    • 78% of students who used the HMH algebra app scored proficient or advanced compared to 59% who used the textbook version
  • the article above questions whether it is really the iPad making the difference, or whether the teacher, more engaged by being part of a study, has simply improved their teaching


Are they being used in higher education settings?
NOTE: The Alberta report only discussed K-12 schools.
  • I found a few articles discussing iPad use in higher education. Here is an example: http://www.onlinecollege.org/2011/11/18/evaluating-the-ipad-in-higher-education/
    • used in a variety of ways including:
      • providing iPads pre-loaded with class texts and required applications
      • to improve engagement and to allow faculty to explore new teaching methods
      • to integrate specific apps, such as Wolfram Alpha

Overall, I am a little bit disappointed with what I have found. The Alberta report was a very good read, but provided very little in terms of actual strategies or data to support iPad use. I was happy to hear about all of the ways the iPad can support students with disabilities, but I was hoping to read more about how it is being used by the "average" student. The information I found is all very positive on the use of iPads, but at this point I am not convinced enough to tell a school or division that they should invest thousands of dollars in a 1-to-1 system. I would definitely recommend the iPad as a support tool but, given the high cost, not as a tool to put in the hands of every student.

I would need to do further exploration to see if any schools have tried implementing cheaper tablets, such as the Kindle Fire, and whether or not similar results were found. The report was very adamant that the touchscreen was a huge reason why the iPad is such an engaging tool, so based on that I would assume that other tablets would be just as engaging. The app library is much different, though, and many are perhaps less user-friendly for children. These tablets also lack the allure of the iPad, which may be a contributor to the increased engagement. Food for thought, anyways.



Math Education and Technology: Interview with Nathan Banting

There are two math teachers in Saskatoon that I have had the opportunity to get to know and that I really admire, and to be honest, a huge part of why I admire them is because of their involvement in the #mathchat Virtual Learning Community (VLC). I have never been in either of their classrooms, but from their work I have seen online, I have an appreciation of their teaching and dedication to their craft.

The first is Michelle Naidu, whom I previously interviewed in my post Flipped Pessimism: What the opponents are saying, and the second is Nathan Banting (http://musingmathematically.blogspot.ca/). Every time I have met Nathan, I have been blown away by his thoughtfulness, his love of math and his love of engaging learners! I tried to set up a video interview with Nathan, but that didn't work out. Fortunately, I was still able to get his thoughts on flipped teaching, his teaching strategies, technology in math education and his experience with VLCs.

Ryan: I know Flipped Teaching is something you haven't really explored. From what you have heard, what do you think of the concept?

Nathan: Flipped teaching still worries me. I think mathematics education should focus on the broad themes that make it accessible and practical. I am worried that teachers will abuse the videos and then use their class time as a glorified study hall. It is very possible to flip your room, and make no fundamental shift in your teaching. On the other hand, building the atomic skills with good videos would benefit a teacher trying to use class time to work on deeper problems. If we can harness the foundational skills for homework, we can then begin to apply them in meaningful ways with our contact hours. Too often, online videos are used as digital lectures; Khan Academy is the flagship in this regard. If we can use digital media as a starting point rather than the end-goal, I think assigning a flipped homework package could prove beneficial. There are also the issues of accessibility and student effort, but they have no bearing on the pedagogical issue at the root of the question.

Ryan: You have been doing great work using "Problem-Based Learning" in your secondary math courses. Can you provide a brief summary of how you are employing this method?

Nathan: Problem-Based Learning (PrBL) is a system where topics are examined within the midst of a larger issue. It can be a situational problem that provides context, or a fabricated one that relies on a base of mathematical skills. In essence, the task is given and curricular math is its underpinning. Good problems force students to make mathematical decisions. They then must understand the consequences of those decisions. Problems range in complexity; a problem may take 30 minutes or 2 classes. Some introduce topics, some cement the learning. Often, they are solved in “think-tanks” of students working to arrive at a solution. PrBL is designed to allow students to chew on a topic without rushing through a pre-set algorithm to arrive at an answer neatly printed in the answer key.

Ryan: Are there certain topics where you have been unable to employ this method? If so, how did you approach those topics?

Nathan: I have developed the entire Workplace [and Apprenticeship] courses (10 & 20) around problems and projects. The practical topics allow for me to choose relevant situations and tasks for the students. I find it tough to implement large scale projects into an abstract course. In my experience, students are intimidated by its unfamiliarity. PrBL does fit nicely into all streams. Presenting a thought provoking entry event can begin to build understanding across the board. Whether it is getting students using graphing software to graph their first quadratic with a TOV [table of values] or asking them to design a data set with given central tendencies, providing an open environment for them to make connections is key. In the higher levels, more direct instruction is given, but anchor problems are a great reference for teacher and students.

Ryan: In your mind, what is the role of homework in a secondary math classroom?

Nathan: I think homework needs to be done to build a toolbox. Every student mathematician needs an angle to approach challenges. Those angles always necessitate a mathematical arsenal. Simple operations may help them pick fair teams where more complicated means (factoring, trigonometry, etc) may open doors to more novel and elegant solutions. Homework exists to practice pieces; the sad part is that most students never get the opportunity to use those pieces in a larger scope.

Ryan: In what ways are you currently using technology in your teaching?

Nathan: In my Workplace PBL classes, students have full access to the internet and all the software it provides. I use it to create individualism and autonomy. It switches the focus of the math class. Students are now expected to create a pathway to a solution; they need to show me that path. The formula and algorithm are not really the focus anymore, because they are all readily available. In other courses, graphing software is used as a visualization tool and online centres are set up for group collaboration. Some of the best lessons use simple technologies. A set of dice, a cylindrical can and a utility knife, a magic 8-ball, coloured envelopes, protractors, etc. I think teachers have lost sight of the usefulness of this technology. I have an IWB [interactive white board], but the moving of a metre stick often shows the breadth of angles more efficiently.

Ryan: If you had an unlimited teaching budget, how would you use technology in your teaching?

Nathan: Unlimited budget is a dangerous thing. I think I would have two answers for 2 separate classes. W&A) I would ask for a laptop computer for every student. Accompanied with this would be full licences to Microsoft suite as well as zero administrator passcodes. I want students to be able to search out appropriate software to solve their problem. Mice need to replace trackpads, and even a touch pad where students can write in their thoughts to create a digital portfolio. Technology needs to encourage students to document their process; I find word documents don’t accomplish this feat.
For the other strands, I would ask for a set of tablets. I think the portability of the machines makes them attractive. Students could have a variety of apps at their fingertips: unit converters, calculators, graphers, simulators for dice and other probability games. Used in conjunction with GeoGebra, they would provide a very tangible look at functions - graphing and posting with ease. I would eliminate graph paper altogether. Students could have an all-in-one collaboration station, but they also work great in isolation. There would have to be one for each student; they log in, use it for the class, and dock it for the night.

Ryan: You are part of a Virtual Learning Community on Twitter and also blog about math education. How has being a part of that community improved your teaching?

Nathan: Edublogging began as a personal documentation system, and has become so much more. It provides an authentic audience that our students so desperately need. Twitter is the single best decision of my teaching career. It allows me to see what other distinguished educators are doing quickly and effortlessly. It starts the wheels turning, and provides a support system throughout the process of implementation. I cannot stress enough the importance for teachers to be digital citizens. Unless we become one, we will never understand our students as digital agents.



As you can see, Nathan is doing exceptional work and is a leader in Problem Based Learning in our province. I hope you have enjoyed reading his responses and are left feeling inspired that this individual is teaching our youth! He has left me with a few things to chew on and consider as I move forward as an educator and a designer.

Follow Nathan on Twitter @NatBanting to get updates and links to his work.

Wednesday, June 20, 2012

My new E-Word: Trying to make sense of it

I recently learned about a word that I never really considered before. I had heard it in passing, but never took the time to explore it. It's very long and funny sounding, so I just glossed over it. The word is Epistemology. What is it? What does it mean? Why should I care? And how does it affect my practice?

Please, note that this is new to me and I may be way off base with some of this. Please, discuss any errors in the comments below - of course, I could only be wrong if knowledge is external to the learner!

What is it?
To me, epistemology is essentially a worldview, or perhaps something more specific than a worldview: it is how one views the construction of knowledge in the world.

What does it mean?
There are three main epistemologies: objectivism, pragmatism and idealism. Again, these are all words that I knew but never really took the time to explore, especially pragmatism and idealism.

Essentially, a person with an objectivist epistemology believes that knowledge is real and exists in reality. As learners, we learn by experiencing and understanding the content that is out there. I compare this to objective questions on an exam; there is a correct answer out there that can be achieved.

I will jump to the other end of the spectrum to idealists. Idealism could also be called subjectivism. The idea behind idealism is that the learner constructs knowledge. The way I picture this is that there is "stuff" in the universe and an idealist would say that each person interprets and makes sense of this stuff differently. I relate this to scientific models; for example, in science we make theories to try to explain things that we can't see or experience. An idealist would say that every person does this with everything they experience; the learner has an experience and makes their own meaning or model to understand it, but the crazy part is that there isn't a correct model - everyone's model is their own reality. 

Pragmatism falls in the middle of idealism and objectivism. Pragmatism states that there is an external reality out there, but we can't experience it directly. We still interpret things in order to make meaning, but there is meaning out there and it is subject to change.

Why should I care?
Good question! As an educator, it is important to care about this because it really shapes our teaching. If we are hardcore objectivists, then that will show in our teaching. I think the roots of instructional theories were all based on an objectivist epistemology, e.g. "There is knowledge that we as teachers know and we need to fill your heads with it." If, on the other hand, the teacher is an idealist, that would have major implications for instruction and evaluation. How do you evaluate if you believe that all knowledge is created and relative to the learner? These questions have huge implications for our practice!

How does it affect my practice?
Those of you who have been reading this blog, know that I have been focusing on blended learning and more specifically flipped teaching. Where does flipped teaching fit into an epistemology? I would have to say that at its very nature, flipped teaching comes from an objectivist epistemology. If I don't believe that there is specific knowledge out there that I can transmit, then I am wasting my time making a video. 

That notion makes me uncomfortable because I don't like the implications of being a total objectivist. Can flipped teaching be used in another way? I tend to feel that no, it can't. Flipped teaching is objectivist, BUT - I am so glad there's a but - flipped teaching can help open the doors to deeper learning, as explored in my post Flip This! The big picture. It is when students have time to explore big problems and issues that they will be able to "interpret" the external reality and move more towards pragmatism.



Here's a big question that I will leave with you: 
How do our students' personal epistemologies affect teaching and learning in our classrooms? 

Sunday, June 17, 2012

Flip This! The big picture

In my explorations of flipped teaching, I came across this outstanding post by Catlin Tucker called "Flipped Classroom: Beyond the Videos". In reading this post and the comments that followed, I was presented with some interesting thoughts about the flipped classroom.

One idea that stuck in my mind came from Expat Educator in her post http://expateducator.com/2011/12/29/can-all-classroom-lessons-be-flipped/. She states that "...educators [need to] start talking more about Flipped Lessons than Flipped Classrooms". This may seem like splitting hairs, but I think it is an important distinction. Teachers don't have to, and probably shouldn't, use the flipped format for all lessons. You don't have to flip everything; rather, flip the specific lessons that are appropriate to flip. I made this distinction myself when teaching this past semester. There was some content that I was comfortable having students view at home, but some content was much better to explore within the classroom.

Another idea that stuck out came from Catlin Tucker's post: she pointed out that Ramsey Musallam defines “flip teaching” as “leveraging technology to appropriately pair the learning activity with the learning environment". This is a much different definition than most people use for flipped teaching. It is important to note that this definition doesn't mention videos at all. Videos will not always be the most appropriate technology, and sometimes the most appropriate technology may not even involve a computer.

She also stresses that the goal of flipped lessons should be to shift from "consumables" to "produceables". In today's society, students need to be able to produce in order to be successful. Unfortunately, flipped teaching usually focuses on the consumable portion of the lesson. 

Catlin argues that in order to focus less on the videos and move more towards "produceables" we should:

1. Take advantage of the ready-to-use content available.
2. Don’t just show them.
3. Use the flipped model to create a student-centered classroom. 

My responses to these statements are:

1. I used my own content, but I do see value in using ready-to-use content. Many teachers are hesitant to use the method because of all the video creation involved. The nice thing about your own content is that it all comes from the perspective that you want to portray. As an educational technologist, I now recognize added benefits to using other content because it is an opportunity to teach media literacy skills to students alongside the lesson content. 

2. Involving higher order thinking skills into the at-home portion of the lesson is great, but this now leads back to the issue of students struggling to complete homework without a support network. Perhaps online discussions could be embedded into the at-home portion, or the students' struggles with that at-home portion could be used as part of Just-in-Time Teaching (see http://ryanbanow.blogspot.ca/2012/06/just-in-time-teaching.html). 

3. The flipped model must be used to create a student-centered classroom. If not, then it is not really changing anything. Jackie Gerstein's post The Flipped Classroom Model: A Full Picture does a great job exploring the full picture of flipped teaching. Her post includes a great visual that shows the four parts to a great lesson and shows that the at-home Concept Exploration portion is just one small part of the big picture. 



A good lesson begins with experiential engagement, then concept exploration (the flipped portion), then students' meaning making, and finally leads to students demonstrating and applying. If we just focus on the video-viewing portion without spending time on the other three portions, then we are not “leveraging technology to appropriately pair the learning activity with the learning environment".


The tone of this post may sound like I am talking down on flipped teaching. This is not the case, but rather I am critically exploring all sides of the method. I feel like perhaps I have jumped on the bandwagon without thinking enough about the "produceables" and the big picture of flipped teaching.






Saturday, June 16, 2012

Just In Time Teaching

Last post I explored the meaning of Blended Learning and decided that the definition needs to be fluid if we want to encourage instructors to move towards a blended approach - http://ryanbanow.blogspot.ca/2012/06/call-for-fluid-definition-of-blended.html. Another term I have been wanting to explore is "Just-in-Time Teaching" (JiTT).

The best definition I found for JiTT is from the Just-in-Time Teaching Digital Library at http://134.68.135.20/jitt/what.html. G. Novak defines JiTT as: 

a teaching and learning strategy based on the interaction between web-based study assignments and an active learner classroom. Students respond electronically to carefully constructed web-based assignments which are due shortly before class, and the instructor reads the student submissions "just-in-time" to adjust the classroom lesson to suit the students' needs. Thus, the heart of JiTT is the "feedback loop" formed by the students' outside-of-class preparation that fundamentally affects what happens during the subsequent in-class time together.

I really like the idea behind this definition, but I recognize the struggles with implementing it. As teachers we love to have a lesson plan in place often well in advance, but JiTT forces us to think on the fly and design instruction at the last minute. The last minute is key to this method being effective because as the teacher you want to know exactly where the students are at when they enter your classroom. Using the web to collect this information helps quickly collate the data from everyone rather than just from the few students who will speak up in class. I think this data would be best kept anonymous.

How do you motivate students to complete the questions if the data is anonymous? Student motivation would be a major stumbling block to putting JiTT into practice. Unfortunately, many students only want to do work that is for credit. I think the way we get around this issue is by making sure that the activities we have students complete are meaningful and important for their learning. There may not be a score or number attached to them, but if students can clearly see how important these web-based questions are to the actual lesson planning, then they will be more inclined to do them. Therefore, the ball is back in the instructor's court. If you are asking students to do these questions, then you need to actually use the data and not just present the same old lecture.
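The feedback loop Novak describes - collect responses shortly before class, then scan them for patterns - can be sketched in a few lines. This is purely my own illustration of the idea, not software from the JiTT project; the data format (a list of question/answer pairs) is an assumption.

```python
from collections import Counter, defaultdict

def collate(submissions):
    """Group anonymous pre-class answers by question and rank the
    most common responses, so the instructor can see at a glance
    which ideas or misconceptions to address in class."""
    by_question = defaultdict(Counter)
    for question, answer in submissions:
        by_question[question][answer.strip().lower()] += 1
    # For each question, list answers from most to least common.
    return {q: counts.most_common() for q, counts in by_question.items()}

# Hypothetical pre-class submissions:
submissions = [
    ("Q1", "the video"), ("Q1", "friction"), ("Q1", "friction"),
    ("Q2", "not sure"), ("Q2", "gravity"),
]
report = collate(submissions)
print(report["Q1"][0])  # the most common Q1 answer with its count
```

Even a rough collation like this gets at the point above: the data from everyone, not just the vocal few, shapes the lesson.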

Student response systems would be another example of an opportunity to use JiTT. Tools like SMART Response (http://smarttech.com/response) are becoming more and more popular these days, and we as educators need to meaningfully use the responses that these devices collect. This is hard to do! I have been having students answer multiple choice questions during my lessons through the use of response cards for the past few years, and it is great when you see that the entire class gets the question right, but what do you do when they all get it wrong? What about when only a handful of students get it right or wrong? This requires major flexibility as educators because you can never really predict the responses. You can't simply move on; it needs to shape your lesson going forward. You also need to ask questions that go deeper than just basic knowledge in order to find out if the students actually understand the content. These questions are hard to ask in a multiple choice format!
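The on-the-spot decision described above - moving on when everyone gets it, doing something different when many miss it - can also be expressed as a rough rule of thumb. This sketch is my own illustration, not a feature of SMART Response, and the thresholds are assumptions loosely based on commonly suggested peer-instruction guidelines.

```python
from collections import Counter

def tally(responses, correct):
    """Tally multiple-choice responses and suggest a next step.

    responses: list of answer choices, e.g. ["A", "B", "A", ...]
    correct:   the correct choice, e.g. "A"
    The thresholds below are illustrative, not prescriptive.
    """
    counts = Counter(responses)
    share_correct = counts[correct] / len(responses)
    if share_correct >= 0.7:
        action = "move on (briefly confirm the reasoning)"
    elif share_correct >= 0.3:
        action = "pair up for peer discussion, then re-poll"
    else:
        action = "reteach the concept before continuing"
    return counts, share_correct, action

counts, share, action = tally(["A", "B", "A", "C", "A", "B"], correct="A")
print(counts)           # the response distribution
print(round(share, 2))  # fraction correct
print(action)
```

The hard part, of course, is not the tally but having something meaningful ready for each branch.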

Here's a scary thought: When using JiTT, what do you do when the technology fails? As instructors we are depending on student responses to shape our lesson. What happens when the data doesn't show up?! Do we need a backup lesson in our pocket? Thanks to Faron Hrynewich for bringing this to my attention in his post http://faronatetad.blogspot.ca/2012/05/mobile-learning-disuptive-innovation-or.html.

Thanks for reading! If you have ideas for different ways to implement JiTT or thoughts on this topic in general, please share them below.



Wednesday, June 13, 2012

A Fluid Definition of Blended Learning

"93% of higher ed instructors and admin say they are using blended learning strategies somewhere in their institution. 7 in 10 expect more than 40% of their schools’ courses to be blended by 2013." - Bonk, C. J., & Graham, C. R. (Eds.). (in press). Handbook of blended learning: Global perspectives, local designs. San Francisco, CA: Pfeiffer Publishing.

Blended Learning is a very popular phrase these days in education, but what does it really mean? It seems to have a really varied definition depending on who you talk to and where you look. 

There seem to be three main types of courses these days:
  • face-to-face
  • online
  • blended
Face-to-face are courses where the instructor and students meet regularly for classes. Online classes have all of the content delivered online. Blended courses feature some sort of blend between the two. In most cases it would be a course that offers some face-to-face interaction and then other portions online. 

This is where the questions creep in about Blended Learning. What do you have to do online for it to actually be a blended class? Is posting course notes online for students to read blending? Is posting videos blending? Is having an online discussion board blending? Is having students explore content through computerized simulations blending? What about having students complete surveys and questions online? What if you only do one of these things, would it be considered a blended course? I don't think there is one definition that you can find and use to discern whether or not something is blended. 

The lines between blended and not blended will also be dependent on the person who is defining it. For example, someone with a high level of media literacy or experience teaching with technology may not consider delivering a course face-to-face and then using an online discussion board sporadically to be blending, whereas someone who is using a discussion board for the first time in an educational setting will undoubtedly say they are blending the course.
 
As an educational technologist, I think I need my definition of blending to be fluid as well. To educators who are just beginning their journey with blended learning, I do not want to portray blended learning as something that must contain a discussion board, online content delivery and synchronous online elements. If I start with that definition it will likely scare them away from the idea entirely. I want to begin with a smaller definition to allow them to begin the journey. I want them to step outside of their comfort zone just enough to discover the benefits of blended elements and then we will work to push the boundaries even further. As they move along, I will fluidly change my definition of blending to continually raise the bar. 

A friend passed along an infographic from http://edudemic.com/ on blended learning that explores even more types of blended learning.
One of the models described on the infographic that stood out to me was the Flex model. This model of blended learning has all of the content delivered electronically, but the instructor is available for extra support as needed. I don't think that fits most of our definitions of blended learning, but it goes to show how widely used the term is.

In summary, blended learning means many different things to different people, but that is OK. As educational technologists we need to support any use of "blended learning" and work to bring educators along on our journey of using blended environments to improve the learning experiences of students.

Do you agree or disagree with this? Should we have one clear definition? Please, leave your thoughts below.

Monday, June 11, 2012

More Than a Feeling: In Search of Hard Evidence

After my last post, I was challenged to think about convincing university faculty to apply Flipped Teaching. This is something I plan to do because I have seen first-hand, both in my grad courses and as a teacher, that flipping a class has many positive effects. Unfortunately, these effects are just a "feeling" for me and I don't have any hard evidence. So I went on a journey to find some.
I searched online and found http://flippedclassroomdata.blogspot.ca/ which has a collection of data about flipped teaching. After parsing through the data, I found that there was no data that showed student learning outcomes.
I then found a post on the University of Wisconsin-Madison website called "How 'Flipping' the Classroom Can Improve the Traditional Lecture". The article originally comes from The Chronicle of Higher Education. Reading through it, I was unable to find any solid numbers - which is too bad because I am a bit of a mathematician - but I did find some valuable information.
"Research by Ms. Rhea and two colleagues suggests that Michigan's teaching methods have led to greater gains in conceptual understanding. The techniques have been lauded by the Association of American Universities, among others.
In 2008, Michigan gave concept inventories to students before they started calculus and after they finished, and calculated the difference relative to the maximum gain they could have made. Students in Michigan's flipped courses showed gains at about twice the rate of those in traditional lectures at other institutions who took the same inventories.
The students at Michigan who fared worst—a group of 12 who were at risk of failing the course—showed the same gain as those who demonstrated the largest increase in understanding from traditional lectures elsewhere."
I looked for Ms. Rhea's actual research, but was unable to find it. The results stated here are quite impressive:
  • Gains in conceptual understanding at twice the rate 
  • The worst students showed gains as large as the biggest gains from traditional lectures elsewhere
If you read the article, you will notice that there are some challenges with measuring student gains. Students are not likely to show large gains in basic knowledge; it is when you start to look at their deeper understanding of concepts that you will start to see a difference. In order to truly measure the gains, you have to use a high-quality assessment tool.
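The Michigan measure quoted above - the difference between pre- and post-test scores "relative to the maximum gain they could have made" - is what physics-education researchers usually call the normalized gain. A minimal sketch of the calculation (the scores below are made-up examples, not data from the Michigan study):

```python
def normalized_gain(pre, post, max_score=100):
    """Normalized gain: the improvement actually achieved as a
    fraction of the improvement still possible after the pre-test."""
    return (post - pre) / (max_score - pre)

# Hypothetical concept-inventory scores, not from the study:
print(normalized_gain(pre=40, post=70))  # 0.5: half the available gain
```

One nice property of this measure is that a weak student moving from 20 to 60 and a strong student moving from 60 to 80 both score 0.5, which is what lets the article compare "worst" and "best" groups on the same scale.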

For faculty there are downsides to using a flipped model. The same article states that:

"Harvard colleagues have tried flipping, [...], but few have stuck with it. It demands that faculty members be good at answering students' questions on the spot, even when their misconceptions are not yet clear because they are still processing the information.

It can also be very labor-intensive for faculty members who do not have teaching support [...] if it requires a professor to read questions that students submit before class (which is characteristic of just-in-time teaching)...

...But her chief critique is based on the intensity of students' responses. The average score on a student evaluation of a flipped course is about half what the same professor gets when using the traditional lecture."

The downsides for faculty are:
  • forces the professor to be much more flexible and adaptable 
  • requires extra time before classes   
  • student evaluation of faculty is often lower - likely due to the students being uncomfortable with something "different"
These are going to be definite roadblocks when convincing faculty to implement such a model. 
In response, one of the proponents of flipped teaching commented on the student evaluations by saying:
"Liking the class is ultimately beside the point, Mr. Mazur says. He says his results from using peer instruction show that [...] nonmajors who take his class outperform physics majors who learn in traditional lectures.
"You want students to like class, but that's not the goal of education," Mr. Mazur says. "I could give them foot massages and they'd like it."

I continued to search and found some more soft data to support flipped teaching: a press release posted May 15, 2012 on The Business Journal's website, titled Sophia Survey Finds Student Grades Improve When Teachers "Flip" Their Classroom. In it, I found that "More than 85 percent of teachers who have used the "flipped" classroom model said they saw improvement in student grades, according to a recent study of more than 400 teachers".
From the data I have found, I would say that there are indications that Flipped Teaching does improve student learning. Opponents may argue that the improvement may not actually be from the videos or out-of-class portion, but rather the improvements could be from the new use of class time. Michelle Naidu argued this in my video interview Flipped Pessimism: What are the opponents saying?
At the end of the day, convincing faculty to adopt this model will be a slow process. It will be a tremendous amount of work and a major shift in philosophy. It is imperative that proponents of the method begin collecting really strong quantitative data to show the gains. Without hard evidence, the positives of Flipped Teaching are just a feeling.




Student Thoughts on Flipped Teaching

A classmate of mine passed along this post "Reflecting on the flipped classroom" from Stacey Roshan at The Daily Riff http://www.thedailyriff.com/articles/reflecting-on-the-flipped-class-932.php. This post will be a collection of my thoughts from reading this article.

As I read through Part 1 of the article, a few statements stood out to me. Stacey made a clear statement that I think is sometimes lost on those who are against using Flipped Teaching: "In AP Calculus, I have a large amount of material to get through and I'm constrained by the testing calendar." I know this statement is obvious, but at the end of the day it is our reality as teachers. Our classes are full of required content, and even though we may think that learning through constructivist methods, such as inquiry, authentic learning and discovery, would probably serve our students best, we simply do not have the time to explore content in that way. Because of the time issue, Flipped Teaching becomes extremely relevant and useful. 

Another idea from the article that I latched onto was the term "supported failure." As teachers we all know that learning takes place in the failures, and thus we want to structure our classes so that students fail on certain things in ways that lead to later success. Roshan argues that Flipped Teaching is a great way to achieve "supported failure" because students can always run into their roadblocks and failures while in the safe environment of the classroom. Her article focuses on AP Calculus students, who, as she explains, experience a lot of anxiety and pressure. This anxiety and pressure can be alleviated by allowing the failures to take place in class rather than at home in isolation. 

The second half of the article is where things get interesting (http://www.thedailyriff.com/articles/students-talk-about-the-flipped-class-survey-results-933.php). Actual student feedback is provided. 

As I read through her students' comments there were some that made me go "ah-ha" since I had considered them previously:
  • "The ability to know the amount of time you would need for calc homework" This comment is something that I had overlooked when thinking about Flipped Teaching, but it is significant for students. We all know that teenagers are generally busy and in many cases end up pushing homework to the last minute. And when students sit down to do some homework, they may find that something they thought would take 15 minutes turns into two hours as they struggle through it. That can be very discouraging and totally throw off an already hectic schedule. A nice aspect of Flipped Teaching is that the students always have a pretty good idea of how long they will need to spend on that class in the evening. For teenagers always trying to cram too much into their lives, this is a huge benefit and may lead to fewer late nights
  • many of the students commented on the ability to set their own pace with their learning. This was a very common theme of the comments. It speaks to the students' meta-cognition, allowing them to think about how best to structure their own learning rather than accepting the one-size-fits-all lecture in class. Although the videos come in only one form, the ability to rewind, pause and review seemed to give students a feeling of individualization
  • "Sometimes I feel as if the way the class is run causes me to take longer to understand the material than if it was being taught in class." In my experience using Flipped Teaching, this seems to be a common theme as well. Students don't mind using the videos and like not having conventional homework, but at the end of the day they still feel that they would have understood the content more easily with the teacher lecturing in class rather than through video. To me this seems to stem from the students being comfortable with a certain style of teaching and feeling discomfort with a different method. Although I disagree with the comment, it is something that needs to be considered
  • "Make more review videos as a class" Students want to be a part of the video-making process. This ties back to my previous post on the Student Led Flipped Classroom
  • "I think that the format of the class helped me to get more comfortable working with classmates and asking questions. I got so used to working on math problems at home, and it was nice to have the support of classmates." This is a very positive comment on some added benefits of Flipped Teaching. It allows for students to improve their skills with working together and learning socially rather than in isolation
  • "It didn't help with taking notes/paying attention because if I missed something or was not very focused in watching a video, I could just re-watch the section with no real consequence." This is an interesting comment. The student is saying that Flipped Teaching didn't help to teach them how to pay attention in class, which is a skill that he/she sees as important for post-secondary. Sometimes something we view as a positive can have some unexpected negative results
If you have any thoughts on this, or have links to articles/data that have looked at student feedback on Flipped Teaching, please share them below. It is really important for us to explore the method from the student perspective.


Thanks again for reading.


Sunday, June 3, 2012

Student Led Flipped Teaching

Back in May I presented at SUM Conference on my exploration of Flipped Teaching in my math classroom. At that time a question was posed to me about having the students create the videos rather than the teacher doing it. At the time, I said that it would be a good idea, but one that would be extremely hard to plan.

I read this post by Paul Lehmkuhl at http://mathandedtech.blogspot.ca/2012/06/where-does-student-motivation-come-from.html and viewed David Wees's presentation on "Computer Based Math" available here: http://davidwees.com/content/presentations. David Wees suggests that we have students use computers to create math and he shows a video that students have created to make a problem meaningful. We have also been studying constructivism and motivation this week in ETAD 802. This has brought me back to thinking about student led Flipped Teaching.


What would that look like? How would it be organized?


My thoughts on these questions are that the entire course would have to be structured with this method in mind. At the beginning of the semester, students would be assigned an outcome to study on their own and become the "experts" on. Students would have access to the textbook and any other resources and support materials. At this stage in the process, the teacher would be available to assist the students in becoming experts, but there would be no direct instruction.  


Once the students have become competent with the content, they would create the videos and other resources for that outcome. When the students have finished compiling the resources, then the entire class could proceed through the content. Two different approaches could be taken here: 
  • all students work on the same unit together as a full class group
  • students work through the various units on their own or in groups
Now that I have thought about this some more, I feel that it could work. I am still left wondering if this would be a good method for teaching a course. Would students be motivated to construct the content? Would the student created content be up to par? Would the other students be more engaged by the content since it is created by their peers? 

Since this approach would require complete buy-in from the teacher to commit the semester, it seems like a bit of a gamble. Has anyone tried something like this? Would there be a better but still similar approach that could be taken?

Thanks for reading!

Monday, May 28, 2012

Flipped Pessimism: What are the opponents saying?

Tonight I was able to sit down and have a Skype conversation with Michelle Naidu (@park_star on Twitter). You can also read her on her blog at http://meandthedoor.wordpress.com/. 


Michelle is a high school math teacher in Saskatoon and has described herself as a "Pessimist" regarding flipped teaching. I am intrigued by this statement, so I was excited to talk with her.




These are the questions I asked her:

1. You call yourself a pessimist regarding Flipped Teaching. Can you explain this statement?
2. Do you see any benefit to using a flipped approach?
3. How do you see technology being best used in math education? 
4. Do you use a wiki as a class textbook?

Unfortunately, I ran into major technical glitches in this recording! I had planned to get into Khan Academy with her, but did not have the time.


Thanks again to Michelle for her time.


Please, post your comments and thoughts below.

Wednesday, May 23, 2012

Flipped Class and Substitute Teachers: The Role of the Teacher

This post comes directly from my experiences in the past week.

My wife and I planned a vacation during the school year, which maybe wasn't the smartest plan, but it was supposed to be three days away from the classroom. Some unexpected things came up and I actually was away from my students for four and a half days of class. This can obviously lead to some learning issues, especially in secondary level math and science where substitute teachers are hard to come by.

I saw this time away as an exciting experiment with flipped teaching. Before I left, I decided that my best plan of action would be to use a flipped teaching method while I was away; therefore, I created videos for most of the examples to be covered in class. This way the substitute would not have to deliver instruction in class; rather, they could show my videos in class or ask the students to view them at home. In the classroom, the substitute's main task would be to encourage students to explore problems, work together and help as best as they can. My vision was that the students would really band together to reach a high level of understanding.

Before I left, I thought this all sounded great! Unlike other times I have been away, this time would be different. There would be no excuses. Unfortunately, the results weren't quite as envisioned. The results were probably a bit better than normal, but the level of student confidence did not approach the level I had hoped. Where did this plan break down?

This leads me into thinking about how important the teacher really is in a flipped or blended environment. With only the videos to watch, the students were able to view them but failed to really make meaning of them. When I am in class:

  • I spend time preparing the students to view the videos
  • We explore the content together before unleashing the videos on them. This seems to be a really important step in the flipped process
  • When we return the next day, I always recap the ideas from the videos. Again, this seems to be an important step in the students making meaning of the videos. 
  • The next step in the process is the students actively engaging in the content; this is the main goal of the flipped classroom - more time for students to engage in content. 
How effective is this engagement without a teacher to guide the process? Based on this past week, my answer is that it is not very effective. Are there ways I could have improved my planning to make it more effective? Is the teacher vital to the process? Most students work without much if any assistance from me when I am in class, so why are things so different when I am away?
After having taken a couple of university-level online courses and now blended courses with an online or face-to-face classroom component, I can see how important the teacher is to the process. At the very least, some contact time with the teacher gives you confidence and helps you feel like you are on track. In my experience, I feel apprehensive about the content until I can communicate with the teacher and feel like I am on the right track. Is this the main missing link when using flipped teaching with a substitute? If so, what other ways can we build feedback into our flipped method so that students can feel that sense of confidence without the teacher's presence? Can the substitute's role be to provide that confidence, even if they are not confident with the material?

Flipped teaching seems like a great method to use when you are away from the classroom, but there seem to be vital pieces in the flipped process that are missing without the teacher's presence. If you have had good experiences with flipping a class with a substitute, please share them with me in the comments below. If you have any thoughts on the topic, leave them below.

Thanks for reading!

Wednesday, May 16, 2012

Where I'm at with Flipped Teaching

I first learned of Flipped Teaching in April 2011. When I first heard of it I was excited by the possibilities, but also grounded by the question, "What if the students don't watch the videos? Then what?" I kept the idea in my head as I finished the school year. At the end of the year, I discussed the concept with one of my classes and most students were apprehensive about the idea of flipping math classes. Due to this apprehension, I put off implementing it until I could learn more.

As I went through the next semester and taught more of the new math curriculum, I found that I was having more and more trouble finding time to explore and investigate the content with the students, complete examples for the students that need it and still provide time for the students to practice. The student practice was almost always pushed to the last few minutes of class or completely at home. This led to a lot of students never really engaging in the content on their own, which in my opinion, is when the actual learning takes place. Students need to struggle with the content in order to learn.

Due to this increasing struggle to find time, I decided that I couldn't wait any longer; I had to try to implement a flipped classroom. I began flipping some math lessons in the second semester of this school year. My method of flipping is as follows:

  • we still explore and investigate the content in class
  • the regular lecture style examples are presented in video format for students to watch at home
  • the next day in class we do a quick recap of the content and then students are able to engage in practice in the classroom surrounded by peers and the teacher!


Now when students hit a roadblock in their work, they are able to talk to each other or me to get through it, rather than being alone at home and simply giving up. In terms of timing, we are not spending two days per lesson; instead, we work on practice for the first half of class and explore/investigate for the second half of each lesson. Then students go home, watch the videos and return the next day ready to practice. Here is a link to one of my class wikis that I use to host the videos: http://mathf20.wikispaces.com

Here's an example video:


Overall, I am very satisfied with my experience with flipped teaching so far. All students are now actually working on assignments in class and are able to get the assistance they need. This is a huge improvement over the past, where many students wouldn't complete any questions at home, which meant they were never actually doing any thinking on their own. Of course, there have been questions, issues and concerns that have arisen from this flipped method too. Examples of these are:

  • student and parent concerns over trying something new
  • some students not watching the videos
  • does this actually fit the inquiry based math curriculum?
  • the extra time it takes me outside of class to make the videos
  • student outcomes seem to be improved, but the students this semester are also older and many have already taken another 20-level math course
  • how would I implement this in other subject areas?
  • can the flip be implemented within class time?
  • how do I best use flipped teaching to differentiate instruction?


That is where I am at today. I am using my basic flipped model for most of my classes and I am happy with the results. Some students have expressed how much they like the videos and other students have expressed their disdain for them.

Over the next few weeks in ETAD 802, I plan to research and explore current articles and research on flipped teaching and other forms of blended learning. I will be posting a new blog post every few days.

If you have used flipped teaching, have suggestions for good articles or have questions or concerns regarding flipped teaching, please leave me a comment below. I would love to hear from you and learn with you!

Saturday, February 4, 2012

Reading Report 4 – “Usability Testing: What Have We Overlooked?”


Citation:
Lindgaard, G., & Chattratichart, J. (2007). Usability Testing: What Have We Overlooked? CHI 2007, (pp. 1415-1424). Retrieved from http://www.itu.dk/~hulda/Usability/artikler/p1415-lindgaard.pdf


Key words:
Usability testing, Metrics, UEM (Usability Evaluation Method), participant recruitment, number of users, number of tasks

This article is about (main discussion):
The article describes research into usability testing to determine whether "The Magic Number 5" for users is accurate. The research tests two hypotheses:
  1. That there is a correlation between number of users and the proportion of problems found.
  2. That there is a correlation between number of user tasks and the proportion of problems found.
In order to study these two ideas, the authors did not perform usability tests; rather, they were allowed to use the raw data from a previous study (CUE-4 from CHI 2003) into the number of users needed. They took data from nine of the usability teams. The nine teams all completed usability tests on the same hotel booking website, but they did not have the same user tasks or number of users. Lindgaard and Chattratichart analyzed the number of users, the number of user tasks and scenarios, and the number of problems that each team identified (p. 1417).

Upon completing their statistical analysis they found:
  • The first hypothesis that there is a correlation between the number of users and the proportion of problems found was not supported.
  • The second hypothesis that there is a correlation between the number of user tasks and the proportion of problems found was supported.
The authors argue that:
  • There has been too great of a focus in ID on the number of users when the focus should really be on the number of user tasks and also the quality of participants.
  • The Magic Number of 5 does not hold. In this study, “the percentage of problems found by the nine teams ranged from 7% to 43% - nowhere near the predicted 85%. The argument [the 5 users will find 85% of the problems] is therefore not upheld by the present data” (p. 1422)
  • Giving users a persona to imagine during the tasks is helpful because “[t]he persona might have helped their test users place themselves in the real users’ shoes and hence carry out the required tasks the way real users would do. As a result, Team S [the only team with a persona] performed better than expected” (p. 1423)
  • Rather than the number of users, the focus should instead be on “careful participant recruitment […]. It [also] pays off to give many sets of user tasks to a small number of users in a usability test rather than giving many users the same limited set of user tasks in a usability test” (p. 1423).
The authors conclude that:
Fewer resources and less research should be spent on studying the number of users to use for usability testing. Instead, the focus of further research should be on "the role of user tasks on improving usability testing approaches as well as into the importance of recruitment of test users" (p. 1424).
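For context, the 85% figure the authors test against comes from Nielsen and Landauer's problem-discovery model, which assumes each user independently finds a given problem with some probability p (commonly quoted as about 0.31). A minimal sketch of that model, which is what produces the "Magic Number 5":

```python
def proportion_found(p_single_user: float, n_users: int) -> float:
    """Nielsen & Landauer's problem-discovery model: probability that a
    problem is found by at least one of n independent users, when each
    user finds it with probability p_single_user."""
    return 1 - (1 - p_single_user) ** n_users

# With the commonly cited p = 0.31, five users find about 84% of
# problems, which is where the "five users find ~85%" claim originates.
for n in (1, 3, 5, 9):
    print(n, round(proportion_found(0.31, n), 2))
```

Lindgaard and Chattratichart's point is that the independence and fixed-p assumptions behind this formula break down in practice: in their data, teams of five found anywhere from 7% to 43% of problems, far from the predicted 85%.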

References: 16, (1981-2004)

Relating this to our project:
  • When creating our usability tests, we should focus on increasing the number of user tasks
  • The quality of the user tasks and scenarios must also be a consideration
  • It will be more fruitful to give a higher quality usability test to a small number of users
  • Care and attention must be taken to ensure that we have a good variety of users. The users should have varying levels of computer savvy and come from different age groups. We should not focus on just one group of users, because different groups may find different errors or problems. For example, a more tech-savvy user may uncover different usability issues than a less skilled user would; they may approach the slide collection in a much different way
  • It may be beneficial to have the users take on a persona while completing the user tasks

Reading Report 3 – “Usability testing for web redesign: a UCLA case study”


Citation: Dominique Turnbow, Kris Kasianovitz, Lise Snyder, David Gilbert, David Yamamoto, (2005) "Usability testing for web redesign: a UCLA case study", OCLC Systems & Services, Vol. 21 Iss: 3, pp.226 - 234


Keywords: usability testing, library web design, card-sorting, surveys, think-aloud, structured analysis

Summary and Arguments made in the article:
This article describes the process and usability testing that was done to redesign the UCLA Library website. The original website had many issues that were identified by users and also employees of the libraries. Some of the issues included:
  • Different graphics and layout on different departments’ websites
  • Inconsistent nomenclature and heavy use of library jargon, rather than terms that the users would actually use
Usability Testing Methodology Employed
  • Structured analysis – an inventory of all the different library pages were taken and entered into a spreadsheet. This allowed for easy comparison of the different pages.
  • Surveys – an online survey was created, and anyone who completed it was entered into a $250 draw. There were 300 responses, but since the survey was not well designed, most of the information was not useful. The most useful information came from the open-response questions.
  • Second survey – a second survey was done to investigate the terms that should be used on the library website. This survey was done on paper and distributed evenly at the different library locations in order to ensure that the responses didn’t all come from the same group of users.
  • Card-sort protocol – this was done to decide how to organize the website categories before thinking about the actual design. From the structured analysis the team came up with 76 “essential links.” These links were put on cards and users were asked to organize the cards into meaningful groups and name the groups. UCLA used 40 participants in the card sort, although research suggests 15-20 or 30 is the ideal number. To recruit users, they offered gift bags. It took two weeks to find enough participants. Users were given one hour to sort the 76 cards. During the initial tests, they found that users used a wide variety of category titles; therefore, they needed to standardize the terms. To do that, they did a second card sort protocol.
  • Think aloud protocol – from the previous stages enough info was gathered to build a prototype website. UCLA decided to use ten participants in the think-aloud, even though research suggests that five users is enough. They used more than one participant from each user group, for example undergraduate students and graduate students. Users were given a list of tasks to complete on the website. Two facilitators were used, “one read the questions and interacted with the participant while the other recorded the participant’s actions, including: the path taken to find the answer; anything said while navigating the site; and any observations of the participant’s behavior” (p. 232). After the task, users were asked, “their general impression of the site, suggestions for the designer, or any other comments about the website. Finally, there was a brief survey about the participant’s previous use and knowledge of the library and the library’s website” (p. 232). No major changes were made to the site based on the think-aloud.
Following the think-aloud usability testing, the full website was developed. A logo was designed and a standardized page template was created to be used on all pages. Once the site was created, the team solicited feedback through emails to faculty, comments through a link on the new page, and library staff feedback sessions (p. 234). For any revision made to the site, a think-aloud protocol will be used.
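The article doesn't describe how the card-sort data was analyzed, but a common approach, which we could borrow for our own card sort, is to build a co-occurrence count: for each pair of cards, how many participants placed them in the same group. A sketch with hypothetical library-link cards:

```python
from collections import Counter
from itertools import combinations

def co_occurrence(sorts):
    """Count, for each pair of cards, how many participants placed them
    in the same group. `sorts` holds one card sort per participant;
    each sort is a list of groups, and each group is a list of cards."""
    counts = Counter()
    for groups in sorts:
        for group in groups:
            # Sort each group so every pair is keyed consistently.
            for pair in combinations(sorted(group), 2):
                counts[pair] += 1
    return counts

# Two hypothetical participants sorting four links:
sorts = [
    [["Hours", "Locations"], ["Catalog", "Databases"]],
    [["Hours", "Locations", "Catalog"], ["Databases"]],
]
print(co_occurrence(sorts)[("Hours", "Locations")])  # 2
```

Pairs with high counts are strong candidates to sit under the same category on the redesigned site, which is essentially what the UCLA team was after when they standardized terms across their two card sorts.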

References: 3, (2000-2004)

Key Ideas and their relation to our project:
  • Surveys need to be well-made in order for them to provide useful information. Based on this Case Study, the open-ended questions provide more valuable data than selected responses.
  • It may take a while to find participants for testing; therefore, we need to keep this in mind when planning our timeline for prototypes
  • Card-sorting could be used to help with categories or set names
  • Use a variety of users for the think-aloud. Make sure not all users are of the same type
  • Give users a list of specific tasks to complete while doing usability testing. For example, we could ask them to find a specific slide, find slides for a specific location, etc.


Sunday, January 22, 2012

Reading Report 2: What Drives Content Tagging


Citation: 
Nov, O., Naaman, M., & Ye, C. (2008). What drives content tagging: the case of photos on Flickr. CHI '08, (pp. 1097-1100). doi: 10.1145/1357054.1357225
Article can be found here: http://nguyendangbinh.org/Proceedings/CHI/2008/docs/p1097.pdf


Keywords: 
tagging, tags, Flickr, motivation, photo sharing, social presence

This article is about (main discussion):
This article summarizes a quantitative study of what motivates Flickr users to tag their photos. The article built on previous qualitative studies. The study looked for, and found, positive correlations in the following areas:

  • the more motivated users are to organize and communicate photo content for themselves (Self), the more unique tags they use
  • the more motivated users are to organize and communicate photo content for the Public, the more unique tags they use
  • the more groups that users are members of on Flickr, the more unique tags they use
  • the more contacts that users have on Flickr, the more unique tags they use
The study also tested the hypothesis that users who are more motivated to organize and communicate photo content to Family and Friends would use more unique tags. No positive correlation was found in this case; possible explanations were provided.

The authors argue that:

  • The level of users’ Self motivation will be positively correlated with their number of tags.
  • The level of users’ Public motivation will be positively correlated with their number of tags.
  • The number of contacts a user has will be positively correlated with the user’s number of tags.
  • The number of groups in which a user is a member will be positively correlated with the user’s number of tags. (p. 1098)

The authors make the following statements or cite the following references in support of their argument:
"We found that the levels of the Self and Public motivations, as well as the social presence indicators and the number of photos, were positively correlated with tagging level. In other words, Hypotheses 1, 2, 4 and 5 were supported. For example, The Public motivation is significantly correlated with the tagging level, and explains 2.25% (.150²) of the variance in it." (p. 1099)
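The "2.25% of the variance" figure is simply the squared Pearson correlation (the coefficient of determination): a correlation of .150 explains .150² of the variance. A quick check of that arithmetic:

```python
def variance_explained(r: float) -> float:
    """Share of variance in one variable accounted for by another,
    given their Pearson correlation r (coefficient of determination)."""
    return r ** 2

# The article's correlation of .150 between the Public motivation and
# tagging level corresponds to:
print(f"{variance_explained(0.150):.2%}")  # 2.25%
```

Worth noting when reading the results: a statistically significant correlation can still explain only a small slice of the variance, as is the case here.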

This positive correlation is best summarized in the following diagram from p. 1099:
The diagram shows a positive correlation between all factors except Family & Friends as the stated motivation for tagging photos.

In order to support the lack of positive correlation between the Stated Motivation of Family and Friends and the number of unique tags, the authors referenced interviews conducted by Ames and Naaman (Ames & Naaman, 2007). "The authors [Ames and Naaman] suggest that for the Family & Friends target of tagging, the Organization function was a relatively weak motivation; the stronger motivation stems from the Communication function (in other words, users added tags to describe images to family and friends, not to help them find images). The Communication function on Flickr is served by other means that pose an alternative to tagging (e.g., titles, captions, and sets). In addition, users may communicate about the photos to their friends via other, external means (e.g., email)." (pp. 1099-1100)

The authors conclude that:

  • "Enhancing users’ tagging (by encouraging the factors that give rise to it) may contribute to the success of such communities." (p. 1100)
  • The authors suggest that "it is advised that managers of collaborative content systems seeking to increase tagging activity [should] focus their communication and marketing efforts on those factors that have a strong impact on tagging level." (p. 1100)
  • Current content-sharing systems and newly developed ones should be "design[ed] ....in ways that maximize the opportunities for social presence, and expose the effects of joining groups and adding contacts." (p. 1100)
The authors feel that:
Given the growing use of tags, the designers of content-sharing systems need to understand what motivates users to tag and which motivations will increase tagging. The sustainability of these content-sharing systems will depend on that. (p. 1100)

References: 
13, 1968-2007



Ways to apply this to our work:
  • This article did not end up providing as much information as I would have liked about tagging. I will need to do further research on effective tagging
  • If we put the Baker Slides on a system with groups and contacts, then we should take time to find as many appropriate groups and contacts as possible for the SHFS to be associated with. This may help to drive external users to the content. It may also assist with using common or appropriate tags
  • It helped to give insight into the motivations for tagging, which we can use to help us choose tags