Increasing Class Discussions

A large body of research (Chapin et al., 2009; Daniels, 2002; Duff, 2002; Flynn, 2009; Mason, 1996; Spiegel, 2005) has focused on classroom discussions. I began experimenting with this technique in January, spending two weeks mapping and analyzing data from student-led discussions in five 9th and 10th grade World History classes (N=197). A sample of the students was divided into two categories: “frequent participants” and “non-frequent participants.” Over the course of the semester, they would be given a series of assessments, and the data would be analyzed to determine whether increasing classroom discussion has an effect on student achievement.

Discussions

After two weeks, I asked students to conduct a self-assessment of their participation in the classroom discussions. I provided direct instruction and a handout (based on Teaching the Core) that identified seven ways to participate in classroom discussions. Students were asked to reflect on their participation and give themselves a score from 1 to 10 on the quality of their work. The following 48 comments offer insight into what motivates students to participate, or not, in class discussions.

  1. I did not participate in class discussion because I had not watched the lecture yesterday therefore I could not talk about it.
  2. I didn’t participate because I raised my hand, but I never got picked.
  3. During the class discussion, I was listening to what they said and I was taking notes. I was going to say something but I guess someone beat me by saying what I wanted to say first.
  4. I wasn’t able to participate in class because I went to the restroom and by the time I came back they were talking to the inner circle. But I had a lot to talk about. Sadly, I didn’t get a chance.
  5. I was not able to participate because I was having trouble understanding the discussion.
  6. I didn’t participate until I became the one to ask questions. I had notes but my notes weren’t very detailed and I couldn’t answer the questions.
  7. I did not participate because I am uncomfortable with public speaking.
  8. I didn’t participate in the class discussion because I always participate and I want others to participate too.
  9. I did not participate because I had no information and I did not watch the lecture notes.
  10. I didn’t participate in the discussion because I cannot talk I’ve been sick and I have a bad cold. Although I wasn’t talking I was listening.
  11. I did not participate in this discussion because I did not have anything to say as I was not prepared with information.
  12. I am able to detect bad reasoning and distorted evidence. Some people might read something that you know is not true and you look for the true statement to see if they are correct or wrong.
  13. I didn’t participate in the class discussion because first of all I did have information for some topics but the host would pick someone else with that information that I had.
  14. I did not participate because I have not watched the video last night. I went to go watch it and it wouldn’t load. It would open then shut down. I was not able to watch it or get the notes from someone else.
  15. I did qualify my own views. I got my peers closer to the answer even though I got some wrong.
  16. I participated, but I kind of winged it. I saw/put myself in the other countries shoes to see what I would do, so when I did that, it made more sense.
  17. I did justify my own ideas and make connections to the topic because I talked about Japan’s invasion.
  18. During our class discussion I wasn’t called on but I helped explain what Staffon was trying to say about the treaty of Versailles and Germany.
  19. In response to Ruben’s question, I answered about one of Hitler’s points in the book he wrote.
  20. I had some notes to use during the discussion. I asked questions to clarify or restated the questions. I didn’t really get others into it, but I let others have a chance to answer.
  21. I read the questions for the inner circle.
  22. I need to speak a little more loudly and explain more what I’m trying to say. I also add more when people state their question or opinion.
  23. I participated in the Socratic seminar by announcing the questions to the outer circle. I also helped the inner circle on a question. I gave my opinion as well.
  24. I didn’t really answer anything but it was a learning experience for the things I did not know.
  25. I always ask questions to shorten down the answer.
  26. I participated by talking about it and answering the questions. Daniela asked and I finished answering.
  27. I wouldn’t get any points because I didn’t participate or answer any questions.
  28. I didn’t participate at all, well I mentioned something about the big four but it wasn’t noticed.
  29. I didn’t participate in the class discussion I should try to participate. I didn’t refer to evidence. I didn’t make a conversation.
  30. I used my knowledge for other discussions to answer the questions. I just answered the questions I knew and went from there. I should get a 7 or 8 out of 10.
  31. According to the lecture, I got evidence from the video lecture. In other words, I listened to the video and what the teacher says. In response to arguments question, I took notes from what people say. I think you should pass one by one and ask us questions to the students. I think you should grade us as helping each other by answering your complete question
  32. I won’t be able to participate in this class discussion since I had computer problems though now I will be able to look at lectures 1 and 2 and take notes.
  33. In this class discussion, I distributed a lot in the topic of the Treaty of Versailles. I told the class the four major problems of the Versailles treaty. I incorporated Jamilet into the conversation. I feel like I was well-prepared. I feel like I deserve a B due to my cooperation.
  34. This class discussion I did okay. I feel like I could have participated more than I did. I think I deserve a C for this class discussion because I participated and answered questions. I know I can do better for our next class discussion.
  35. I refer to evidence from the text under discussion and/or the research pertaining to the subject by using the notes. I’ve written down from the lecture and the knowledge I had. I move conversations by posing and responding to questions that relate to the broader themes or larger ideas by adding on to someone else’s comment.
  36. According to the Lecture 01 video, stated in my notes the causes of the Treaty of Versailles. Germany was the greater downfall to the treaty; their ships were at the bottom of the ocean after not agreeing to give it away. Their air force and submarines were delayed.
  37. According to the video lecture, I referred to the Versailles Treaty and how the DMZ (de-militarized zone) at the west of Rhineland is one of the things that would be a reason to start WW11.
  38. I didn’t participate, but I did pay attention to what others were saying.
  39. I didn’t participate in this class discussion because I didn’t have enough time to watch the whole video.
  40. I participated two times to the discussion, but I think I could have participated more. In the next discussion I will have written better notes and try and answer most of the questions or at least some.
  41. In this class discussion, I honestly didn’t contribute as much as I should have. But it is only because I did not watch the lecture last night so I didn’t really understand. I did listen and kind of understand now. I will go home tonight and watch 1st and 2nd lecture and will come tomorrow ready.
  42. According to the discussion I used my notes to help respond. And take notes of discussion. I asked some questions and responded to some. I do not actively import.
  43. I participated enough times in the conversation. I rephrased Jazmin’s quote and I should be graded with a B+. I didn’t talk during the conversation and I didn’t distract anyone.
  44. I responded to the question about the Treaty of Versailles. I said that the German army was forbidden to have tanks, an air force, and submarines.
  45. I hardly contributed in today’s discussion because all of the questions I knew were for the outer circle.
  46. I think I did a poor job. I could have contributed more and stated more points. I will use the common starters tomorrow during the discussion. I will take better notes and make sure everything is good and correct.
  47. I didn’t contribute in the class discussion but I will take notes on Lecture 2. I got you Dr. Petri. I promise. And I won’t be on my phone tomorrow.
  48. I was not participating today. But I did listen to how others talked and brought their knowledge and clarified others information into the Socratic circle. Now, I have more of an idea how to incorporate my ideas into the discussion.

Students Discussing

More analysis to follow.

Creating School Data Simulations

“Using data to drive instruction” can be as difficult to pin down as determining who on your teaching staff is an “innovative educator.” Educational leaders understand the basic definition of each term, but when they try to clarify what it entails in everyday classroom practice, the definitions become slippery and harder to articulate. Every teacher believes they use data to drive instruction, but the real question is: what data are they using? Classroom data, school-wide data, district-wide data, state or national data? And who should make the decisions about which data to use?

I believe this knowledge gap can be closed by creating simulations and trainings on the use of data in education. To do this, teachers must explicitly articulate hidden assumptions that they are reluctant to voice. A classic assumption in public schooling is that students need to be present in order to learn. While competency-based learning models are challenging this assumption, most school funding is predicated on average daily attendance. Educational leaders therefore treat improving attendance as an essential element of improving student achievement, but this assumption may not be accurate. The rise of blended learning calls into question the reliability of the Carnegie unit, as “seat time” in a traditional brick-and-mortar school becomes increasingly irrelevant for self-motivated digital learners.

Creating a School-wide Data Simulation

[Figure: School-wide GPA data]

Boudett, City, and Murnane (2005) suggest creating a graphical data overview and sharing it with staff. This launches an inquiry process as educators endeavor, individually and collectively, to interpret graphs, tables, and statistics. Examining the graph of school-wide GPA data above reveals that grades at this school are approximately normally distributed, which suggests that the instructional program is relatively sound. A disproportionate number of 4.0 students or of failing students might suggest that grades aren’t standards-based.
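
For leaders who want to build such an overview themselves, the chart takes only a few lines of Python. Here is a minimal sketch, assuming GPAs sit in a simple CSV export; the file and column names are hypothetical:

```python
# A minimal sketch of a graphical GPA overview. The CSV file name and
# "gpa" column are hypothetical placeholders for a real SIS export.
import csv
import matplotlib.pyplot as plt

with open("student_gpas.csv", newline="") as f:
    gpas = [float(row["gpa"]) for row in csv.DictReader(f)]

plt.hist(gpas, bins=[0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
plt.xlabel("GPA")
plt.ylabel("Number of students")
plt.title("School-wide GPA distribution")
plt.savefig("gpa_overview.png")  # the chart to share with staff
```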

[Figure: Students by GPA]

A deeper analysis of the above figure shows exactly how many students need to improve and by how much. The school can use this information to develop a better understanding of which students need basic-skills intervention and which students need additional motivation to complete schoolwork.
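
Continuing the sketch above, counting the students who need to improve is equally short. This again assumes the hypothetical student_gpas.csv export, and the 2.0 cutoff is an illustrative choice:

```python
# Count how many students sit below a 2.0 GPA and how far, on average,
# each has to climb. Reuses the hypothetical student_gpas.csv export.
import csv

with open("student_gpas.csv", newline="") as f:
    gpas = [float(row["gpa"]) for row in csv.DictReader(f)]

below = [g for g in gpas if g < 2.0]
if below:
    gap = sum(2.0 - g for g in below) / len(below)
    print(f"{len(below)} students below 2.0; average gap to 2.0: {gap:.2f}")
```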

[Figure: Grade-absence correlation]

After analyzing the grades given by each teacher in a school, leadership can conduct a grade-to-attendance correlation study. This data, whether aggregated by class or by school-wide GPA, can offer powerful student achievement information and push staff to ask: how can a student who misses 20 days of school still have a 3.5 GPA? Or, even worse, how can students attend school every day and still fail almost every class? Are the students who miss 80 percent of the school year on the path to dropping out, or are they homeless or caring for a terminally ill relative? Numbers have power, but we have to remember that our students are individuals. Sometimes this type of analysis lets us start a conversation that may be crucial in reaching a disaffected student.

[Figure: Grade-level attendance vs. GPA]

A school should examine whether attendance is a predictor of a student’s grade point average (GPA). The above figure shows the relationship between attendance rate and GPA for a sample of 259 students (N=259). The correlation was r = .371, a weak relationship, suggesting that school attendance is only loosely associated with a student’s academic achievement. Since several educational researchers (Bridgeland, DiIulio, & Morison, 2006; Fisher, Frey, & Lapp, 2011) have suggested that attendance correlates directly with student achievement, this instructional program may be inconsistent in measuring student achievement.
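
This kind of correlation study is easy to reproduce. The sketch below assumes a hypothetical CSV export with one row per student; the file name and the student_id, attendance_rate, and gpa columns are placeholders, not a real district format:

```python
# Minimal sketch of an attendance-GPA correlation study. The file and
# column names (student_id, attendance_rate, gpa) are hypothetical.
import csv
from statistics import correlation  # Pearson's r; Python 3.10+

with open("attendance_gpa.csv", newline="") as f:
    rows = list(csv.DictReader(f))

attendance = [float(r["attendance_rate"]) for r in rows]  # 0-100 scale
gpas = [float(r["gpa"]) for r in rows]                    # 0.0-4.0 scale

print(f"N = {len(rows)}, r = {correlation(attendance, gpas):.3f}")

# Surface the conversation-starting outliers described above.
for r in rows:
    if float(r["attendance_rate"]) < 90 and float(r["gpa"]) >= 3.5:
        print("High GPA despite low attendance:", r["student_id"])
```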

[Figure: Class-level attendance vs. grades]

As teachers struggle to comprehend this data, it may be beneficial to zoom in on a single classroom’s attendance/grade correlation. The graph above shows an individual classroom’s grades correlated with attendance rate. With only 44 dots on the graph instead of 259, the relationship is easier to spot. There are also fewer outliers, making it easier to identify those students and provide interventions.

James-Ward et al. (2013) suggested asking participants in data analysis broad questions such as: What do we know from the data from our last school year? How does this information compare to prior years? Next, the participants can generate more specific questions to discuss in breakout sessions, for example: Why did the 10th graders have the lowest ELA scores? What changed that increased our science scores so dramatically? How can we increase our attendance rate? Discussion of these questions can be used to create more specific goals and objectives for individual subjects and departments. A data team’s goal is to find changes in an instructional program, consider what caused them, then develop an action plan to improve instruction, implement it, and monitor the results.

Simulations can provide powerful epiphanies about the need to build a school-wide culture of using data to drive instruction. If only one or two teachers on a campus show strong relationships between attendance and grades, the school may not have a meaningful picture of its actual student achievement. If most of the school’s teachers show a strong correlation between attendance and grades, then perhaps the instructional program is more accurate in predicting actual student achievement.

References

Boudett, K., City, E., & Murnane, R. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Boudett, K., & Steele, J. (2007). Data wise in action: Stories of schools using data to improve teaching and learning. Cambridge, MA: Harvard Education Press.

James-Ward, C., Fisher, D., Frey, N., & Lapp, D. (2013). Using data to focus instructional improvement. Alexandria, VA: ASCD.

Stop Collaborate Listen


Data-driven decision making (DDDM) is defined as “the process by which an individual collects, examines, and interprets empirical evidence for the purpose of making a decision” (Mandinach & Jackson, 2012, p. 27). This definition can easily be broadened to encompass the collective efforts of a group of educators at one or more school sites. Mandinach and Honey (2008) suggested using student achievement and other data, such as “attendance, course-taking patterns and grades, and demographic data,” to drive improvement at the school, district, and state levels. Unfortunately, many school leaders across the nation are unsure how to transform mountains of data on student achievement into an action plan that will improve instruction and increase student learning. A software product like Spotlight Education may help educators gain a better understanding of test scores and growth models, as well as how to set and measure goals in their everyday classroom practice.

In order to deliver an authentic simulation that demonstrates how important it is to build a data-driven culture at a school site, educational leaders first need to inspire teachers to collaborate and work together. Daniel Pink suggests changing this conversation from How to Why: when educators buy into and understand the Why, they will figure out the How. Pink further defines a new set of ABC skills school leaders can use to motivate their teachers: Attunement – understanding someone else’s perspective; Buoyancy – remaining afloat in an ocean of rejection; and Clarity – curating, distilling, and making sense of information.

Patrick Lencioni’s book The Five Dysfunctions of a Team lays out five core principles for improving teamwork in an organization: 1) teams have to begin by building trust in one another and in their leader; 2) members must be able to discuss conflict openly; 3) everyone needs to be fully committed; 4) everyone needs to accept accountability for their role; and 5) finally, there has to be an attention to getting results. Lencioni describes the power of teamwork as essential to any organization:

When it comes to helping people find fulfillment in their work, there is nothing more important than teamwork. It gives people a sense of connection and belonging, which ultimately makes them better parents, siblings, friends and neighbors. And so building better teams at work can – and usually does – have an impact that goes far beyond the walls of your office or cubicle (pp. 4-5).

Lencioni acknowledges that teams may have a figurative leader, but each member of a team contributes to the success of the organization; in essence, they distribute or share power with one another. An effective educational leader can co-opt Lencioni’s techniques and create powerful simulations that give teachers opportunities to analyze data collectively. Collaborative groups harness the synergy of multiple abilities: an expert statistician may have trouble presenting data in an understandable way, while a middle school English teacher may excel at chunking complex concepts. Lastly, a group of people struggling to understand something important to their institution will bond over the endeavor and “own” the results of their labor. That is, once a team makes sense of its data, it will generate causal theories and hypotheses, which it can test, tweak, and fine-tune to improve the school’s results.

References

Lencioni, P. (2002). The five dysfunctions of a team: A leadership fable. New York, NY: Jossey-Bass.

Mandinach, E., & Jackson, S. (2012). Transforming teaching and learning through data-driven decision making. Thousand Oaks, CA: Corwin Press.

Pink, D.H. (2011). Drive: The surprising truth about what motivates us. Penguin Books.

Spotlight Education Makes DDDM Easy

Many scholars believe that teaching and learning have not improved nationally because teachers as a group have not learned how to use data effectively to improve student learning. This is chiefly a leadership problem, because many school principals lack the necessary skills to make decisions based on data. Additional research has revealed that teachers also lack the clarity and tools to understand how their students, schools, districts, and states are performing. Thus, it is clear that extensive training needs to take place so that educators and educational leaders can develop the skills to work with data and grow comfortable using data tools that increase student achievement.

Educational leaders need to make data more accessible and comprehensible for a workforce that has most likely avoided challenging coursework in statistics and mathematics. Getting these educators to embrace data-driven strategies may be a formidable challenge. For decades, businesses and government agencies have used cultural simulations to prepare executives for foreign travel, implement diversity programs, and ensure diplomacy at home and abroad. Few educational leaders have used these cultural simulations. Most will be surprised by the visceral epiphany they deliver. If similar simulations can be developed to ensure that all teachers become data-driven educational leaders, the U.S. education system could be poised to make a great leap forward.


Gary Shirts, creator of the culture shock simulation BaFa’ BaFa’, outlines ten steps to successful simulations: 1) don’t confuse replication with simulation; 2) choose the right subject to simulate; 3) develop a design plan; 4) design the simulation so that trainees take responsibility for their actions; 5) use symbols to deal with emotionally charged ideas; 6) don’t play games with trainees; 7) use non-trainees to add realism; 8) develop an appropriate performance assessment model; 9) alpha test your simulation in low-risk circumstances; and 10) set your own standards for success.

Most teachers are not trained as quantitative statisticians, and requiring them to work within the realm of data is analogous to requiring them to learn a foreign language. Simulations can ease the anxiety of this process by providing safe places for teachers to increase their understanding of student achievement data and of how to systematically collect and analyze that data for continuous instructional improvement. As teachers gain data literacy skills and improve their understanding, the data analysis tasks can become more sophisticated. Solving a problem for someone else is often less worrisome than bringing up your own problem areas and having others comment on them. As teachers become more confident about their analysis and use of data, educational leaders will see them displaying class achievement data and clearly showing students where they have improved.

Spotlight Education provides a promising software service that simplifies educational data. Instead of extensive lists of statistics with difficult-to-read tables and charts, the service creates easy-to-read narratives in written or video form. These reports offer insightful analyses of education data and are customized for each stakeholder: teachers, students, parents, principals, and superintendents.

References

Boudett, K., & Steele, J. (2007). Data wise in action: Stories of schools using data to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Boudett, K., City, E., & Murnane, R. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Corrigan, M., Grove, D., & Vincent, P. (2011). Multi-dimensional education: A common sense approach to data-driven thinking. Thousand Oaks, CA: Corwin Press.

Mandinach, E., & Jackson, S. (2012). Transforming teaching and learning through data-driven decision making. Thousand Oaks, CA: Corwin Press.

Mandinach, E., & Honey, M. (2008). Data-driven school improvement: Linking data and learning. New York, NY: Teachers College Press.

Shirts, G. R. (1992). Ten secrets of successful simulations. Training, 29(10), 79-83.

Conducting Classroom Research

I always enjoy experimenting in my classroom. My students understand they are guinea pigs in my education laboratory, and they look forward to hearing the results. Earlier this year, we looked at the value of note-taking. Specifically, does the act of taking notes on a lecture increase student content knowledge as measured by a standardized test? I expected an increase, but I didn’t know how large it would be.

I randomly split the class into four groups. Each group listened to a 15-minute audio lecture about Early Christianity and then took the same multiple-choice test, but under different constraints. Group A was not allowed to take notes of any kind; they could only listen to the lecture. Group B was allowed to take notes but not to use them on the test. Group C was allowed to take notes and use them on the test. Group D was allowed to take notes, use them on the test, and was also given a transcript of the lecture.


With this population of students (N=184), the average score on the test was 71%, or 10.7 questions correct out of 15. Group A, not surprisingly, had the lowest scores, with an average of 4.5 correct. Group B averaged 9.8 correct. The results for Group C were surprising: the students who were allowed to use their notes scored an average of only 10.5 correct. Finally, Group D averaged 12.1 correct. All groups had the same amount of time to complete the test.
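
For teachers who want to script the same analysis, here is a minimal sketch of the group comparison. The score records below are hypothetical stand-ins; only the four-condition design comes from the experiment above:

```python
# Minimal sketch of the note-taking group comparison. Each record pairs a
# group label with a student's correct answers out of 15; these particular
# records are hypothetical stand-ins for the real data.
from collections import defaultdict
from statistics import mean

records = [
    ("A", 4), ("A", 5), ("B", 10), ("B", 9),
    ("C", 11), ("C", 10), ("D", 12), ("D", 13),
]

by_group = defaultdict(list)
for group, correct in records:
    by_group[group].append(correct)

for group in sorted(by_group):
    print(f"Group {group}: mean = {mean(by_group[group]):.1f} / 15")
```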

This experiment allowed me to show my students that taking notes during an audio lecture resulted in roughly five additional correct answers on the test. Interestingly, there was not a large difference in scores between the students who were not allowed to use their notes on the test and the students who were. It appears that the act of taking notes is enough to help the brain remember content, and that referring to those notes may not confer as significant an advantage as previously assumed.

This semester, I am conducting an experiment that examines the effects of increasing the amount of student-led classroom discussion. Students receive content instruction via online lectures that they view for homework. This frees up class time for Socratic circles. The goal of these discussions is for students to arrive at a greater understanding of the material, not to assess who has viewed or understood the lectures.


Unfortunately, my 9th grade classes are very large. An average of 39.6 students across my five classes results in a sample of 198 (N=198). I arrange the class into two large circles, an inner circle and an outer circle. Each circle is given eight concepts to copy off the board; these are the primary points in the lectures. Students are allowed to use the notes they have taken at home and are encouraged to add information during the discussion.

A student is chosen to “host” the discussion, posing open-ended questions to the inner circle for 15 minutes and then to the outer circle for 15 minutes. At the conclusion of both sessions, students free-write for ten minutes about what they learned from the discussion. During the conversations, I chart the flow and note how many contributions each student makes, coding positive comments with a plus (+) and negative or low-value comments with a minus (-). After three days of observations, I categorized students into two groups: “High Talkers” and “Low Talkers.” The research questions for this experiment are: (a) Do class discussions increase student content knowledge? (b) What are the effects of class discussions on students who actively participate (high talkers)? (c) What are the effects of class discussions on students who minimally participate (low talkers)?
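
Once the observation chart is transcribed, the tallying can be scripted. In this minimal sketch, each coded contribution is logged as a (student, code) pair; the student names and the two-turn threshold are hypothetical:

```python
# Minimal sketch of the +/- contribution tally. The log entries and the
# High/Low Talker threshold are hypothetical examples.
from collections import Counter

log = [("Ana", "+"), ("Ben", "-"), ("Ana", "+"), ("Cruz", "+"), ("Ana", "+")]

turns = Counter(student for student, _ in log)
positives = Counter(student for student, code in log if code == "+")

THRESHOLD = 2  # turns per discussion; tune against your own observations
for student, n in turns.most_common():
    label = "High Talker" if n >= THRESHOLD else "Low Talker"
    print(f"{student}: {n} turns ({positives[student]} positive) -> {label}")
```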

I have a series of multiple-choice, short answer, and essay tests that I will be delivering to my students. I look forward to sharing the results.

Entering the Online School Marketplace

Recently, I sat across the table from a man who runs a $160M-per-year Charter Management Organization. He asked me what I would do if I had a blank check to start a new virtual school. The challenge was electrifying. New research shows that at-risk students benefit the most from ed tech, and as a traditional classroom teacher and administrator serving the at-risk population, I have long been fascinated with blended and online learning. As a 1:1 teacher, I experienced firsthand how blended learning cut my course failure rate by 50%. This could be my chance to put my money where my mouth is and make a commitment to online teaching.


Online or virtual schooling is expanding rapidly in US K-12 education. In 2012-13, thirty states had multi-district, fully online schools enrolling about 310,000 students, and twenty-six states had state virtual schools with over 740,000 course enrollments. Online course enrollments have doubled in four years. According to the California Learning Resource Network’s eLearning Census of 1,810 districts and charters, 53% reported having students participate in virtual or blended learning, and 21% stated they were planning to implement online or blended learning (Watson et al., 2014).

The CLRN census counted 174,632 virtual and blended students in 2013-2014, a 39% one-year increase. The virtual student population has not grown significantly since 2012, but the number of blended students has skyrocketed, increasing 49% since 2013 and 74% since 2012. The adoption of blended and online learning is expanding in both traditional public and charter schools, and the number of students participating in eLearning at each type of school is rising steadily. The 2014 census found that 60% of charter schools embraced virtual and/or blended learning, compared with 48% of traditional districts. Traditional public school districts account for the majority (67%) of California’s blended learning population, while charter schools make up 82% of the virtual population. Most of this year’s 49% growth in blended learning happened in charters: since 2012, blended learning has grown 43% in traditional districts, while charters have experienced a 287% increase. An encouraging finding is that 58% of districts and charters feel their virtual and blended programs have resulted in greater student engagement and increased course completion rates.

So, if I had a blank check and a boss willing to enter a competitive marketplace, the first thing I would do is retain United Talent Agency’s Brand Studio to develop a brand and strategy. Larry Vincent is an old friend from Disney who has written two great books on brand management. He is a rock star in this area.

Next, I would engage John Watson and the Evergreen Education Group to identify curricular products and delivery systems that would enable a new online school to compete with Connections Learning, K12 Inc., and FLVS Global. Part of being a good educational leader is recognizing where to get help when you aren’t an expert; I am not an army of one. John’s group produces the Keeping Pace in Digital Education series and is the premier research group reporting on blended and online learning. I would give their advice serious reflection before starting this journey.

[Figure: Online enrollments by subject]

Lastly, Florida Virtual School (FLVS) is considered the national gold standard among online schools. It started in 1997 with seven staff members and 77 students, grew to 477 students the next year, and reached 2,500 students by its third year. Currently, FLVS serves 411K part-time students and over 50K full-time students, in a K-12 online education market of 740,000 online course enrollments (Chingos & Schwerdt, 2014). FLVS would be the first place I would start recruiting employees to carve out a niche in this market.

If you had a blank check how would you build and staff your dream online school? What blended or online learning models would you incorporate into your program? How could you do a better job than the dominant players in the market?

Panel Examines MOOCs as Teacher PD

LEVERAGING AND SCALING MOOCS AS ASSETS IN BLENDED/ONLINE
TEACHER PROFESSIONAL DEVELOPMENT PROGRAMS


Accepted for the Research on Professional Development for Online/Blended Teaching panel at the 2015 Society for Information Technology and Teacher Education (SITE) conference in Las Vegas, NV, March 2-6, 2015.

Abstract

Killeen, Monk, and Plecki (2002) reported that school districts spend the equivalent of $200 per pupil on professional development (PD). Unfortunately, teachers often view professional development as ineffectual. Worse, most PD does not provide ongoing support for implementing new strategies or tools (Barnett, 2002). MOOCs offer a scalable way to train staff anytime, anywhere, and in very large groups. This cost-effective approach produces robust data sets that illustrate which learning activities are effective. This data can be analyzed to fine-tune the myriad trainings essential for rolling out costly 1:1 implementations and blended learning initiatives.

Dede et al. (2005) reviewed 400 articles about online, face-to-face, and hybrid teacher PD programs and found that 40 represented high-quality empirical research. They developed five areas for examining best practices: (a) Design, (b) Effectiveness, (c) Technology, (d) Communication, and (e) Methods. These focus areas may provide a framework for evaluating MOOCs as blended/online teacher professional development assets.

This panel discussion will present data and lessons learned from two Teacher Professional Development MOOCs (Improving Teacher-Student Relationships and Helping History Teachers Become Writing Teachers) conducted on the Canvas Network. The purpose will be to develop a subsequent study, modeled on Dede’s framework, which will measure the satisfaction and efficacy of teachers participating in MOOCs as professional development.

RT to Help At-Risk Students

Linda Darling-Hammond, Molly B. Zielezinski, and Shelley Goldman just published Using Technology to Support At-Risk Students’ Learning. This thoughtful review of more than seventy recent studies found that educators using technology to improve student learning have had mixed results. The promise of ed tech has often failed to meet the high expectations policymakers have heaped on the sector; however, there have been many successes, and the authors sought to reveal promising approaches to technology implementation.


In one study, several 9th-grade English classrooms were given technology to practice skills and create new content. Those classrooms outperformed advanced placement sections that studied the same material without technology. The teacher reported that technology allowed for more active instruction that could be differentiated to meet the needs of individuals and that students wanted to be a part of that personalized and active environment.

The following bullets may be retweeted without crediting me. Feel free to hashtag them #SCOPE or #AtRisk. They came from a Twitter barrage that summarized and highlighted the sections of the report that resonated with me as an ed tech enthusiast who has worked with large numbers of at-risk students over the last 11 years. These students deserve access to effective education technology. Getting these tips out to policymakers and education thought leaders is one way to make that happen.

  1. 16M students live below the poverty line. 8M get free lunch. Children in poverty are 50% of US students.
  2. Many schools serve 100s, or 1,000s of users with the same bandwidth as a single home user.
  3. 30% of households don’t have high-speed broadband. Slow connection rates are concentrated in nonwhite and low-income households.
  4. Research shows that if at-risk students gain access to technology, they can make substantial gains in learning
  5. Drill and practice activities in low SES schools tend to be ineffective.
  6. Uses of tech are disproportionate in high-SES schools where they achieve positive results.
  7. A benefit of well-designed interactive programs is they allow students to see concepts from multiple perspectives.
  8. Students learn more when they use technology to create new content themselves rather than receiving content designed by others.
  9. Researchers found that 1:1 availability is important for lower-income students’ ability to gain fluency.
  10. Teacher assistance seems to be mandatory for the online learning of underprivileged students.
  11. Reports on the flipped classroom are generally positive. Students prefer interactive classroom activities
  12. College students in flipped classrooms are more likely to watch video lectures than to complete readings
  13. Tech policy should aim for 1:1. At-risk students benefit from opportunities to learn that include 1:1 access to devices.
  14. Technology access policies should ensure that speedy internet connections are available.
  15. At-risk students benefit from technology that promotes high levels of interactivity with data & info in multiple forms
  16. Coupled with PBL & support for teachers, digital learning can shift school culture & strengthen 21st century skills
  17. Plan for blended learning environments to have high levels of teacher support & interaction between students
  18. Instructional plans should enable at-risk students to use technology to create content as well as learn material.

This report complements the findings of Pollock et al. (2014), who characterized the types of support teachers should provide in hybrid classrooms. The SCOPE report reiterates that replacing teachers with technology will not be a successful formula and that teacher assistance “seems to be mandatory for the online learning of underprivileged students.” The report was released by the Alliance for Excellent Education and the Stanford Center for Opportunity Policy in Education (SCOPE). A copy of the full report is available here: https://edpolicy.stanford.edu/sites/default/files/scope-pub-using-technology-report.pdf

20 Qs for BL Networks

Joe Ableidinger recently authored an interesting thought paper that provides snapshots of blended learning networks. Building blended learning networks may help implementation in traditional school models: educators could pilot test instructional models and collaborate on solutions to scaling-up problems. Network members may constructively critique each other’s ideas and foster connections that will help programs grow.

The paper organizes its 20 key questions into five categories: (a) Desired outcomes: What will the network ideally do or create? (b) Recruitment, screening, and selection: Who should be in the network? (c) Training and support: What will the network provide to its members? (d) External partners: Which outside experts should be involved, and in what ways? (e) The pioneering cohort: How should the network get started?

[Figure: Blended learning models]

Creators of blended learning networks will need to answer the questions below; the answers will shape the character of the network and its ultimate effects over time. Research has not given us the correct answers. The “correct” answers to these questions will depend on the willingness of education leaders to meaningfully implement blended learning.

  1. What are you attempting to introduce that does not already exist, and what impact do you hope to achieve?
  2. What are the metrics by which you will judge your success in creating a strong blended learning network?
  3. Of what value to the city’s schools, teachers, and students is having a vibrant education technology ecosystem?
  4. Do you want to develop or network creators or users of education technology (or both)? What are the metrics by which your success in creating a strong education technology ecosystem will be judged?
  5. Will your network aim to connect creators with users?
  6. Will you bring together innovators or innovative ideas?
  7. Do potential participants self-select into the applicant pool, or is the pool pre-selected by network organizers or created through a nomination process?
  8. What criteria should you use to vet prospective network participants? What questions should you ask as part of the application or selection process?
  9. If you are planning to create a network of innovators, should you focus on individuals or teams?
  10. To create a network of proponents of innovative ideas, what stage of development should you target?
  11. Should the network focus on blended learning at the whole-school level or at the classroom level?
  12. Should the network be limited to particular grades and/or subjects?
  13. Should the network be limited to certain geographies?
  14. Should the network be limited to educators, or open to innovators?
  15. What types of training and support should the network provide?
  16. Should network activities be loose or prescriptive?
  17. What are the best roles for external experts to play in supporting the network?
  18. What structure do you want to create for mentoring relationships in your blended learning network?
  19. Should you gather the best available candidates to pilot the network, or work with a preselected group?
  20. How much should the first cohort be about “getting it right,” versus serving as a test case for future iterations?

Unfortunately, most brick-and-mortar schools have not leveraged blended learning techniques that may turn students’ online time into increased instructional time. Ableidinger’s thoughtful work may give those tasked with bringing blended learning to the masses a framework to consider before setting up field tests.


Hello Fellow MOOCers

Hello Fellow MOOCers,

Welcome to Improving Teacher-Student Relationships. We are excited that over 800 of you have decided to join us for this course. We have spent the summer curating resources for this MOOC. It has been a wonderful intellectual journey for us. We hope it becomes a rewarding endeavor for you and helps you improve the relationships in your classroom.

While Mr. Thomas is an experienced online educator, I have been more of a traditional classroom teacher throughout my career. Last year, I became a 1:1 iPad educator, which not only helped me lower my class failure rate by 50%, but also helped me forge deeper relationships with my students.

I have used student surveys for years; we will talk more about them in Week Three. Classroom surveys have definitely helped me learn what is important to my students. Last year, working in a 1:1 environment also helped me personalize education and connect with more students. During the first five weeks of the semester, I had a student whom I will call Valerie (not her real name). I knew she was very smart, but she wasn’t engaged in class, wasn’t completing her classwork, and wasn’t doing any of the homework. She was painfully shy and would not interact with me or the other students.


After a survey revealed that approximately 90 percent of my students had internet access at home, I started offering extra-credit assignments online. To my surprise, Valerie became my early adopter. She started posting to the class discussion boards even though she would never contribute to discussions in class. She began participating in virtual field trips and engaged in every online assignment that I offered. Within five weeks, her grade had gone from an F to a C and it was rising each week.

I began talking to her about comments she had made online. It was evident that she understood the World History readings and could connect the material to modern-day problems. One day, I lobbed a softball her way by asking her a question she had already answered on the discussion board. One particularly domineering student’s head swiveled in astonishment as Valerie completed her answer. “You know how to talk and you know what you are talking about,” he gasped. I saw a little flash of a smile slide across her face, and she never looked back. Although Valerie never blossomed into what I would call an extrovert, she began actively participating in most class activities and finished the year with the highest grade in the class.

As I reflected on the year, I realized that if it weren’t for the 1:1 environment, I probably would have written Valerie off and she most likely would have failed my course. I was only able to reach her via online methods. After we built a virtual relationship, she felt comfortable enough to establish a real relationship. This school year, I was pleased to see Valerie hanging out on the quad with another student from our class. She has turned her high school life around.


Teaching can be such a rewarding profession, but it is not easy. Sometimes students make it nearly impossible to build positive relationships with them. I am frequently haunted by thoughts of the students I have been unable to reach. We all know what life is like for high school dropouts. Educators save lives one at a time. I hope this course helps you find and connect with the Valeries in your classes. I look forward to spending the next six weeks with you.