Competency Based Education

The purpose of this post is to clarify some opinions I expressed during a Twitter conversation with @PhilOnEdTech, @harmonygritz, and @WGU about Competency Based Education. I have followed Phil for almost a year now and am impressed with the depth of his knowledge on Ed Tech issues. In a blogosphere of evangelists, he can be a voice of reason and caution, and for that I greatly admire his work. I am a classroom teacher, and have been for the last 12 years. Prior to working in the classroom, I worked in an entertainment-industry technology R&D lab. I have great enthusiasm and zeal for incorporating technology into my classroom practices. My experience in running a 1:1 classroom was that my course failure rate dropped by fifty percent. I do not believe that other teachers would necessarily have the same experience; in fact, I might not even have the same experience with a different sample of students. Thus, my main point in responding to Phil is that education policymakers should proceed cautiously before rolling out CBE on a large scale. Of course, this will not happen, because most education policymakers are politicians who want to be seen as innovators, working hard to solve problems in our nation’s schools.


Students who develop strong, positive relationships with their teachers are more likely to engage in rigorous academic work. Further, students who experience their high school curriculum in a coherent, aligned, and interdisciplinary manner (i.e., work-based, or competency-based education models) have shown increased engagement and higher graduation rates. Bob Bain has articulated the dangers a fragmented curriculum poses to student engagement. It was with these factors in mind that I joined a group of educators in starting up an entertainment industry-themed pilot school (LAUSD’s in-house answer to charters) a little more than five years ago.

What we experienced was that many of our students did not “choose” our program for its work-based program merits. Students came into the program with varying levels of enthusiasm for entertainment industry careers. When I interviewed many of my chronically absent and failing students, I found they wanted careers in social work, psychology, and law enforcement. Many of them had been rejected by the larger career academies in the district and had come to us as their 2nd, 3rd, or 4th choice. Further, a rigorous, career-based, competency model can be even more difficult for low-income students to complete. Some researchers have theorized that urban, low-income students are exposed to greater levels of violence in their neighborhoods and grow up with symptoms of PTSD. I concur with this line of reasoning. Dealing with teenagers is always a balancing act. On the first day of state testing, the body of one of our female students was discovered, naked, shoved into a box, and dumped on the side of the freeway. She was 15 years old. The police held a news conference at 7 am in front of our school. Do you think our students found those tests relevant to their lives?


My point is that we should follow Phil’s advice when it comes to Ed Tech: there are no silver bullets. I think Mark Twain said something about lies, damned lies, and education statistics. Too often, I see schools and districts making million-dollar decisions based on hope rather than validated outcomes. I’d like to see CBE tested rigorously, and in a variety of school settings, before it is anointed the Holy Grail, the be-all, end-all replacement for the Carnegie unit. I am in favor of experimentation in K12 education; as a country, we do far too little of it. But I am concerned that ed pols rush headfirst, full of optimism and good intentions, into areas where there is a dearth of empirical evidence, then are surprised when things don’t work out. Many K12 students lack the work experience and subject matter knowledge to benefit from a competency-based ed program. Let’s take a breath and see how deep the water really is before we all jump in.


CAASPP HSS Meeting

I was invited to participate in a stakeholder meeting on the California Assessment of Student Performance and Progress (CAASPP). The purpose of the meeting is to gather input from stakeholders regarding a new History-Social Science Content Assessment. The new assessment will be aligned with the Common Core State Standards. The meeting is hosted by the California Department of Education (CDE) and Educational Testing Service (ETS).

California should move toward an accountability system with local designs, so that schools prepare students for regional colleges and work sectors. Schools and districts can use state funds to create new assessments for college and career readiness. Testing vendors need to be more inclusive of local needs, or the opt-out movement in standardized testing will grow. California’s Local Control Accountability Plan (LCAP) requires districts to develop, adopt, and annually update, with parent and community involvement, a three-year accountability plan that identifies goals and measures progress across multiple indicators of both opportunities and outcomes. Local districts can add their own indicators to those the state requires.

Research shows that engaging teachers in jointly scoring student work produces greater success, builds professional norms, and increases content knowledge. Further, engaging students in peer review requires them to compare their work to the standards and culminates in powerful learning experiences. The next generation assessments should incorporate both of these factors, but additional improvements in testing may be driven through the use of artificial intelligence, computer adaptive testing, and what are colloquially known as “robo-graders” to provide instant feedback to students during the testing process.

Now that California assessments are moving beyond paper and pencil, the technology behind testing systems needs to be updated also. This year, my free fantasy football league added a new feature. Each week, it uses artificial intelligence to run if/then calculations on your individual team: if you had played Tom Brady instead of Peyton Manning, you would have come in first place instead of fifth. The computer reports on four or five of your key players, and you kick yourself for the rest of the week. Honestly, I was blown away by how light, conversational, and even funny the writing in this narrative was, because I knew it was all computer-generated. It was very user-friendly, like the products that Spotlight Education develops.
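The weekly if/then report boils down to a simple counterfactual calculation. Here is a hypothetical sketch of the idea; the player names, scores, and the `counterfactual_gain` helper are all illustrative, not the league’s actual system:

```python
# Hypothetical sketch of a fantasy "what if" lineup analysis.
# All names and point totals are made up for illustration.

def counterfactual_gain(started, benched):
    """Return (bench_player, starter, points_gained) for each swap
    that would have improved the weekly total, best swap first."""
    swaps = []
    for pos, (starter, s_pts) in started.items():
        bench_player, b_pts = benched.get(pos, (None, 0))
        if bench_player and b_pts > s_pts:
            swaps.append((bench_player, starter, b_pts - s_pts))
    return sorted(swaps, key=lambda s: s[2], reverse=True)

week = counterfactual_gain(
    started={"QB": ("Peyton Manning", 12.4), "RB": ("Player A", 18.0)},
    benched={"QB": ("Tom Brady", 31.2)},
)
for bench, starter, gain in week:
    print(f"Starting {bench} over {starter} would have added {gain:.1f} points.")
```

A natural-language generation layer (the part that made the report funny and conversational) would then turn each tuple into a sentence.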

Regardless of what the assessment experience is like, it’s going to produce some form of data, and that data should be understandable and actionable by families and educators. After pondering this, I started wondering why our educational leaders aren’t using this technology to make testing more interactive and interesting. Aren’t they talking to anyone in Silicon Valley? Don’t they realize that gamification offers powerful assessment technologies and engages students in deeper learning activities? If these new tests are going to be worth the effort, shouldn’t we make them cutting-edge and restore California’s reputation as an innovator?

The next-generation assessments in History/Social Studies should:

  1. Give students some choice about what they feel they know the most about (e.g., the Gold Rush, WWI, the Vietnam War, etc.).
  2. Provide a library of documents and a menu of different performance tasks like the CWRA or New Hampshire’s PACE system.
  3. Measure how well students respond to feedback from a computer adaptive engine.

For example, students construct a response arguing whether the eugenics movement was positive or negative for mankind. The computer “reads” their response and gives them a revision list. Then, the students improve their writing and resubmit. A teacher, or subject area expert may still be needed to assess the level of content knowledge in the students answer, but we are nearing the time when computers can assess a student’s historical reasoning. This type of test would not only measure what the student knows, but what the student can do with what they know. This is similar to what happens in the workplace, when a piece of writing has to incorporate the diverse opinions of a team or committee. Computer adaptive tests
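The submit-revise loop is easy to prototype. The sketch below is a toy stand-in, not a real essay-scoring engine: the `REQUIRED_ELEMENTS` keyword lists and the `feedback` function are invented for illustration, whereas a production system would use a trained scoring model:

```python
# Toy sketch of a submit -> feedback -> revise loop for an argumentative
# essay prompt. Keyword checks stand in for a real essay-scoring model.

REQUIRED_ELEMENTS = {
    "claim": ["positive", "negative"],               # takes a position
    "evidence": ["sterilization", "law", "court"],   # cites specifics
    "counterargument": ["however", "although", "critics"],
}

def feedback(response):
    """Return a revision list: one suggestion per missing element."""
    text = response.lower()
    notes = []
    for element, markers in REQUIRED_ELEMENTS.items():
        if not any(m in text for m in markers):
            notes.append(f"Strengthen your {element}.")
    return notes

draft = "The eugenics movement was negative."
print(feedback(draft))   # flags missing evidence and counterargument
revised = draft + " Although some courts upheld sterilization laws, critics disagreed."
print(feedback(revised))  # empty list: ready for human review
```

The empty revision list is the hand-off point where a teacher or subject-area expert would still assess the depth of the content knowledge.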

Critics of this approach will demand evidence that students have learned all the standards. Surely, measurement experts can design an approach that samples different populations in schools and assigns them different standards; otherwise, we will get new tests no different from the 60-80 question fill-in-the-bubble tests that provide 2-4 questions per standard and cause testing fatigue in our students. Allowing students to select a portion of the content will create more engagement in the testing process. If technology can provide high-quality analysis and advice to fantasy football players for free, why can’t ETS provide similarly rich and rewarding experiences when testing California students? We need tests that teach students to reflect on their choices and make better decisions. Further, these tests need to be engaging (and dare I say fun?) enough that students look forward to the challenge instead of dreading it. We have the technology. We have the know-how. We just need the consensus. Please get involved with your local schools and provide input on their LCAP proposals. Demand more than a fill-in-the-bubble testing experience for your child.
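Assigning different standards to different students is a well-known assessment design, often called matrix sampling: no individual sits the full item bank, yet every standard is covered at the school level. A minimal sketch, with invented standard IDs:

```python
# Matrix sampling sketch: rotate subsets of the standards across students
# so each student answers a few items but the school covers them all.
# Standard IDs are illustrative, not actual California HSS standards.
import random

def matrix_sample(students, standards, per_student=3, seed=42):
    """Assign each student a rotating subset of the standards."""
    rng = random.Random(seed)
    order = standards[:]
    rng.shuffle(order)           # randomize which standards cluster together
    forms = {}
    for i, student in enumerate(students):
        start = (i * per_student) % len(order)
        # wrap around so every standard keeps reappearing as forms rotate
        forms[student] = [order[(start + j) % len(order)]
                          for j in range(per_student)]
    return forms

students = [f"S{n}" for n in range(20)]
standards = [f"HSS-10.{n}" for n in range(1, 12)]   # 11 standards
forms = matrix_sample(students, standards)
covered = {s for form in forms.values() for s in form}
print(f"Each student sees {len(forms['S0'])} standards; "
      f"{len(covered)} of {len(standards)} covered school-wide.")
```

A real design would also balance forms for difficulty and demographics; the point is only that school-level coverage does not require student-level exhaustiveness.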

Sources

Darling-Hammond, L., Wilhoit, G., & Pittenger, L. (2013). Accountability for College and Career Readiness: Developing a New Paradigm. Stanford Center for Opportunity Policy in Education. Palo Alto, CA. Accessed at http://dx.doi.org/10.14507/epaa.v22n86.2014 on March 15, 2015.

Leather, P. K., & Barry, V. M. (2014). New Hampshire Performance Assessment Of Competency Education: An Accountability Pilot Proposal To The United States Department Of Education. November 21, 2014. Concord, NH. Accessed at http://www.education.nh.gov/assessment-systems/pace.htm on March 12, 2015.

Smarter Balanced (2014) Smarter Balanced Pilot Automated Scoring Research Studies. http://www.smarterbalanced.org/wordpress/wp-content/uploads/2014/09/Pilot-Test-Automated-Scoring-Research-Studies.pdf

Virtual Schools Perform Poorly

For some time, I have been wrestling with the problems occurring in virtual schools throughout our country. As a tech enthusiast, I believe educational technology has the potential to transform public education. However, the wrong students are being recruited into virtual schools, and because almost all virtual schools are charter schools run by for-profit Educational Management Organizations (EMOs), they are trying to maximize their dollars instead of improving their educational delivery model. Researcher Michael Barbour thinks competition and market forces in the education system have created a separate-but-equal education system. While Adam Smith championed free markets in the private sector, we have seen that public school closures devastate and devalue communities (hello, Chicago!). Thus, this post is a collection of tweets that will be sent to the California legislators who oversee K12, Inc.’s California Virtual Academies (CAVA) virtual schools, in the hope that they remember their responsibility is to California school children, not out-of-state corporations. Feel free to blog, reblog, tweet, retweet, and rock on in the search for truth, justice, and the American way.


California Virtual Academies (#CAVA) is the largest provider of virtual public education in CA. CAVA uses eleven locations to employ 766 teachers who work from home and educate students online. Darling-Hammond et al. (2014) found that the promise of ed tech has failed to meet the high expectations policymakers have heaped on the sector; however, there have been many successes that reveal promising approaches to technology implementation. Pollock et al. (2014) maintain that high-quality teacher assistance “seems to be mandatory for the online learning of underprivileged students.”

In 2011-12, the most recent year for which data are available, CAVA paid teachers an enrollment-weighted, system-wide average of $36,000 a year, while teachers at CAVA’s authorizing districts made an average of $60,000 a year. Because CAVA pays only a fraction of what corresponding districts pay, teachers report high rates of turnover. In 2012-13, CAVA received $95M in public funding, of which $47M went to K12 headquarters in Virginia. Over the last four years, CAVA’s overall graduation rate was 36%, compared to 78% for the state of CA. In 2012-13, 57% of schools with similar student populations performed better than CAVA, and 71% of all schools in the state performed better than CAVA. CAVA’s statewide rank was 2.9 out of 10. Some teachers spend 65 hours per week just completing administrative tasks. In 2012, K12 spent $1 million on Nickelodeon and Cartoon Network advertisements, and $600,000 on teen social media sites. That year the company’s ad spending topped $20 million.

Using the California Department of Education’s definition of “continuous enrollment,” CAVA was found to have a 2012-13 student turnover rate of 24%, compared to 7% in California. CAVA’s model of virtual education negatively impacts CA kids. Virtual Schools should not look like this.

CAVA’s head of school has issued this response: Response to In The Public Interest Report from California Virtual Academies, by Katrina Abston.

Sources

Darling-Hammond, L., Zielezinski, M. B., & Goldman, S. (2014). Using Technology to Support At-Risk Students’ Learning. Stanford Center for Opportunity Policy in Education. Palo Alto, CA. Accessed at https://edpolicy.stanford.edu/sites/default/files/scope-pub-using-technology-report.pdf

Pollock, M., et al. (2014). Innovating toward equity with online courses: Testing the optimal blend of in-person human supports with low-income youth and teachers in California. The Center for Research on Educational Equity. University of California San Diego. La Jolla, CA. Accessed at http://create.ucsd.edu/research/CREATE%20Equity%20RR_1Mar2014.pdf

http://www.inthepublicinterest.org/article/virtual-public-education-california-study-student-performance-management-practices-and-overs

http://nepc.colorado.edu/publication/virtual-schools-annual-2015

www.labornotes.org/2015/01/virtual-teaching-real-organizing

If you feel inclined to contact your California Legislators about this issue, I am providing their Twitter handles below:

Senate Education Committee

@SenatorCarolLiu

@bobhuff99

@MartyBlock39

@SenatorLeyva

@MrTonyMendoza

@DrPanMD

@SenAndyVidak

Assembly Committee on Education

Legislative Office Building, 1020 N Street, Room 159 Sacramento, California 95814 Phone number (916) 319-2087.

Committee members Twitter handles are:

@ODonnellUpdate

@AsmRocky

@YKAssembly

@KMcCartyAD7

@TonyThurmond

@AsmShirleyWeber

SITE 2015 – Research Panel on Professional Development and Teacher Preparation for K-12 Online and Blended Settings

Thanks for summarizing my presentation, Michael.

Virtual School Meanderings

The twenty-second session, and the final one for day three of blogging, at the 2015 annual conference of the Society for Information Technology and Teacher Education related to K-12 online learning that I am blogging is:

Research Panel on Professional Development and Teacher Preparation for K-12 Online and Blended Settings

  1. Scott Petri, Los Angeles Unified School District, United States
  2. Keryn Pratt, University of Otago, New Zealand
  3. Susan Poyo, Franciscan University of Steubenville, United States
  4. Kathy McVey, Franciscan University of Steubenville, United States
  5. Mary Lucille Smith, Franciscan University of Steubenville, United States

Thursday, March 5, 4:15-5:15 PM in Amazon H

<Presentation: Paper #44226>

This panel will bring together leading experts to explore the research related to teaching roles in K-12 online and blended classrooms. Scott Petri will discuss how MOOCs can be used as a mechanism for providing professional development…

View original post 602 more words

SITE Presentation Materials

I have been studying teacher innovation for the last five years. My research examines the confluence of teacher entrepreneurial orientation, blended learning, and online teacher professional development. What I have found is that these areas are converging in the so-called “MOOC-osphere.” This means there are great opportunities for leveraging and scaling MOOCs as assets in teacher professional development programs. We know from research (Barnett, 2002; Borko, 2004; Darling-Hammond et al., 2009; Killeen, Monk, & Plecki, 2002) that teachers often view professional development as ineffective. Most PDs do not provide ongoing support for implementing new strategies or tools. MOOCs offer a scalable way to train staff anytime, anywhere, and in very large groups. This approach produces robust data sets that illustrate which learning activities are effective and which are not. This data can be analyzed to fine-tune the variety of trainings essential for rolling out comprehensive curricula implementations, blended learning initiatives, and 1:1 programs.

The entrepreneurial orientation (EO) construct has been studied for 40 years, and these studies have been published in 256 scholarly journals. Although primarily used in management research, the construct has been successfully adapted and validated as a scale for measuring teachers and administrators along the domains of innovativeness, proactiveness, and risk-taking. This work provides precise definitions for each domain as well as a baseline for comparing teachers who seek out PD opportunities online to those who do not.

Dede et al. (2005) reviewed 400 articles about online, face-to-face, and hybrid teacher PD programs and found 40 that represented high-quality empirical research. They developed five areas for examining best practices: (a) Design, (b) Effectiveness, (c) Technology, (d) Communication, and (e) Methods. These focus areas may provide a framework for evaluating MOOC PD assets.
As a final takeaway, I would like to clarify that I am NOT suggesting we do away with all other forms of PD. However, districts should supplement their professional development programs with MOOCs and use that data to drive their follow-up offerings. While for-profit corporations proliferate, marketing online education programs with dubious success rates, perhaps the smart play is to market MOOCs to people who want to be life-long learners, improve their technical skills, and increase their pedagogical moves. These people are already in your buildings. They are your teachers.

References

Dede, C., Breit, L., Jass-Ketelhut, D., McCloskey, E., and Whitehouse, P. (2005). An overview of current findings from empirical research on online teacher professional development. Harvard Graduate School of Education. Cambridge, MA. November, 2005. Accessed at http://www.gse.harvard.edu/~uk/otpd/final_research_overview.pdf

Petri, S. M. (2013). Where are the risk takers? Using the entrepreneurial orientation construct to identify innovative and proactive teachers (Doctoral dissertation, California State University, Northridge). http://scholarworks.csun.edu/handle/10211.2/4464

Online Teacher PD

Title II funding has resulted in the allocation of more than three billion dollars to professional development (Darling-Hammond et al, 2009). More than 40 states have adopted standards calling for effective professional development. Yet, as a nation, we have failed to leverage these investments to ensure that every educator and every student benefits from highly effective professional learning.


Blank & de las Alas (2009) reported that standards-based educational improvement requires teachers to have deep knowledge of their subject and of the pedagogy most effective for teaching it. School districts spend the equivalent of $200 per pupil on professional development (Killeen, Monk, & Plecki, 2002). Unfortunately, teachers often view professional development as ineffectual or a waste of their time. Many programs offer “fragmented, intellectually superficial” seminars (Borko, 2004, p. 3). Worse, these PDs do not provide ongoing support for implementing new strategies or tools (Barnett, 2002). This makes it difficult for teachers to implement new practices in environments resistant to change.

Dede et al (2005) reviewed 400 articles about online, face-to-face, and hybrid teacher PD programs and found 40 that represented high-quality empirical research. They developed five areas for examining best practices (a) Design, (b) Effectiveness, (c) Technology, (d) Communication, and (e) Methods. These focus areas may provide a framework for evaluating MOOCs as Blended/Online Teacher Professional Development assets.


If you are in Las Vegas, Nevada, please join me at the 2015 SITE Conference at the Rio Hotel. I will be discussing the benefits of leveraging and scaling MOOCs as teacher professional development assets at a research panel on Professional Development and Teacher Preparation for K-12 Online and Blended Settings on Thursday, March 5th, from 4:15-5:15 pm, in room #11.

Also joining me will be:

• Keryn Pratt, University of Otago, New Zealand
• Susan Poyo, Franciscan University of Steubenville, United States
• Kathy McVey, Franciscan University of Steubenville, United States
• Mary Lucille Smith, Franciscan University of Steubenville, United States
• Margie Johnson, University of Phoenix, United States
See you in Vegas, baby…

Increasing Class Discussions

A large body of research (Chapin et al., 2009; Daniels, 2002; Duff, 2002; Flynn, 2009; Mason, 1996; Spiegel, 2005) has focused on classroom discussions. I began experimenting with this technique in January, spending two weeks mapping and analyzing data from student-led discussions in five 9th and 10th grade World History classes (N=197). A sample of the students was divided into two categories, “frequent participants” and “non-frequent participants.” Over the course of a semester, they would be given a series of assessments, and the data would be analyzed to see whether increasing classroom discussions has an effect on student achievement.
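The planned comparison between the two groups could start with something as simple as a Welch t statistic on assessment scores. The scores below are hypothetical placeholders, not the actual class data:

```python
# Sketch of comparing "frequent" vs "non-frequent" discussion participants
# on an assessment, using Welch's t statistic (unequal variances assumed).
# Scores are invented placeholders, not real student data.
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    return (mean(a) - mean(b)) / (va + vb) ** 0.5

frequent = [78, 85, 90, 72, 88, 81]     # hypothetical assessment scores
infrequent = [70, 65, 74, 68, 77, 66]
t = welch_t(frequent, infrequent)
print(f"Welch t = {t:.2f}")
```

A large |t| (compared against the t distribution with Welch-Satterthwaite degrees of freedom) would suggest the groups differ, though with self-selected groups any difference is correlational, not causal.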


After two weeks, I asked students to conduct a self-assessment and grade their participation in the classroom discussions. I provided direct instruction and a handout (based on Teaching the Core) that identified seven ways to participate in classroom discussions. Students were asked to reflect on their participation and give themselves a grade between 1 and 10 on the quality of their work. The following 48 comments offer insight into what motivates students to participate, or not, in class discussions.

  1. I did not participate in class discussion because I had not watched the lecture yesterday therefore I could not talk about it.
  2. I didn’t participate because I raised my hand, but I never got picked.
  3. During the class discussion, I was listening to what they said and I was taking notes. I was going to say something but I guess someone beat me by saying what I wanted to say first.
  4. I wasn’t able to participate in class because I went to the restroom and by the time I came back they were talking to the inner circle. But I had a lot to talk about. Sadly, I didn’t get a chance.
  5. I was not able to participate because I was having trouble understanding the discussion.
  6. I didn’t participate until I became the one to ask questions. I had notes but my notes weren’t very detailed and I couldn’t answer the questions.
  7. I did not participate because I am uncomfortable with public speaking.
  8. I didn’t participate in the class discussion because I always participate and I want others to participate too.
  9. I did not participate because I had no information and I did not watch the lecture notes.
  10. I didn’t participate in the discussion because I cannot talk I’ve been sick and I have a bad cold. Although I wasn’t talking I was listening.
  11. I did not participate in this discussion because I did not have anything to say as I was not prepared with information.
  12. I am able to detect bad reasoning and distorted evidence. Some people might read something that you know is not true and you look for the true statement to see if they are correct or wrong.
  13. I didn’t participate in the class discussion because first of all I did have information for some topics but the host would pick someone else with that information that I had.
  14. I did not participate because I have not watched the video last night. I went to go watch it and it wouldn’t load. It would open then shut down. I was not able to watch it or get the notes from someone else.
  15. I did qualify my own views. I got my peers closer to the answer even though I got some wrong.
  16. I participated, but I kind of winged it. I saw/put myself in the other countries shoes to see what I would do, so when I did that, it made more sense.
  17. I did justify my own ideas and make connections to the topic because I talked about Japan’s invasion.
  18. During our class discussion I wasn’t called on but I helped explain what Staffon was trying to say about the treaty of Versailles and Germany.
  19. In response to Ruben’s question, I answered about one of Hitler’s points in the book he wrote.
  20. I had some notes to use during the discussion. I asked questions to clarify or restated the questions. I didn’t really get others into it, but I let others have a chance to answer.
  21. I read the questions for the inner circle.
  22. I need to speak a little more loudly and explain more what I’m trying to say. I also add more when people state their question or opinion.
  23. I participated in the Socratic seminar by announcing the questions to the outer circle. I also helped the inner circle on a question. I gave my opinion as well.
  24. I didn’t really answer anything but it was a learning experience for the things I did not know.
  25. I always ask questions to shorten down the answer.
  26. I participated by talking about it and answering the questions. Daniela asked and I finished answering.
  27. I wouldn’t get any points because I didn’t participate or answer any questions.
  28. I didn’t participate at all, well I mentioned something about the big four but it wasn’t noticed.
  29. I didn’t participate in the class discussion I should try to participate. I didn’t refer to evidence. I didn’t make a conversation.
  30. I used my knowledge for other discussions to answer the questions. I just answered the questions I knew and went from there. I should get a 7 or 8 out of 10.
  31. According to the lecture, I got evidence from the video lecture. In other words, I listened to the video and what the teacher says. In response to arguments question, I took notes from what people say. I think you should pass one by one and ask us questions to the students. I think you should grade us as helping each other by answering your complete question
  32. I won’t be able to participate in this class discussion since I had computer problems though now I will be able to look at lectures 1 and 2 and take notes.
  33. In this class discussion, I distributed a lot in the topic of the Treaty of Versailles. I told the class the four major problems of the Versailles treaty. I incorporated Jamilet into the conversation. I feel like I was well-prepared. I feel like I deserve a B due to my cooperation.
  34. This class discussion I did okay. I feel like I could have participated more than I did. I think I deserve a C for this class discussion because I participated and answered questions. I know I can do better for our next class discussion.
  35. I refer to evidence from the text under discussion and/or the research pertaining to the subject by using the notes. I’ve written down from the lecture and the knowledge I had. I move conversations by posing and responding to questions that relate to the broader themes or larger ideas by adding on to someone else’s comment.
  36. According to the Lecture 01 video, stated in my notes the causes of the Treaty of Versailles. Germany was the greater downfall to the treaty; their ships were at the bottom of the ocean after not agreeing to give it away. Their air force and submarines were delayed.
  37. According to the video lecture, I referred to the Versailles Treaty and how the DMZ (de-militarized zone) at the west of Rhineland is one of the things that would be a reason to start WW11.
  38. I didn’t participate, but I did pay attention to what others were saying.
  39. I didn’t participate in this class discussion because I didn’t have enough time to watch the whole video.
  40. I participated two times to the discussion, but I think I could have participated more. In the next discussion I will have written better notes and try and answer most of the questions or at least some.
  41. In this class discussion, I honestly didn’t contribute as much as I should have. But it is only because I did not watch the lecture last night so I didn’t really understand. I did listen and kind of understand now. I will go home tonight and watch 1st and 2nd lecture and will come tomorrow ready.
  42. According to the discussion I used my notes to help respond. And take notes of discussion. I asked some questions and responded to some. I do not actively import.
  43. I participated enough times in the conversation. I rephrased Jazmin’s quote and I should be graded with a B+. I didn’t talk during the conversation and I didn’t distract anyone.
  44. I responded to the question about the Treaty of Versailles. I said that the German army was forbidden to have tanks, an air force, and submarines.
  45. I hardly contributed in today’s discussion because all of the questions I knew were for the outer circle.
  46. I think I did a poor job. I could have contributed more and stated more points. I will use the common starters tomorrow during the discussion. I will take better notes and make sure everything is good and correct.
  47. I didn’t contribute in the class discussion but I will take notes on Lecture 2. I got you Dr. Petri. I promise. And I won’t be on my phone tomorrow.
  48. I was not participating today. But I did listen to how others talked and brought their knowledge and clarified others information into the Socratic circle. Now, I have more of an idea how to incorporate my ideas into the discussion.


More analysis to follow.