
McREL
Blog

Our expert researchers, evaluators, and veteran educators synthesize findings from our research and blend them with best practices gathered from schools and districts around the world, bringing you unexpectedly insightful and uncommonly practical ideas that support changing the odds of success for you and your students. By aligning practice with research, we mix professional wisdom with real-world experience to offer ways to build student resiliency, close achievement gaps, implement retention strategies, prioritize improvement initiatives, build staff motivation, and interpret data and understand its impact.

Three simple steps for adapting lessons for English language learners


At some time or another, we’ve probably all had the frustrating experience of trying to converse with someone in a foreign language. We may catch snippets and phrases here and there, generally getting the gist of what’s being said to us. But when it comes time to open our mouths and speak, we’re tongue-tied. Our vocabulary (“I don’t know how to say sprint; I’ll just say run instead”), conjugation (“Let’s see … what’s the third-person plural of correr?”), and dialect fail us (“Shoot. I still can’t roll my r’s.”).

At this point, fear begins to grip us. We worry that our conversation partners, judging by our underdeveloped language skills, will conclude that our intellect is on par with a toddler’s.

That judgment would be wrong, of course. Just because we’re not yet conversant in a second language doesn’t mean we can’t grasp difficult concepts.

Teachers face a similar challenge when they have English language learners in their classrooms. While those students’ language skills may not yet be fully developed, they are still very much capable of grasping complex content. And learning can’t stop while students acquire their language skills—if it does, students will find themselves well behind their peers once their language acquisition catches up to their intellect.

In a new article for TeachHub.com, Jane Hill and Cynthia Bjork, authors of training-of-trainer materials that support the book Classroom Instruction that Works for English Language Learners, identify three simple steps for adapting lessons for English language learners to ensure they continue learning content while learning English:

Step 1: Know the stages of second-language acquisition. All learners progress through five stages of language acquisition, write Hill and Bjork: pre-production, early production, speech emergence, intermediate fluency, and advanced fluency. Knowing where their students are in this continuum helps teachers to plan their assignments accordingly.

Step 2: Tier student thinking across the stages of second-language acquisition. Hill and Bjork’s article provides a matrix that overlays the levels of thinking from Bloom’s (revised) Taxonomy onto the stages of language acquisition. They recommend teachers use this matrix to plan homework and class assignments accordingly.

Step 3: Set expectations. In this step, teachers combine steps one and two, identifying their students’ levels of language acquisition and assigning at-home and in-class practice accordingly.

By knowing where students are on this continuum and scaffolding their learning to the next stage of the continuum, teachers can ensure their students are gaining valuable knowledge while at the same time advancing in their language acquisition.

Read the whole article here.

Follow these links to learn more about McREL’s English language learners instructional leadership academies and books.

Bryan Goodwin is McREL’s Vice President of Communications & Marketing.

How to make staff development stick


Educators have probably all grown wary of drive-by staff development—the one- or two-day workshop that momentarily energizes staff, getting everyone excited about doing something new, but then, like a photograph left too long in the sun, fades over time.

So who are we to blame when this happens? Teachers? Is it their fault when guidance from a workshop doesn’t take root in classrooms?

Not so fast, according to McREL staff members Jane Hill and Anne Lundquist in an article that’s now online at the Education.com site.

School leaders actually hold the keys to making staff development stick.

Hill and Lundquist lay out several strategies that they have used effectively in the English Language Learner Instructional Leadership Academies they have led in Colorado, Nebraska, Virginia, Iowa and other states to turn drive-by workshops into something lasting in schools.

These strategies include identifying, up front, a leadership team, consisting of school administrators, district staff, and teachers, who take responsibility for helping teachers to implement what they learn in staff development sessions in their classrooms.

Leaders also need to recognize that any change worth making is difficult and takes people out of their comfort zones. To loosen folded arms or “this too shall pass” attitudes, leaders must work on getting everyone on the same page (something we at McREL call creating a “purposeful community”) so they see the need for change and believe doing something different will make a difference. They also need to take steps to overcome the anxiety and pushback that come with any difficult, meaningful change (which we call “second-order” change).

Hill and Lundquist’s article offers practical steps for how leadership teams can accomplish both of these objectives. While their article focuses on staff development related to improving the achievement of English language learners, their practical tips and advice for making professional development stick translate well to all kinds of teacher learning.

Read the entire article online.

Bryan Goodwin is McREL’s Vice President of Communications.

Choice is a matter of degree


Everyone likes choices, right? For years, Burger King has enticed us with offers to “have it your way.” Our cable and satellite services give us hundreds of channels (most of which we never watch). And online retailers, like Vans, now let us custom-design our own shoes.

If choices make us happier customers, shouldn’t we also give kids lots of choices about what they learn … to make them happier and more motivated learners?

Not necessarily. As I discuss in a column in this month’s issue of Educational Leadership, giving students some choices about what to learn (for example, choosing between books or reading passages) can motivate them, but giving too many choices can backfire.

As researchers Sheena Iyengar and Mark Lepper discovered in an experiment with college students, too many choices actually create a sort of deer-in-the-headlights effect. Iyengar and Lepper found that students given only six choices of topics for an extra-credit essay were far more likely to do the assignment (and do it well) than students given a choice of 30 possible topics. It seems the mental strain of choosing the best topic (and perhaps ensuing self-doubt over whether they had chosen correctly) caused the students with too many choices to invest less energy in the assignment or to simply abandon it altogether.

The bottom line appears to be this: choices for students are good, but as with all things, they should be doled out in moderation.

Read the full article to learn more about what the research says on student choice and project-based learning.

Bryan Goodwin is McREL’s Vice President of Communications and Marketing.

A more measured approach to measuring school performance


In an op-ed piece appearing in the August 25 issue of Education Week, Douglas Reeves, chairman of The Leadership and Learning Center, and McREL President and CEO Tim Waters liken current education accountability efforts to judging a person’s health based solely on weight.

“For almost a decade, the complex enterprise of education has been reduced to box scores,” they write. “Good schools have high scores, bad schools have low scores.”

Had Michelle Obama’s campaign to fight obesity taken a similarly superficial approach, she might have just called for an annual weigh-in of every child, shaming and blaming those who tipped the scales to unacceptable levels. That would certainly cause kids to shed a few pounds—but it would also create a new generation of eating disorders and diet-pill abusers. Fortunately, the first lady chose a more thoughtful path, calling on Americans to help their children eat healthier food, exercise more regularly, and monitor their health more frequently.

Yet in education, we continue to operate under a “testing=learning” formula, which, Reeves and Waters note, “is as superficial as the formula that ‘health=weight.’”

“If we want to avoid the educational equivalent of anorexia and pill-popping—teaching focused only on test content and test-taking strategy—then the accountability equation must include causes, not merely effects,” they write. “The accountability equation should be ‘learning=teaching+leadership.’ And an effective accountability system would measure all three elements of that equation.”

With Congress set to take up the Elementary and Secondary Education Act (the key piece of federal education funding legislation) this year or next, education accountability appears to be at a crossroads, write Reeves and Waters. We have the opportunity to either move forward with more thoughtful and sophisticated systems of monitoring progress or cling to outdated, simplistic, and harmful approaches.

Read the entire article—which offers several considerations for policymakers—here (Ed Week subscription required).

Bryan Goodwin is McREL’s vice president of communications and marketing.

Surprising insights into rural student mobility


Hollywood movies often paint small town and rural America as idyllic, tight-knit communities where few people move in or out, and a new face in town—like Kevin Bacon’s surly mug in the film Footloose—is enough to really get folks talking. If your impressions of rural life are based on 80s movie musicals, though, it’s probably time to update those perceptions.

A new study conducted by Andrea Beesley for REL Central—the education laboratory that McREL administers for the federal Institute of Education Sciences—has found that rural areas are actually places of high mobility, often higher than suburban or urban areas.

Research shows that, for a variety of reasons, highly mobile students tend to have lower achievement and higher dropout rates and to face more frequent disciplinary action. To make matters worse, rural schools often have smaller staffs and fewer financial resources, making it difficult for them to meet the needs of highly mobile students.

The study, which calculated student mobility percentages by locale (city, suburb, town, and rural) in five states (Colorado, Missouri, Nebraska, North Dakota, and Wyoming), found that:

  • Districts with extremely high student mobility were often rural, had higher than average shares of students eligible for free or reduced-price lunch, and were on or near American Indian reservations.
  • In Wyoming, rural locales had higher mobility than town or city locales.
  • In North Dakota, mobility rates were higher in both towns and rural areas than in cities and suburbs.
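To make the kind of comparison described above concrete, here is a minimal sketch of a locale-level mobility calculation. The district records and the simple "movers divided by total enrollment" formula are hypothetical illustrations, not the study's actual data or methodology:

```python
# Hypothetical sketch: computing a simple student mobility rate by locale.
# The REL Central study's methodology is more involved; this only
# illustrates aggregating district records and comparing rates by locale.

from collections import defaultdict

# (locale, students enrolled, students who moved in or out during the year)
# These district records are made up for illustration.
districts = [
    ("rural", 120, 30),
    ("rural", 80, 18),
    ("town", 400, 48),
    ("city", 2500, 250),
    ("suburb", 1800, 126),
]

def mobility_by_locale(records):
    """Return the percentage of mobile students per locale."""
    enrolled = defaultdict(int)
    movers = defaultdict(int)
    for locale, n_enrolled, n_moved in records:
        enrolled[locale] += n_enrolled
        movers[locale] += n_moved
    return {loc: round(100 * movers[loc] / enrolled[loc], 1) for loc in enrolled}

rates = mobility_by_locale(districts)
print(rates)  # in this toy data, the rural rate comes out highest
```

Even in this toy example, the small rural districts post the highest combined rate, echoing the study's finding that rural mobility can exceed that of cities and suburbs.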

Download the (free) report here.
Read Education Week reporter Sarah Sparks’ take on the report here.

Bryan Goodwin is McREL’s Vice President of Communications.

Technology Literacy Assessment


Recently, McREL assisted Wyoming’s Department of Education in determining how its districts and other states assess the technology literacy of 8th graders, as required by No Child Left Behind. The requirement to assess technology literacy does not specify how or by what criteria, so states define it in different ways and use various assessment instruments. Some states have put a lot of effort into the assessment, while others have given it little attention. Below is a summary of what we found. We know it is not a complete picture. What have we left off? Does your state have a different method of technology literacy assessment? Are there errors below? We welcome your comments.

State/Organization: Colorado

Link to Criteria: Levels were not found. Grade band profiles are found here.

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: TLAP

Strengths: Recently pilot tested and revised. Free to Colorado districts.

Weaknesses: Grant funded for Colorado only.

State/Organization: North Dakota

Link to Criteria: Unknown – you have to purchase the assessments to get the rubrics.

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: Atomic Learning – Tech Skills Student Assessment

Strengths: Focuses on how to use technology and how to apply it, and allows easy identification of areas of greatest instructional need. Includes customizable curriculum projects to target technology gaps.

Weaknesses: Unknown

State/Organization: Montana-based but used nationwide

Link to Criteria: Unknown

Standards Basis: 2002 ISTE NETS-S

Assessment Instrument: TAGLIT

Strengths: Includes assessments for administrators and teachers as well as students.

Weaknesses: Needs updating to the newest NETS-S.

State/Organization: South Dakota, Arizona, South Carolina, Georgia, Wisconsin, and other states

Link to Criteria: Technology skill set for 5th grade and 8th grade

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: Learning.com

Strengths: NETS-Aligned Resource. Blend of interactive, performance-based questions and multiple-choice, knowledge-based questions to measure and report technology literacy and skills for elementary and middle school students.

Weaknesses: Does not seem to support portfolio assessments.

State/Organization: New York (south central)

Link to Criteria: Unknown

Standards Basis: Unknown

Assessment Instrument: Tech Literacy

Strengths: This is a good example of what a Regional Education Service Center can accomplish.

Weaknesses: Small in scope with little background information.

State/Organization: Florida

Link to Criteria: Criteria found here.

Standards Basis: 2007 ISTE NETS-S modified for Florida.

Assessment Instrument: Student Tools for Technology Literacy

Strengths: After extensive feedback, indicators were modified. In April 2008, the complete tool was field tested with over 1,300 8th graders in several representative districts, resulting in minor revisions prior to its statewide availability.

Weaknesses: Florida only.

State/Organization: Washington

Link to Criteria: Tiers of 8th Grade Technology Literacy Indicators

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: Washington Assessments for Education Technology

Strengths: Project based and integrated across content areas.

Weaknesses: Only social studies and the arts at this time.

State/Organization: North Carolina

Link to Criteria: Little found. Example report with some criteria found here.

Standards Basis: 2004 Computer/ Technology Skills North Carolina Standard Course of Study

Assessment Instrument: Test of Computer Skills

Strengths: Strong development process.

Weaknesses: Does not seem to incorporate project learning.

State/Organization: New Jersey

Link to Criteria: NJTAP-IN Rubric

Standards Basis: New Jersey Educational Technology Standards 8.1.

Assessment Instrument: No specific instrument has been identified, but the state has issued an RFI for one.

Strengths: Integrated with state planning and support structures found here.

Weaknesses: No specific instrument has been identified.

Other assessment sources used by schools:

State/Organization: InfoSource Learning

Link to Criteria: Unknown

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: Simple Assessments

Strengths: Used in over 1,200 districts nationwide. Free and easy to use.

Weaknesses: Seems oversimplified.

State/Organization: Intel

Link to Criteria: Each project has a rubric on the specific project guide page. An example can be found here.

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: Technology Literacy

Strengths: NETS-Aligned Resource.

Weaknesses: Unknown

State/Organization: State Educational Technology Directors Association (SETDA)

Link to Criteria: Framework for Assessment of Technology Literacy

Standards Basis: 2007 ISTE NETS-S

Assessment Instrument: No specific recommendation. Analysis can be found here.

Strengths: A larger group of stakeholders from multiple states designed the toolkit.

Weaknesses: Based on older NETS-S standards.

State/Organization: National Assessment of Educational Progress (NAEP)

Link to Criteria: 2014 NAEP Technology and Engineering Literacy Framework – Pre-Publication Edition

Standards Basis: NAEP Standards developed by four cooperating organizations.

Assessment Instrument: NAEP Technology and Engineering Literacy Assessment (TELA)

Strengths: Large effort by national experts with collaboration from the International Society for Technology in Education (ISTE), the International Technology and Engineering Educators Association (ITEEA), the Partnership for 21st Century Skills (P21), and the State Educational Technology Directors Association (SETDA).

Weaknesses: Incorporates engineering from STEM (some would consider this a strength).

RESEARCH NOTES:

In a report titled Tech Tally: Approaches to Assessing Technological Literacy (Gamire & Pearson, 2006), it was determined that because “doing” is central to students gaining technological literacy, traditional assessments will not work; technological literacy must be assessed in ways that are more authentic. According to the State Educational Technology Directors Association (SETDA), a knowledge-based assessment is insufficient on its own; if such an assessment is used, it should serve as a base in combination with a performance-based, portfolio-based, or project-based assessment. The report developed six principles to guide the development of assessments of technological literacy:

  1. Assessments should be designed with a clear purpose in mind.
  2. Assessment developers should take into account research findings related to how children and adults learn, including how they learn about technology.
  3. The content of an assessment should be based on rigorously developed learning standards.
  4. Assessments should provide information about all three dimensions of technological literacy—knowledge, capabilities, and critical thinking and decision making.
  5. Assessments should not reflect gender, culture, or socioeconomic bias.
  6. Assessments should be accessible to people with mental or physical disabilities.

Educational Technology Standards for Administrators (NETS-A)


McREL recently presented at the International Society for Technology in Education (ISTE) Conference in Denver, CO. It appeared that not many attendees knew about ISTE’s newest National Educational Technology Standards for Administrators (NETS-A). Take a look at them below and tell us, in a comment on this post, where you think leaders in your schools are strong and where more focus is needed. The NETS-A include:

  1. Visionary Leadership. Educational Administrators inspire and lead development and implementation of a shared vision for comprehensive integration of technology to promote excellence and support transformation throughout the organization.
  2. Digital-Age Learning Culture. Educational Administrators create, promote, and sustain a dynamic, digital-age learning culture that provides a rigorous, relevant, and engaging education for all students.
  3. Excellence in Professional Practice. Educational Administrators promote an environment of professional learning and innovation that empowers educators to enhance student learning through the infusion of contemporary technologies and digital resources.
  4. Systemic Improvement. Educational Administrators provide digital-age leadership and management to continuously improve the organization through the effective use of information and technology resources.
  5. Digital Citizenship. Educational Administrators model and facilitate understanding of social, ethical, and legal issues and responsibilities related to an evolving digital culture.

You can find the detailed version of NETS-A here.