Itemlive

North Shore news powered by The Daily Item


LTE: MCAS scoring is misleading

October 1, 2023

To the editor:


My letter is in reference to the front-page article “MCAS scores vary widely in local area” published in the Sept. 21 edition. The article listed scores, along with percentage increases and decreases compared to the statewide averages, for Lynn, Lynnfield, Marblehead, Peabody, Swampscott, Saugus, and Nahant.

Historically, MCAS test results have been compared across the school districts of Massachusetts. That should never have happened, given the many differences between districts, and those differences are never mentioned when test results are reported for the general public to read. For example, what is the percentage of English language learners in each district?

I am a retired high school teacher who taught in a school where a good number of the enrolled students had little to no formal education in their native countries, and English was NOT their first language. Yet those students took the test — which is written in English — and their results were included in the district’s stats.

Also, in each district, what are the statistics on attendance? Or the percentage of homeless students? Or the percentage of students with food insecurity?

These are all valid issues that can impact a student’s score. They are also factors that need to be laid out for the reader, but they are not. I’m not blaming those who wrote the article; I’m blaming the Department of Education.

Add to those discrepancies the fact that the MCAS is given to a different group of students each year, yet the scores are compared. For example, the current 10th grade students (graduating Class of 2026) will take the test this year. Then next year’s 10th grade students (graduating Class of 2027), a completely different group of students, will take the test, and their scores will be compared to the previous year’s results to see if there is any improvement.

How is that an equal comparison? What should happen instead is that the same students are tracked from year to year to see if there is improvement.

Anyone who disagrees with me should become an educator and spend their career teaching in a few different districts. I was able to see some comparisons in my years teaching in Lynn, Salem, and West Newbury. It was an eye-opening experience.


Sincerely,

Virginia LeBlanc
Swampscott

