The Transparency Files: CAT*4 Results Part 2 (of 3)

Welcome to “Part II” of our analysis of this year’s CAT*4 results!

In last week’s post, we provided a lot of background context and shared out the simple results of how we did this year.  Here, in our second post, we are now able to begin sharing comparative data, focusing on snapshots of the same cohort (the same children) over time.  It is complicated because of four factors:

  • We only began taking the CAT*4 at this window of time in 2019 in Grades 3-8.
  • We did NOT take the CAT*4 in 2020 due to COVID.
  • We only took the CAT*4 in Grades 5-8 in 2021.
  • We resumed taking the CAT*4 in Grades 3-8 in 2022.

This means that there are only five cohorts that have comparative data – this year’s Grades 4-8.  And only two of those cohorts have comparative data beyond two years – this year’s Grades 7-8.  It is hard to analyze trends without multiple years of data, but we’ll share what we can.

Here is a little analysis that will apply to all five snapshots:

  • Remember that the maximum recordable score for any grade is “.9” of the next grade up – like getting a “6.9” for Grade 5.
  • Bear in mind that the metric we normally look at when comparing a cohort over time is whether or not we see at least one full year’s growth (on average) each year – here we are factoring in an expected two full years’ growth between 2019 and 2021.  [Feel free to refer to prior years’ results for specific analyses of both “COVID Gaps” and “COVID Catch-Ups”.]
  • In 2023 we took it in the “.1” of the school year; in all prior years we took it in the “.2”.  If we are being technical, therefore, “.9” would actually be the truest measure of a year’s growth since the time frame is “.1” less.  For the purposes of this analysis, I am going to round “.9” up and consider it a full year’s worth of growth.  [If you like to see the arithmetic spelled out, there is a small worked sketch just below this list.]
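
For those who want the arithmetic spelled out, here is a minimal worked sketch – my own illustration, not anything produced by the CAT*4 publisher, and the example scores below are hypothetical:

    # A small sketch of the growth arithmetic described above (illustrative only;
    # nothing here comes from the CAT*4 itself, and the scores are made up).
    # Grade-equivalent scores look like 5.2 -- the whole number is the grade,
    # the decimal is the month of the school year (September = 0).

    def expected_growth(years_between_tests, old_window=0.2, new_window=0.1):
        """One grade-equivalent year per calendar year, adjusted for a shift
        in the testing window (e.g. from the ".2" of the year to the ".1")."""
        return round(years_between_tests + (new_window - old_window), 1)

    def observed_growth(previous_score, current_score):
        """Observed growth, in grade-equivalent years, between two sittings."""
        return round(current_score - previous_score, 1)

    # Hypothetical cohort: 5.2 in 2022 (a ".2" window), 6.0 in 2023 (a ".1" window).
    print(expected_growth(1))         # 0.9 -- a "full year" given the earlier window
    print(observed_growth(5.2, 6.0))  # 0.8 -- just shy of a full year's growth

In other words, once the testing window moves up a month, “.9” of growth is the honest equivalent of a full year – which is why I am comfortable rounding it up.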

Here are the cohort snapshots:

What does this snapshot of current Grade 4s reveal?

  • Huge growth in Reading, Vocabulary and Writing Conventions.
  • Better context for Spelling.  Last week, we shared that Grade 4 Spelling (3.4) was one of only two instances out of thirty-six of scoring below grade-level across the whole school.  Here we can see that despite that (relatively) “low” score, annual growth is intact.  That’s the positive.  On the other hand, in order for this score to fully catch up to our school’s expectations, it will have to grow more than one year at a time over the next few years.
  • Better context for Math.  Although both of this year’s current scores are above grade-level expectation, we did not see the growth we would expect.  This is why we take the tests and provide our teachers with not only the results, but coaching on how to use the results.  Our Grade 4 Math Teacher now has the data she needs to help individual students fill gaps and best prepare students for math success in Grade 5.

What does this snapshot of current Grade 5s reveal?

  • That they are crushing it!  Max scores in all but one category, along with appropriate growth.
  • Better context for Computation & Estimation.  Both scores are well above grade level, there is almost-appropriate growth from one year to the next, and there is still room to grow.  Let’s go!

What does this snapshot of current Grade 6s reveal?

  • Again, overall really strong scores and mostly strong growth.
  • Better context for Writing Conventions.  It may not have maxed out, but we showed more than a year’s worth of growth.
  • Better context for Spelling.  We already knew that Grade 6 Spelling (5.6) was the other of the two instances out of thirty-six of scoring below grade-level across the whole school.  Now we know that it went down.  Hmmm…this could be an anomaly.  This is why we keep anecdotal records; maybe we’ll learn something about when Grade 6 took this section that helps explain the results.  Or maybe it is something.  Our Middle School Language Arts Teacher will be on it.
  • Better context for Computation & Estimation.  Again, it didn’t max out, but we can see huge growth from last year.

What does this snapshot of current Grade 7s reveal?

  • That they and their teachers are crushing it!
  • Better context for Computation & Estimation.  Even though this score is lower than their other (max) scores, it is still above grade-level and it grew by more than a year’s worth from last year.

No analysis of current Grade 8s needed, just appreciation for three years of near perfection.  Not a bad advertisement for OJCS Middle School.

To sum up this post, we have so much to be proud of in the standardized test scores of these particular cohorts over time.  The Math and Language Arts Teachers in Grades 3-8 have now begun meeting to go through their CAT*4 results in greater detail, with an eye towards what kinds of interventions are needed now – in this year – to fill any gaps (both for individual students and for cohorts), and how we might adapt our long-term planning to ensure we are best meeting needs.  Parents will be receiving their child(ren)’s score(s) soon and any contextualizing conversations will be folded into Parent-Teacher Conferences.

Stay tuned next week for the concluding “Part III” when we will look at the same grade (different students) over time, see what additional wisdom is to be gleaned from that slice of analysis, and conclude this series of posts with some final summarizing thoughts.

The Transparency Files: CAT*4 Results Part 1 (of 3)

[Note from Jon: If you have either read this post annually or simply want to jump to the results without my excessive background and contextualizing, just scroll straight to the graph.  Spoiler alert: These are the best results we have ever had!]

Each year I fret about how to best facilitate an appropriate conversation about why our school engages in standardized testing (which for us, like many independent schools in Canada, is the CAT*4, but next year will become the CAT*5), what the results mean (and what they don’t mean), how it impacts the way in which we think about “curriculum” and, ultimately, what the connection is between a student’s individual results and our school’s personalized learning plan for that student.  It is not news that education is a field in which pendulums tend to wildly swing back and forth as new research is brought to light.  We are always living in that moment and it has always been my preference to aim towards pragmatism.  Everything new isn’t always better and, yet, sometimes it is.  Sometimes you know right away and sometimes it takes years.

The last few years, I have taken a blog post that I used to push out in one giant sea of words and broken it into two, and now three, parts, because even I don’t want to read a 3,000-word post.  But, truthfully, it still doesn’t seem enough.  I continue to worry that I have not done a thorough enough job providing background, research and context to justify a public-facing sharing of standardized test scores.  Probably because I haven’t.

And yet.

With the forthcoming launch of Annual Grades 9 & 12 Alumni Surveys and the opening of the admissions season for the 2024-2025 school year, it feels fair and appropriate to be as transparent as we can about how well we are (or aren’t) succeeding academically against an external set of benchmarks, even as we are still facing extraordinary circumstances.  [We took the test just a couple of weeks after “October 7th”.]  That’s what “transparency” as a value and a verb looks like.  We commit to sharing the data and our analysis regardless of outcome.  We also do it because we know that for the overwhelming majority of our parents, excellence in secular academics is a non-negotiable, and that in a competitive marketplace with both well-regarded public schools and secular private schools, our parents deserve to see the school’s value proposition validated beyond anecdotes.

Now for the annual litany of caveats and preemptive statements…

We have not yet shared out individual reports to our parents.  First our teachers have to have a chance to review the data to identify which test results fully resemble their children well enough to simply pass on, and which results require contextualization in private conversation.  Those contextualizing conversations will take place in the next few weeks and, thereafter, we should be able to return all results.

There are a few things worth pointing out:

  • Because of COVID, this is now only our fifth year taking this assessment at this time of year.  We were in the process of expanding the range from Grades 3-8 in 2019, but we paused in 2020 and restricted 2021’s testing to Grades 5-8.  So, this is the second year we have tested Grades 3 & 4 on this exam at this time of year.  When we shift in Parts 2 & 3 of this analysis to comparative data, this will impact who we can compare when we analyze the grade (i.e. “Grade 5” over time) or the cohort (i.e. the same group of children over time).
  • Because of the shift next year to the CAT*5, it may be true that we have no choice but to reset the baseline and (again) build out comparative data year to year.
  • The ultimate goal is to have tracking data across all grades which will allow us to see if…
    • The same grade scores as well or better each year.
    • The same cohort shows at least a year’s worth of growth.
  • The last issue is in the proper understanding of what a “grade equivalent score” really is.

Grade-equivalent scores attempt to show at what grade level and month your child is functioning.  However, grade-equivalent scores are not able to show this.  Let me use an example to illustrate this.  In reading comprehension, your son in Grade 5 scored a 7.3 grade equivalent on his Grade 5 test. The 7 represents the grade level while the 3 represents the month. 7.3 would represent the seventh grade, third month, which is December.  The reason it is the third month is because September is zero, October is one, etc.  It is not true though that your son is functioning at the seventh grade level since he was never tested on seventh grade material.  He was only tested on fifth grade material.  He performed like a seventh grader on fifth grade material.  That’s why the grade-equivalent scores should not be used to decide at what grade level a student is functioning.
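
For readers who like to see that same logic step by step, here is a tiny illustrative sketch – again my own, not the test publisher’s, using the same hypothetical 7.3 example from the paragraph above:

    # Splitting a grade-equivalent score into its grade and month parts
    # (illustration only; this is not how the CAT*4 reports are generated).
    MONTHS = ["September", "October", "November", "December", "January",
              "February", "March", "April", "May", "June"]

    def describe(grade_equivalent):
        """Turn a score like 7.3 into "Grade 7, month 3 (December)"."""
        grade = int(grade_equivalent)
        month = round((grade_equivalent - grade) * 10)
        return f"Grade {grade}, month {month} ({MONTHS[month]})"

    print(describe(7.3))  # Grade 7, month 3 (December) -- earned on Grade 5 material

The caveat above still applies: the score describes how a student performed on grade-level material, not which grade’s material they have mastered.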

Let me finish this section by being very clear: We do not believe that standardized test scores represent the only, nor surely the best, evidence for academic success.  Our goal continues to be providing each student with a “floor, but no ceiling” representing each student’s maximum success.  Our best outcome is still producing students who become lifelong learners.

But I also don’t want to undersell the objective evidence that shows that the work we are doing here does in fact lead to tangible success.  That’s the headline, but let’s look more closely at the story.  (You may wish to zoom in a bit on whatever device you are reading this on…)

A few tips on how to read this:

  • We normally take this exam in the “.2” of each grade-level year, but this year we took it at the “.1”.  [This will have a slight impact on the comparative data.]  That means that “at grade-level” [again, please refer above to a more precise definition of “grade equivalent scores”] for any grade we are looking at would be 5.1, 6.1, 7.1, etc.  For example, if you are looking at Grade 6, anything below 6.1 would constitute “below grade-level” and anything above 6.1 would constitute “above grade-level.”
  • The maximum score for any grade is “.9” of the next year’s grade.  If, for example, you are looking at Grade 8 and see a score of 9.9, on our forms it actually reads “9.9+” – the maximum score that can be recorded.
  • Because of when we take this test – approximately one-two months into the school year – it is reasonable to assume that a significant responsibility for results is attributable to the prior year’s teachers and experiences.  But it is very hard to tease that out exactly, of course.

What are the key takeaways from these snapshots of the entire school?

  • Looking at six different grades through six different dimensions, there are only two instances out of thirty-six of scoring below grade-level: Grades 4 (3.4) and 6 (5.6) Spelling.  This is the best we have ever scored!  Every other grade and every other subject is either at, above, or way above grade-level.
  • For those parents focused on high school readiness, our students in Grades 7 & 8 got the maximum score that can be recorded for each and every academic category except for Grade 7 Computation & Estimation (7.6).  Again, our Grade 8s maxed out at 9.9 across the board and our Grade 7s maxed out at 8.9 across the board save one.  Again, this is – by far – the best we have ever scored.

It does not require a sophisticated analysis to see how exceedingly well each and every grade has done in just about each and every section.  In almost all cases, each and every grade is performing significantly above grade-level.  This is a very encouraging set of data points.

Stay tuned next week when we begin to dive into the comparative data.  “Part II” will look at the same cohort (the same group of students) over time.  “Part III” will look at the same grade over time and conclude this series of posts with some additional summarizing thoughts.

Exam Evolution

Once upon a time all the high schools in our community – both public and private – gave formal exams in Grade 9.  And so it was not only natural, it was an advantage for students at OJCS to take a series of exams during the Grades 7 and 8 years.  It checked (at least) three meaningful boxes:

  1. Our students learned valuable note-taking, study and organizational skills by going through the process of preparing for an exam.
  2. Our school learned valuable information about what our students did (or didn’t) learn as they were preparing to exit OJCS.  Exams that were able to stretch back across grades allowed OJCS to know not just what students learned that trimester or year, but what they learned while at OJCS.
  3. Our students gained real-world experience that they could utilize in service of the exams they would be taking in Grade 9 (and beyond).

And then…say it with me…COVID.

And ever since, the public high schools have not offered exams in Grades 9 & 10 and do not seem to be on a path towards doing so again.  Private schools in our community do offer exams in Grade 9.  And to the degree that context matters, we did some digging and it is additionally true that other independent schools in our community do offer exams in Grade 8 (or even earlier), and so if that is the water we are swimming in, perhaps it is that simple.  But part of being “independent” is that we get to make the decision for ourselves, and so it raises the question of what we ought to do at OJCS if one of our three boxes no longer applies.  Do the other two warrant the energy (and for some students the anxiety) for OJCS to continue to offer exams, and if so, in which grades and subjects?

Zooming out, there are lots of skills and experiences we teach and provide at OJCS that are not necessarily formally carried forward to high school.  I have learned this firsthand as a parent of two OJCS graduates, one now in university and one still in high school.  Those skills – whether technological, organizational, public speaking, self-advocacy or otherwise – may not have had direct application to this (high school) class or another, but they have definitely served them well as students.  If we were deciding whether or not to use iPads, or host hackathons, or a million other things based on what will be true in public school in grade nine, we might as well be a public school ourselves.  So we feel very comfortable suggesting that whether or not our graduates going on to public schools have formal exams in grade nine, it ought not determine what we do.  So much for “Box #3”.

Boxes #1 & 2 still feel very valuable.  While always managing and paying attention to student anxiety and their version of “school/life balance” – and always honouring IEPs and Support Plans – we definitely believe that the process of preparing, studying and taking formal exams is a value add for our students as they prepare for the added rigours of high school.  Grit and resiliency can only come about through authentic experience; sometimes you have to be a little uncomfortable, suffer a little adversity, be a little anxious.  So there’s “Box #1”.

Box #2 is interesting and at least for this year (and likely next) determinative.  We have lots of opportunities to utilize external benchmarks and standardized testing to provide data on what students who are graduating OJCS have (and haven’t) learned.  We have the most data on Math and Language Arts by virtue of the CAT-4, Amplify, IXL, etc.  If we wanted to gather similar results for Social Studies and/or Science, we could decide if and when to add those modules to our CAT-4.  The two places where we could benefit from better knowledge are Jewish Studies and French.  We have made significant progress in knowing what is true in French with last year’s introduction of the DELF Exam, but it only targeted the highest achieving students.  No such external standard exists for Hebrew / Jewish Studies.

And so for all of the above reasons, here is what will be true this Spring at OJCS.  Students in Grade 8 will take two exams.  They will all take a Jewish Studies Final (which is completely consistent with past and present practice) and they will take either a French Final or the DELF (the “French Final” being an in-house exam offered at both the Core and Extended (if needed) levels).  We’ll see how that goes, check results, solicit feedback and make any adjustments if needed for future years.

And with this totally normal little blog post in the middle of what is still a very complicated world and time…Winter Break.  See you in 2024.

OJCS Faculty Pre-Planning 2023: Connecting the Dots

We’re back! 

This has been an amazing Faculty Pre-Planning Week that has us poised for our biggest and best year yet!  Our faculty is made up of one group of amazing returning teachers and another group of talented new teachers, and the combination is magical.  A school is only as good as its teachers, so…OJCS is in good hands, with all arrows pointing up.  Enrollment is still coming in, and I can safely say that we will be a larger school than the year before for the sixth consecutive school year.

Do you ever wonder how we spend this week of preparations while y’all are busy getting your last cottage days or summer trips or rays of sun in?  

I think there is value in our parents (and community) having a sense for the kinds of issues and ideas we explore and work on during our planning week because it foreshadows the year to come.  So as you enjoy those last days on the lake or on the couch, let me paint a little picture of how we are preparing to make 2023-2024 the best year yet.

Here’s a curated selection from our activities…

The “Connecting the Dots” Café

Each year (16 years, 7 at OJCS and counting!), I begin “Pre-Planning Week” with an updated version of the “World Café”.  It is a collaborative brainstorming activity centered on a key question.  Each year’s question is designed to encapsulate that year’s “big idea”.  This year’s big idea?  Connecting the Dots!

As a growing school with so many departments, languages, programs, etc., we will aim this year to forge the connections, break out of the silos, simplify and streamline where appropriate, facilitate the communication, and do less even better – all so that our students, teachers and parents are able to experience OJCS as holistic human beings and to benefit from all we have to offer.

Here’s what connected collaboration looks like…

Conscious Leadership

Get used to hearing your children locating themselves “above” or “below the line” as we introduced some key ideas from The 15 Commitments of Conscious Leadership – read this summer by the Admin – to our fuller faculty.  Every now and again we introduce new “frameworks” that provide a shorthand, a vocabulary, and a culture that allow our teachers and our students to make sense of themselves and the world.  The big ideas of “Conscious Leadership” are completely anchored in our North Stars, in what we believe to be true about children, and in the way we think and talk about “regulation”, and along with those other values and ideas, they will help us continue to professionalize ourselves and upgrade our engagement with parents and students.  Do you want to learn along with us?  Check out the following and see if and how you might apply it to either your professional and/or parenting lives:

Next time you have to have a difficult conversation, just let us know if we are bringing you “below the line” and we can help make that positive “shift”.

Connecting the Dots: Behaviour Support @ OJCS

This will be big, the focus of attention at Back to School Night (9/19 @ 7:00 PM), and the subject of its own blog post in the weeks ahead, so please just consider this a “teaser”.  But you should also “connect the dots” between what I wrote near the end of last year’s post sharing the results of the Annual Parent Survey:

The one metric that I am disappointed to see take a dip down after three straight positive years is the last one, which essentially serves as a proxy for school-wide behavior management.  Four years ago we scored a 6.69 and I stated that, “we are working on launching a new, school-wide behavior management system next year based on the “7 Habits” and anchored in our “North Stars”.  I will be surprised if this score doesn’t go up next year.”  Well, three years ago it came in at 7.65, two years ago it climbed up to 8.19, and it remained high at 7.85 last year.  6.73 puts us back at square one – even if it rounds into the acceptable range, and even with a small sample size.  Parents at OJCS can expect to see significant attention being paid to overall behavior management in 2023-2024.

“Significant attention” has been and is being paid.  You can see it reflected in staffing and you will see it reflected here.  For now, remember…

…and know that…

…thanks to the hard work of a lot of people, our new framework is poised to make this our best year yet.  Curious?  Want to know more?  Stay tuned!

Did I do one of my spiritual check-ins on the topic of “Comfort & Community”?  Sure did!

Did Mrs. Reichstein and Ms. Beswick lead a session on “Bringing the IEP to Life”?

Did Mrs. Bennett, Mr. Max, Mrs. Thompson and I provide differentiated instruction on best practices for Classroom Blogs & Student Blogfolios?  Yessiree!

Did the OJCS Makerspace Team facilitate a hands-on creative session for teachers in the Makerspace now that it is becoming a hub for innovation at OJCS?  (This work is a direct result of an Innovation Capacity Grant from the Jewish Federation of Ottawa!)  Yup!

Did Ms. Gordon go over all the guidelines and protocols and procedures and rules and mandates to keep us all in the know?  No doubt!

Did our teachers have lots of time to meet and prepare and collaborate and organize and do all the things needed to open up school on Tuesday?  And then some!

All that and much more took place during this week of planning.  We are prepared to provide a rigorous, creative, innovative, personalized, and ruach-filled learning experience for each and every one of our precious students who we cannot wait to greet in person on the first day of school!

Wishing you and yours a wonderful holiday weekend and a successful launch to the 2023-2024 school year…

BTW – want to hear from our own teachers about who they are and how excited they are for this year?  Introducing our first podcast of the year… Meet the OJCS faculty!  Give our podcast a listen and reply below to let us know what you are most excited about this year!

OJCS Celebrates Innovation Day

One of my great joys over the last six (!) years has been watching the evolution from “Science Fair” to “STEM” to “STEAM” to “Innovation Day”.  Each iteration has brought our school – and our students – closer to a high dream of fusing science benchmarks with STEAM (Science-Technology-Engineering-Art-Mathematics) standards with Makerspace skills all blended together with creativity, student voice and presentation rubrics to become this thing that we now call “Innovation Day”.

However, we might as well have called it #OJCSNorthStarsDay since a day like this reaches so close to so many of them…

…”We learn better together”?  We sure did today as, for many, collaboration was the key to innovation.

…”We own our own learning”?  Students had lots of opportunities for choice-making which inspired their creativity.

…”A floor, but no ceiling”?  The sky was the limit as to how high they chose to aspire.

…”Ruach”?  Did they have fun?  Check out the smiles below and tell me.

I want to be super clear and name that not only did I have virtually nothing to do with the planning and facilitation of this day, I also had virtually nothing to do with the documentation of this day.  It is my pleasure to use my blog to showcase the work of those who did.

The primary drivers of Innovation Day at OJCS were Josh Ray, who serves as our Makerspace Lead and Middle School Science Teacher, and our Lower School Science Teachers.  Everything that you are going to see below is the fruit of their labours – with photo collages captured by Staci Zemlak-Kenter, who dabbles in social media while serving as our Development Director.  Together with Global Maker Day and the regularly scheduled lessons in our Makerspace, Innovation Day shows how OJCS serves as an incubator of innovation for its students (and teachers!).

So.  What was this day all about?

In a nutshell…this:

Grade 8 – Simple Machines Project

We often say that doing something with a machine requires less work. In this design challenge, you will be responsible for helping upgrade the gymnasium and physical education programming here at OJCS. Using your knowledge of simple machines, your task is to use the design thinking process to design, test, and build a simple machine prototype that enhances our physical education curriculum.

Your Goal: Working on your own or in a group, decide which simple machine game or project template you will use to build your project. After researching the six different types of simple machines, create a plan for your prototype. Determine what materials you will need, and the size and quantity of those materials. Then, plan how you will proceed. All sections will be presented as part of a five-section (Empathize, Define, Ideate, Prototype, Test) presentation to be displayed on a tri-fold board and presented in front of judges.

Grade 7 – Filtration Project

You are now working for the Clean Water Environmental Engineering Company and have been asked to design a new water filtration system for a small community with a polluted water supply. First, the company is going to look at different types of filter material to determine which ones work well. Then each group in the company will design a filtering system to clean up the polluted water.

Your Goal: In this hands-on project, you will investigate different filtering methods for removing pollutants from a dirty water mechanical mixture. You will design, build and test your own water filters.

Grade 6 – Electricity Project

Your Goal: In this project, you will build a series circuit that lights a bulb using a power source and conducting wires. Then predict what will happen to the brightness of your bulb if you add more bulbs or batteries to your series circuit, and test your prediction.

Grade 5

Grade 4 – Medieval Times & Pulleys/Gears

Grade 3

Grade 2

And our JK, SK & Grade 1s celebrated all things “Science” as well!

Did our students have an amazing day putting all their passion, talent, knowledge and creativity to good use?

I’d say “yes” – this was a great day of learning at OJCS!

La célébration de la semaine de la Francophonie 2023

While our teachers and parents are busy participating in this term’s parent-teacher conferences, I’m going to take a peek forward in anticipation of what should be a very exciting week at the Ottawa Jewish Community School.  Let me be the first to welcome you to the inaugural La célébration de la semaine de la Francophonie, featuring our second annual – but first with parents – Francofête.  [For a bit of background, you are welcome to revisit last year’s post about Francofête and how it builds upon past celebrations of French at OJCS.]

We are so pleased to let you know that next week (March 20-24) will be “La célébration de la semaine de la Francophonie 2023”!  The goals are simple – to spend a week marinating in French, celebrating the work of our students and teachers, highlighting the strides our French program has taken in the last few years, and elevating French beyond the boundaries of French class, into the broader OJCS culture.  The highlight will be the Francofête on Thursday, March 23rd at 6:30 PM in the OJCS Gym.

So…what to expect from “La célébration de la semaine de la Francophonie 2023”?

  • To set the ambience, we will have a customized French music playlist to greet our students each day upon entry and announcements and anthems en français.
  • On Monday, students will experience special activities and programs during their French classes.  This will include the dix mots de la francophonie (the ten words for this year’s francophonie).  What are they?  Glad you asked!  Learn along with our students:
    • Année-lumière
    • Avant-jour
    • Dare-dare
    • Déjà-vu
    • Hivernage
    • Lambiner
    • Plus-que-parfait
    • Rythmer
    • Synchrone
    • Tic-tac
  • Students will also learn about l’Organisation Mondiale de la Francophonie dans le monde (World Organization of La Francophonie) and Canada’s role therein.
  • On Tuesday, we will hold a major dress rehearsal for the Francofête.
  • On Wednesday, we will take the last ninety minutes of the day for a school-wide “station-to-station” program with our Grades 7 & 8 students guiding our students to teacher-led activities featuring our very own pop-up OJCS French Café where they will enjoy authentic (kosher) French treats.
  • Thursday brings us the Francofête!  Parents will be welcome to join us at 6:30 PM and each of our grades will share songs, dances, knowledge and the joie d’apprendre that comes with French learning at OJCS.
  • We’ll finish the week with a special round of French Reading Buddies!

And many more surprises…

So there you go…voilà!

Parents at OJCS will hopefully look forward to lots of opportunities to peek in and/or to see pictures and videos during this year’s celebration and to join us for the Francofête.  We’ll look forward to building on this in future years as we continue to showcase French in our trilingual school.

Great appreciation to our entire French Faculty and to Madame Wanda in particular who has led this year’s celebration.  This should be a week filled with ruach – errr…joie de vivre! [French North Star Alert!]

A Ruach Week Trip Around the OJCS Student Blogfolio-Sphere

I can assure you that this regular reminder of our student blogfolios, with its concomitant plea for your visitation, is not a function of being out of ideas of what to say (or because I have been busy making costume changes all Ruach Week!).  It is also not a function of believing that blogging is the primary or most important thing that we do at OJCS – it is not.  But because blogs and blogfolios do make up the spine around which much else is built, and because they are outward facing – available for you and the general public to read, respond and engage with – I do want to make sure that I keep them top of mind.

For most of my professional life, I have had two children in (my) schools where they maintained blogfolios.  I subscribed to them, of course, but I am not going to pretend that I read each and every posting, and certainly not at the time of publication.  So this is not about shaming parents or relatives whose incredibly busy lives make it difficult to read each and every post.  As the head of a school where blogfolios are part of the currency, I try to set aside time to browse through and make comments – knowing that each comment gives each student a little dose of recognition and a little boost of motivation.  But I am certainly not capable of reading each and every post from each and every student and teacher!

When I am able to scroll through, what I enjoy seeing the most is the range of creativity and personalization that expresses itself through their aesthetic design, the features they choose to include (and leave out), and the voluntary writing.  This is what we mean when we talk about “owning our own learning” and having a “floor, but not a ceiling” for each student.  [North Star Alert!]

It is also a great example of finding ways to give our students the ability to create meaningful and authentic work.  But it isn’t just about motivation – that part is easy to imagine.  When you look more closely, it is really about students doing their best work and reflecting on it.  Look at how much time they spend editing.  Look at how they share peer feedback, revise, collaborate, publish and reflect.

Even having come out of COVID-functioning, our classroom blogs and student blogfolios remain important virtual windows into the innovative and exciting work happening at OJCS.  In addition to encouraging families, friends and relatives to check it out, I also work hard to inspire other schools and thought-leaders who may visit my blog from time to time to visit our school’s blogosphere so as to forge connections between our work and other fellow-travelers because we really do “learn better together” [North Star Alert!]

So please go visit our landing page for OJCS Student Blogfolios.  [Please note that due to privacy controls, some OJCS students opt for avatars instead of utilizing their first names / last initials, which is our standard setting.  That may explain some of the creative titles.]

Seriously go!  I’ll wait…

English, French and Hebrew; Language Arts, Science, Math, Social Studies, Jewish Studies and so much more…our students are doing some pretty fantastic things, eh?

I will continue to encourage you not only to check out all the blogs on The OJCS Blogosphere, but also to offer a quality comment of your own – especially to our students.  Getting feedback and commentary from the universe is highly motivating and will help this snowball grow as it hurtles down the hill of innovative learning.

The Transparency Files: CAT4 Results Part 3 (of 3)

Welcome to “Part III” of our analysis of this year’s CAT4 results!

In Part I, we provided a lot of background context and shared out the simple results of how we did this year.  In Part II, we began sharing comparative data, focusing on snapshots of the same cohort (the same group of children) from 2019 to 2022 (with bonus data from 2018’s Grade 3).  Remember, based on which grades have taken the CAT4 when, we were only able to compare at the cohort level from 2019’s Grades 3-5 to 2021’s Grades 5-7 to 2022’s Grades 6-8.  [Remember that we did not take them at all in 2020 due to COVID.]  In the future, that part of the analysis will only grow more robust and meaningful.  We also provided targeted analysis based on cohort data.

Here, in Part III, we will finish sharing comparative data, this time focusing on snapshots of the same grade (different groups of children).  For now, we are only able to provide data on Grades 5-8 (from 2019, 2021, & 2022, with bonus data from 2018’s Grade 6), but in future years we’ll be able to expand this analysis downwards.

Here is a little analysis that applies to all four snapshots:

  • Remember that the maximum recordable score for any grade is “.9” of the next grade up – like getting a “6.9” for Grade 5.
  • We are no longer comparing the same children over time, as we do when analyzing a cohort, and therefore we aren’t looking for the same kinds of trajectories or patterns in the data.  You could make a case – and I might below – that this part of the data analysis isn’t as meaningful, but we go into it open to the idea that there may be patterns or outliers that jump out and warrant a thoughtful response.
  • As we have mentioned, the jump between 2019 and 2021 might have been the place one would have expected to see a “COVID Gap” (but we largely did NOT) and between 2021 and 2022 one might expect to see a “COVID Catch-Up”.

Here are the grade snapshots:

What do these grade snapshots reveal?

  • Again, keeping in mind that we are not tracing the trajectory of the same students, outliers like “Spelling” and “Computation & Estimation” for Grade 7 in 2021 help us understand that whatever is happening there is more a function of the cohort than the grade, which means that the remedy or intervention, if needed, has less to do with the curriculum or the program in Grade 7 and more to do with better meeting the needs of that particular cohort of children.  [And you can see how that played out and with what results by cross-checking with the cohort data in Part II.]  To be clear, we aren’t suggesting that the only explanation for their outlier status is about them – that it is the children’s fault!  While the deeper dive into the data helps clarify that this is not a “Grade 7” issue, it doesn’t absolve us from better understanding it or applying a remedy.
  • You can see a little of the reverse by looking at “Computation & Estimation” in Grade 6.  Now, in this case we are only dealing with being at grade-level or above, but you can see that 2021’s relatively higher score (7.7) is an outlier.  If the goal was to have each Grade 6 score nearly a grade-and-a-half above – which it certainly doesn’t have to be – you would look at the data and say this is a Grade 6 issue, and we’d be looking at how students come out of Grade 5 and what we do in the beginning of Grade 6.  Again, this is not about intervening to address a deficit, but I use it to point out how we can use the data to better understand outliers and patterns.
  • To the degree that this data set is meaningful, the trajectory that feels the most achievable considering we are dealing with different children is what you see in Grade 5 “Computation & Estimation” – small increases each year based on having identified an issue and applied an intervention.
  • The bottom line is essentially the same as having viewed it through the cohort lens: almost each grade in almost every year in almost each area is scoring significantly above its grade-level equivalencies.

Current Parents: CAT4 reports will be coming home this week.  Any parent for whom we believe a contextual phone call is a value add has, or will, be contacted by a teacher.

The bottom line is that our graduates – year after year – successfully place into the high school programs of their choice.  Each one had a different ceiling – they are all different – but working with them, their families and their teachers, we successfully transitioned them all to the schools (private and public) and programs (IB, Gifted, French Immersion, Arts, etc.) that they qualified for.

And now again this year, with all the qualifications and caveats, and still fresh out of the most challenging set of educational circumstances any generation of students and teachers has faced, our CAT4 scores continue to demonstrate excellence.  Excellence within the grades and between them.

Not a bad place to be as we open the 2023-2024 enrollment season…

If you want to see how all the dots connect from our first Critical Conversation (Jewish Studies), our second Critical Conversation (French), our CAT4 results, and so much more…please be sure to join us for our third and final Critical Conversation, “The ‘Future’ of OJCS” on Thursday, February 9th at 7:00 PM.

The Transparency Files: CAT4 Results Part 2 (of 3)

Welcome to “Part II” of our analysis of this year’s CAT4 results!

In last week’s post, we provided a lot of background context and shared out the simple results of how we did this year.  Here, in our second post, we are now able to begin sharing comparative data, focusing on snapshots of the same cohort (the same group of children) from 2019 to 2022 (with bonus data from 2018’s Grade 3).  In other words, for now, based on which grades have taken the CAT4 when, we can only compare at the cohort level from 2019’s Grades 3-5 to 2021’s Grades 5-7 to 2022’s Grades 6-8.  [Remember that we did not take them at all in 2020 due to COVID.]  In the future, this part of the analysis will only grow more robust and meaningful.

Here is a little analysis that will apply to all three snapshots:

  • Remember that the maximum recordable score for any grade is “.9” of the next grade up – like getting a “6.9” for Grade 5.
  • Bear in mind that the metric we normally look at when comparing a cohort over time is whether or not we see at least one full year’s growth (on average) each year – here we are factoring in an expected two full years’ growth between 2019 and 2021.  As we discussed last year, that might have been the place one would have expected to see a “COVID Gap” (but we largely did NOT), and between 2021 and 2022 one might expect to see a “COVID Catch-Up”.

Here are the cohort snapshots:

What does this snapshot of current Grade 6s reveal?

  • They consistently function a full grade, if not more, above the expected grade level.
  • That even with COVID we consistently see at least a year’s worth of growth each year across almost all the topics.
  • Technically, there is only six months’ worth of growth in “Mathematics” (6.9 to 7.5) from 2021 to 2022, but that is already significantly above grade level.
  • The one domain, Computation & Estimation, where they are barely below grade level (6.0), we can now properly contextualize by noting that they grew from 4.4 in 2021 to 6.0 in 2022 – more than a year’s worth of growth in a year (the year we would expect a bit of “COVID Catch-Up”).  This means that they should be more than on track to match all the rest of their scores being significantly above grade level when they take the test in 2023.

All in all…excellent news and trajectory for our current Grade 6s.

What does this snapshot of current Grade 7s reveal?

Not much!  This cohort has maxed out their scores in almost every domain in almost each year!  And in the few places they did not, they were still above grade level – like “Spelling” (4.9) and “Computation & Estimation” (5.5) in 2019 – and grew at least a full grade level each year so that by now, in Grade 7, it is max scores all across the board!  That is pretty awesome to see.

What does this snapshot of current Grade 8s reveal?

This class had a bit of a stranger trajectory, but essentially ends where we would like.  “Spelling” took a strange path, beginning way above grade level, plateauing with a dip where we should have seen two years’ worth of growth, and now fully rebounding to grade level.  “Computation” had a more normal curve, going from consistently a year below grade level to completely catching up and now being well above.

To sum up this post, we have a lot to be proud of in the standardized test scores of these particular cohorts over time.  The two areas (Spelling and Computation & Estimation) that were worthy of prioritization the last couple of years (this year’s Grades 6 & 8) were indeed prioritized.  We began providing professional growth opportunities for language arts teachers in our school on Structured Word Inquiry as part of a larger conversation about the “Science of Reading”.  [Please check out our Director of Special Needs, Sharon Reichstein’s, recent post on this issue, which I’ll also have more to say about in Part III.]  With regard to Computation & Estimation, we discussed it during last year’s November PD Day, which focused on “Data-Driven Decision Making”, and it has continued to be a point of emphasis.  The results indicate that these efforts have borne fruit.

The Math and Language Arts Teachers in Grades 3-8 have now begun meeting to go through CAT4 results in greater detail, with an eye towards what kinds of interventions are needed now – in this year – to fill any gaps (both for individual students and for cohorts), and how we might adapt our long-term planning to ensure we are best meeting needs.

Stay tuned next week for the concluding “Part III” when we will look at the same grade (different students) over time, see what additional wisdom is to be gleaned from that slice of analysis, and conclude this series of posts with some final summarizing thoughts.

The Transparency Files: CAT4 Results Part 1 (of 3)

As committed to “transparency” as I am, I find myself growing more and more ambivalent each year about how to best facilitate an appropriate conversation about why our school engages in standardized testing (which for us, like many independent schools in Canada, is the CAT4), what the results mean (and what they don’t mean), how it impacts the way in which we think about “curriculum” and, ultimately, what the connection is between a student’s individual results and our school’s personalized learning plan for that student.  It is not news that education is a field in which pendulums tend to wildly swing back and forth as new research is brought to light.  We are always living in that moment and it has always been my preference to aim towards pragmatism.  Everything new isn’t always better and, yet, sometimes it is.  Sometimes you know right away and sometimes it takes years.

I have already taken a blog post that I used to push out in one giant sea of words, and over time broke it into two, and now three, parts, because even I don’t want to read a 3,000-word post.  But, truthfully, it still doesn’t seem enough.  I continue to worry that I have not done a thorough enough job providing background, research and context to justify a public-facing sharing of standardized test scores.  Probably because I haven’t.  [And that’s without factoring in all the COVID gaps that come along with it.]

And yet.

With the forthcoming launch of Annual Grades 9 & 12 Alumni Surveys and the opening of the admissions season for the 2023-2024 school year, it feels fair and appropriate to be as transparent as we can about how well we are (or aren’t) succeeding academically against an external set of benchmarks, even as we are still just freshly coming out of extraordinary circumstances.  That’s what “transparency” as a value and a verb looks like.  We commit to sharing the data and our analysis regardless of outcome.  We also do it because we know that for the overwhelming majority of our parents, excellence in secular academics is a non-negotiable, and that in a competitive marketplace with both well-regarded public schools and secular private schools, our parents deserve to see the school’s value proposition validated beyond anecdotes.

Now for the annual litany of caveats and preemptive statements…

We have not yet shared out individual reports to our parents.  First our teachers have to have a chance to review the data to identify which test results fully resemble their children well enough to simply pass on, and which results require contextualization in private conversation.  Those contextualizing conversations will take place in the next few weeks and, thereafter, we should be able to return all results.

There are a few things worth pointing out:

  • Because of COVID, this is now only our fourth year taking this assessment at this time of year.  We were in the process of expanding the range from Grades 3-8 in 2019, but we paused in 2020 and restricted last year’s testing to Grades 5-8.  This means that we can only compare at the grade level from 2019’s Grades 5-8 to 2021’s Grades 5-8 to 2022’s Grades 5-8.
  • And we can only compare at the cohort level from 2019’s Grades 3-5 to 2021’s Grades 5-7 to 2022’s Grades 6-8.
  • This is the first year we have tested Grades 3 & 4 on this exam at this time of year.
  • From this point forward, assuming we continue to test in (at least) Grades 3-8 annually, we will soon have tracking data across all grades which will allow us to see if…
    • The same grade scores as well or better each year.
    • The same cohort shows at least a year’s worth of growth.
  • The last issue is in the proper understanding of what a “grade equivalent score” really is.

Grade-equivalent scores attempt to show at what grade level and month your child is functioning.  However, grade-equivalent scores are not able to show this.  Let me use an example to illustrate this.  In reading comprehension, your son in Grade 5 scored a 7.3 grade equivalent on his Grade 5 test. The 7 represents the grade level while the 3 represents the month. 7.3 would represent the seventh grade, third month, which is December.  The reason it is the third month is because September is zero, October is one, etc.  It is not true though that your son is functioning at the seventh grade level since he was never tested on seventh grade material.  He was only tested on fifth grade material.  He performed like a seventh grader on fifth grade material.  That’s why the grade-equivalent scores should not be used to decide at what grade level a student is functioning.

Let me finish this section by being very clear: We do not believe that standardized test scores represent the only, nor surely the best, evidence for academic success.  Our goal continues to be providing each student with a “floor, but no ceiling” representing each student’s maximum success.  Our best outcome is still producing students who become lifelong learners.

But I also don’t want to undersell the objective evidence that shows that the work we are doing here does in fact lead to tangible success.  That’s the headline, but let’s look more closely at the story.  (You may wish to zoom in a bit on whatever device you are reading this on…)

A few tips on how to read this:

  • We take this exam in the “.2” of each grade-level year.  That means that “at grade level” [again, please refer above to a more precise definition of “grade equivalent scores”] for any grade we are looking at would be 5.2, 6.2, 7.2, etc.  For example, if you are looking at Grade 6, anything below 6.2 would constitute “below grade level” and anything above 6.2 would constitute “above grade level.”
  • The maximum score for any grade is “.9” of the next year’s grade.  If, for example, you are looking at Grade 8 and see a score of 9.9, on our forms it actually reads “9.9+” – the maximum score that can be recorded.
  • Because of when we take this test – approximately two months into the school year – it is reasonable to assume that a significant responsibility for results is attributable to the prior year’s teachers and experiences.  But it is very hard to tease that out exactly, of course.

What are the key takeaways from these snapshots of the entire school?

  • Looking at six different grades through six different dimensions, there are only five instances out of thirty-six of scoring below grade-level: Grade 3 (Vocabulary 2.2, Writing Conventions 2.5, and Spelling 2.6), Grade 5 (Computation & Estimation 4.6), and Grade 6 (Computation & Estimation barely falling short at 6.0).
  • I’m not quite sure what to make of Grade 3’s Language Arts scores altogether.  Reading and Writing have been the most notable lagging skills for the Grade 3 cohort since their entry into Grade 2.  This is in part due to disruptions to their learning through their foundation-building years in Kindergarten and Grade 1.  In Grade 2, this cohort’s remediation was heavily focused on closing the gaps in reading and comprehension abilities, as developmentally this is what comes first.  The remediation focus has shifted to writing at the start of Grade 3, as this is a lagging skill that was already identified prior to the CAT-4 testing.  Supports and interventions have already been put in place to address this lagging skill and we have seen academic growth in these areas.  To put it more simply: These are our youngest students, whose early learning was the most disrupted by COVID, and they have never taken a standardized test before in their lives.  It will become a baseline that I imagine us jumping over quickly in the years to come – I’m inclined to toss them out as an anomaly.
  • Importantly, tracing the trajectory from our 2019 results to our 2021 results to 2022’s, we can now more conclusively state that Spelling and Computation & Estimation are no longer globally lower as a school relative to the other dimensions.  I will have more to say about why we believe this to be true in Parts II & III.

What stands out the most is how exceedingly well each and every grade has done in just about each and every section.  In almost all cases, each and every grade is performing significantly above grade-level.  This is a very encouraging set of data points.

Stay tuned next week when we begin to dive into the comparative data.  “Part II” will look at the same cohort (the same group of students) over time.  “Part III” will look at the same grade (different students) over time and conclude this series of posts with some additional summarizing thoughts.