
Remote Learning Surveys – Spring of 2020 – Summary of findings

In March, I was asked by two Heads of School (one Canadian and one American) to work with their senior teams in the development of short surveys of parents on the topic of remote learning. As often happens, these conversations led to others, and customized surveys of faculty and students were also created.

As the COVID-19 crisis loomed overhead, my phone continued to ring. What followed became a very interesting exploration.

Between early April and early June, I ended up working with 48 independent schools, conducting remote learning surveys of parents, students, and/or teaching faculty. A total of 99 surveys were completed during this period. A small number of schools conducted their surveys twice. In this summary, results are included only once for each school.

Aside from the two originating schools, all other surveys were standardized, containing exactly the same questions: for parents, 8 questions plus an open comment box; for students, 9 questions plus 3 open comment boxes; and for faculty, 12 questions plus 1 open comment box.

I should say that this standardized format was an entirely new experience for me. For more than 25 years, I’ve been conducting customized comprehensive constituent surveys for independent schools, each including between 90 and 140 questions. This work involves extensive dissection of results, typically yielding more than 1,000 pages of tables and graphs for each survey, as well as my own 80-page ‘plain English’ interpretive analysis.

With a growing number of schools on the line this Spring, I had to figure out quickly what we could do to help, if anything, and how to do it without sacrificing our standard of quality. If I can’t do it right, I won’t do it at all. Very quickly I decided that, as a small group, we were in no position to develop customized surveys for a potentially large number of schools during this very tight time frame. Cross-tabbing and the creation of graphs, along with my own work on the analysis of results, for this number of surveys would have taken us well into 2021 if we’d gone the customized survey route. Schools needed top-line results ‘yesterday’ without detailed digging into the data.

The standardizing of survey content accomplished two things. First, it largely put me out of work, since my principal role in the business is the development of customized surveys and the interpretation and presentation of results. Second, it greatly expanded our capacity. With all the parent surveys identical in content, we were able to automate much of the data management side of things. Timelines were dramatically reduced from start to finish. On more than one occasion, I took a call from a Head of School, enquiring about the project, and two days later, they received results. Typically, for a comprehensive survey of parents, it takes six weeks to create, review, and revise (and review and revise, and review and revise) the questionnaire, six weeks to collect the data (with weekly badgering of parents), and four weeks to work through the analytics, placing me in a position to present results.

Unlike the schools, I was handed a sabbatical of sorts by the COVID-19 crisis, with a number of our comprehensive surveys postponed from this past Spring to the Autumn term. So… I had time to take this project on. I like data, lots of data. The more the merrier, so I wanted a large pool of data coming out at the other end. Hoping to encourage more schools to take advantage of the opportunity, I opted to keep the decision simple and made the surveys available at no cost. No variations (don’t ask!)… but no cost.

Working with a small number of Heads (special thanks to Nathan, Martha, and Larry), I shaped the surveys and made them ready to launch for interested schools.

This summary reflects results from 84 of the standardized surveys (and only one per school per type of survey). This includes: 44 parent surveys; 23 faculty surveys; and 17 student surveys.

From the parent surveys:

Results incorporate participation by 13,200 parents, with an average of 300 per school, a high of 725, and a low of 60. The median was 273. The average rate of participation among families was 79.2%. This is somewhat higher than our typical response rate for comprehensive parent surveys (recently in the range of 73%), owing both to the brevity of the questionnaire and to the urgency of the remote learning topic. The response period averaged about 36 hours, beginning with an email from the Head of School on the launch date and one reminder the next day, stating the deadline as midnight on the second day. Results were prepared and sent to the Head of School by noon on the third day.

The table, below, portrays summary results across 44 parent surveys. To read the first row of the table: the average number of participants across schools was 300, with a wide range from low to high. The standard deviation (164 in this case) shows how far on either side of the average we must go to capture about 68% of surveys.

For the second measure, agreement with the statement, “I trust the School’s leadership to make decisions that first protect the health and safety of my child and our family,” the average is 4.5, with a range from 4.3 to 4.8. From the standard deviation, we can say that just over two-thirds of all participating schools were rated for this measure in a very tight range: 4.5 +/- 0.12.
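For readers who like to see the arithmetic, here is a minimal sketch of that one-standard-deviation band. The ratings below are made-up, illustrative values only, not the actual per-school results:

```python
import statistics

# Hypothetical per-school average ratings for the leadership-trust
# question (illustrative only; the real data set had 44 schools).
school_ratings = [4.3, 4.35, 4.4, 4.45, 4.5, 4.55, 4.6, 4.65, 4.7]

mean = statistics.mean(school_ratings)
sd = statistics.pstdev(school_ratings)  # population standard deviation

# For roughly normal data, about 68% of values fall within one
# standard deviation of the mean.
band = (mean - sd, mean + sd)
within = [r for r in school_ratings if band[0] <= r <= band[1]]

print(f"mean = {mean:.2f}, sd = {sd:.2f}")
print(f"band = {band[0]:.2f} to {band[1]:.2f}")
print(f"{len(within)} of {len(school_ratings)} schools within one SD")
```

With evenly spread illustrative data like this, the share of schools inside the band will not land exactly on 68%; that figure applies to roughly bell-shaped distributions, which is what the tight real-world spread suggests.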

Similarly tight dispersion is revealed for all measures, with the largest spread for the final item in the table, “My child is adapting well so far to the remote learning environment.” While sub-groupings among schools are prohibitively small for formal analysis, my observations tell me that ratings for this measure are lower among parents of younger students. This, in turn, may reflect greater challenges on the part of parents of younger students in adapting to remote learning. This makes intuitive sense, but would require more research to confirm.

I should declare the potential limitations of this work:

  • different types of schools – e.g.: boarding, day; single sex, co-ed; very large, very small; full spectrum of grades versus partial; learning differences focus

  • timing of each school’s survey within the 8-week window

  • regional variation

While these ratings all appear strong, it is impossible to say how they would appear in comparison to those from public school parents. All schools participating in this project are independent schools. This said, having conducted more than 450 comprehensive attitudinal surveys for independent schools before this project (174 of them parent surveys), these results look strong to me.

The number of schools within the limiting distinctions described above is too small to break them apart into meaningful cells without abandoning any sense of statistical validity (so, please don’t ask). This said, there was one comparison I was hoping to make, and have been hoping to make for the past 25 years. With the standardized format, I was finally able to make a same-base comparison in attitudes between Canadian parents and American parents. I have been saying through the years that parents are parents are parents, but Heads of School always meet this with a raised eyebrow. In the end, here’s what I found.

Of 44 schools included, 27 are Canadian and 17 American. The average number of parent participants for each survey: Canadian – 302; American – 296.

In comparing the average results across Canadian schools with those across American schools: not one of the eight questions revealed a difference of more than 0.075 on the rating scale from 1.0 to 5.0. The average of the eight scores for Canadian schools was 4.256 and, for American schools, 4.250.

Moreover, for agreement with the statement, “I support the School’s move to remote learning as an interim measure,” ratings differed by just 0.0005, and this only by rounding.
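The same-base comparison itself is simple arithmetic once the questions and scale are identical: average the per-school scores within each country and take the difference. A sketch with made-up numbers (the actual data set covered 27 Canadian and 17 American schools):

```python
import statistics

# Illustrative per-school averages on one question, split by country.
# These are invented values, not the real survey results.
canadian = [4.2, 4.3, 4.25, 4.35, 4.15]
american = [4.22, 4.28, 4.3, 4.2]

ca_mean = statistics.mean(canadian)
us_mean = statistics.mean(american)
gap = abs(ca_mean - us_mean)

# Identical questions on an identical 1-to-5 scale make a simple
# difference of means meaningful -- the "same-base" comparison.
print(f"Canadian {ca_mean:.3f} vs American {us_mean:.3f}, gap {gap:.3f}")
```

The point of standardization is that nothing else needs adjusting: no re-scaling, no question mapping, just a straight difference of averages.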

So… parents are parents are parents. Nice to know. We’re more the same on either side of this border than we are different. Let’s not forget that this is the longest undefended border in the world… current circumstances not considered.

We have also prepared a table to include answers for all parents across all 44 schools. With 13,200 participants, there is a very strong basis for correlation analysis between answers. The strongest correlation, 0.78, was revealed for the two measures “I trust the School’s leadership…” and “The School has been proactive in keeping me informed…”. There was also strong correlation between “The School has been proactive in keeping me informed…” and “The School has demonstrated appropriate flexibility…”. So, not surprisingly, communication and flexibility in response during uncertain times are key correlates of trust in leadership.

From the student surveys:

Student results from 17 schools are shown in the table, below. Points worth noting include:

  • 5,270 students participated, ranging from Grade 5 to Grade 12.

  • For most measures, the format differed from the parent survey, asking students to choose among short lists of descriptors for various aspects of their remote learning experience.

  • For the majority of schools, students completed the online form in an organized, supervised setting, during classes or advisory sessions. A few schools were unable to administer the survey in this recommended setting and, as expected, saw lower participation rates. Participation, then, ranged from 39.4% to 100%, with an average of 65.9%. The standard deviation on this count was 21.2%.

From the faculty surveys:

Results from 1,219 teachers at 23 schools are shown in the table, below. The average rate of participation was 90.8%, with a standard deviation of 9.1%. Answer formatting took a mixed approach for faculty, with some 5-point ratings and some descriptor selection ratings.

This was certainly an engaging project, and as I often say, “Engagement is Everything”. Thanks to all the participating schools. I’m grateful for the learning opportunity, and the meaningful activity with which to keep myself busy. I hope it was a worthy experience for you as we continue to explore our meandering journey through these uncertain times.

Life is a journey. Life is good.

With respect,

Kevin Graham

