Release notes provide a summary of the latest updates, enhancements, and bug fixes.
Elevate Survey Update
PERTS is currently conducting an ESSA Tier-3 Study to gather evidence on which actions lead to improvement in Elevate data. Two questions (below) have been added to the end of the Elevate student survey to gather evidence for this study. Each student will see only one of the two questions each time they take the survey. The answers to these questions will not appear in reports.
My teacher has talked with my class about our responses to the survey.
My teacher has asked my class for ideas about how to make our class better.
Community and Network reports have updated masking rules for data in the fidelity tables. Unlike learning conditions metrics, fidelity data are only collected from a subset of randomly selected students. Because it is impossible to discern which students answered each fidelity question and the fidelity data are not attached to demographic information, those data are neither identifiable nor quasi-identifiable. Nevertheless, we will begin masking percentages in the fidelity table that summarize data of fewer than five unique students, in order to be more consistent with the way data are presented elsewhere in the reports.
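As a sketch of the masking rule described above (function and field names here are illustrative, not PERTS's actual implementation), the fidelity-table logic amounts to:

```python
# Illustrative sketch of the new fidelity-table masking rule:
# percentages that summarize data from fewer than five unique
# students are displayed as "N/A". Names are hypothetical.

MIN_STUDENTS = 5  # masking threshold described in the release notes

def mask_fidelity_cell(percentage, unique_student_count):
    """Return the displayed value for one fidelity-table cell."""
    if unique_student_count < MIN_STUDENTS:
        return "N/A"
    return f"{percentage:.0f}%"

print(mask_fidelity_cell(72.4, 12))  # enough students: value is shown
print(mask_fidelity_cell(50.0, 3))   # fewer than five: masked as "N/A"
```

This mirrors the N/A convention already used elsewhere in the reports, so fidelity percentages and learning-condition metrics follow the same small-sample rule.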
“All Students” is changing to “All” on reports.
Reports may hide some data (listing it as N/A) when fewer than five students answered. We do this to protect student privacy in your reports.
Reports will include partial responses from survey-takers who did not complete the survey. Learn More
Participation Dashboard “Respondents and Participation Rates by Survey” graph will count unique students instead of enrollments.
In the participation tab, there is a graph called “Respondents and Participation Rates by Survey.” Until now, the bars in this graph have corresponded to enrollments, whereas the line has corresponded to the participation rate. (See our documentation for definitions of these terms.) Going forward, the line will continue to represent the participation rate, but the bars will now correspond to unique respondents instead of enrollments. We believe this will give users a more accurate read on how many distinct students are participating in each cycle, across classes.
As a result of this change, users may notice that their # of Respondents, as reported by the bars in this graph, is lower as of October 11, 2021. This is because students who participated through multiple classes are no longer counted once in each class.
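The difference between the old and new bar heights can be sketched as follows (the data structures and field names are illustrative only, not PERTS's actual schema):

```python
# Hypothetical sketch of the change: the bars now count unique
# respondents rather than enrollments.

responses = [
    {"student_id": "s1", "class_id": "math"},
    {"student_id": "s1", "class_id": "science"},  # same student, second class
    {"student_id": "s2", "class_id": "math"},
]

# Old bar height: one count per enrollment (per class a student responded in)
respondents_by_enrollment = len(responses)

# New bar height: each student is counted once across all their classes
unique_respondents = len({r["student_id"] for r in responses})

print(respondents_by_enrollment)  # 3
print(unique_respondents)         # 2
```

A student who responds through two classes previously contributed two to the bar; now they contribute one, which is why the count may drop.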
Uncycled responses will not be counted in the participation dashboard.
In the Participation Dashboard, the total number of respondents is summarized in three places: the Participation Overview, Participation by Survey, and Participation by Class/Community sections. Until now, the respondent count in the Participation Overview included responses made before the first survey window, whereas the counts in the other two sections only included responses that fall within specific survey windows. Pre-survey-window responses are typically "test runs" in which an instructor completes the survey themselves, and they do not reflect true participation. We also received feedback from users that the mismatch between these numbers was confusing. Going forward, we will therefore stop counting pre-survey-window responses in the Participation Overview.
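The new Participation Overview count can be sketched as a simple date filter (the field names and dates below are hypothetical, not PERTS's actual data model):

```python
# Illustrative sketch: excluding pre-survey-window ("test run")
# responses from the Participation Overview count.
from datetime import date

survey_window_start = date(2021, 9, 1)  # hypothetical first survey window

responses = [
    {"respondent": "teacher1", "date": date(2021, 8, 20)},  # pre-window test run
    {"respondent": "s1", "date": date(2021, 9, 3)},
    {"respondent": "s2", "date": date(2021, 9, 4)},
]

# Old Overview count included every response; the new count keeps
# only responses made on or after the first survey window opened.
in_window = [r for r in responses if r["date"] >= survey_window_start]

print(len(responses))  # 3 (old behavior)
print(len(in_window))  # 2 (new behavior)
```

After this change, all three sections of the dashboard count responses the same way.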
Users are unlikely to notice much as a result of this change. A few Communities might have the number of respondents in the Participation Overview section go down by a handful or at most a few dozen respondents.