The designs of many student-facing learning analytics (SFLA) dashboards are insufficiently informed by educational research and lack rigorous evaluation in authentic learning contexts, including remote laboratory practical work. In this article, we present and evaluate an SFLA dashboard designed using the principles of formative assessment to provide feedback to students during remote lab activities. Feedback is based on graphical visualizations of student actions performed during lab tasks and comparison to expected procedures using TaskCompare, our custom, asymmetric graph dissimilarity measure, which distinguishes students who miss expected actions from those who perform additional actions, a capability missing in existing graph distance (symmetric dissimilarity) measures. Using a total of N = 235 student graphs collected during authentic learning in two different engineering courses, we describe the validation of TaskCompare and evaluate the impact of the SFLA dashboard on task completion during remote lab activities. In addition, we use components of the motivated strategies for learning questionnaire as covariates for propensity score matching to account for potential bias from self-selection of dashboard use. We find that students who used the SFLA dashboard achieved a significantly higher task completion rate (nearly double) than those who did not, with a significant difference in TaskCompare score between the two groups (Mann-Whitney U = 453.5, p < 0.01; Cliff's delta = 0.43, large effect size). This difference remains after accounting for self-selection. We also report that students' positive rating of the usefulness of the SFLA dashboard for completing lab work is significantly above a neutral response (S = 21.0, p < 0.01). These findings provide evidence that our SFLA dashboard is an effective means of providing formative assessment during remote laboratory activities.
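
The abstract does not define TaskCompare itself, but the asymmetric-dissimilarity idea it describes can be illustrated with a minimal sketch. Assuming (purely for illustration) that a student's lab session and the expected procedure are each represented as sets of directed edges between actions, one hypothetical asymmetric measure separates the fraction of expected edges the student missed from the fraction of performed edges that were additional; a symmetric graph distance would conflate these two components.

```python
# Illustrative sketch only: this is NOT the published TaskCompare measure.
# Assumption: each task graph is a set of directed edges (action, next_action).

def asymmetric_dissimilarity(expected, observed):
    """Return (missed, extra) for two edge sets.

    missed: fraction of expected edges absent from the student's graph
            (student skipped expected actions).
    extra:  fraction of observed edges absent from the expected graph
            (student performed additional actions).
    A symmetric distance would merge these into one number, losing
    the distinction the abstract attributes to TaskCompare.
    """
    missed = len(expected - observed) / max(len(expected), 1)
    extra = len(observed - expected) / max(len(observed), 1)
    return missed, extra


# Hypothetical example: student skips one expected step and adds one step.
expected = {("start", "measure"), ("measure", "record")}
observed = {("start", "measure"), ("start", "calibrate")}
missed, extra = asymmetric_dissimilarity(expected, observed)
# missed = 0.5 (one of two expected edges absent)
# extra  = 0.5 (one of two observed edges unexpected)
```

Because the two components are reported separately, a student who omits steps (high `missed`, low `extra`) receives different feedback than one who completes the procedure but adds unnecessary actions (low `missed`, high `extra`).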