Make Learning Visible

Findings of our case study

"Make Learning Visible"


We wanted to find a way to implement Make Learning Visible (MLV) in the educational process and to improve the learning experience for students.

From 2015 to 2018 we conducted a study in order to assess the implications of "Make Learning Visible".

We found out:
Make Learning Visible changes everything.


the frame –
A private tutoring school in Zurich.


Running experiments in education is fairly difficult. Who wants to be a test subject for an unproven concept in a matter as important as education? Building up trust with educational institutions takes as much time as building your own school. So we founded Zürcher Nachhilfe LLC in 2014 based on three principles:

Responsibility - Communication

The principles and the newly founded school provided us with a starting point to develop ideas and new approaches in education.

With these core principles in mind, the school developed, which in turn gave us critical insight into the institutionalised conveyance of educational content. While the focus initially lay on building and running the school as a business, it soon became apparent that the core principles could be taken a step further.

One main reason for the initial success of our tutoring school was its approach to transparency and communication. With restricted funds we relied on a simple yet effective method to bring light into the so often murky process of studying.

We wrote short feedback notes of about 50 words after each lesson we provided. Parents received them by mail, and soon we developed a web platform where the notes could be read in a more timely fashion.

This was our first, qualitative approach to Make Learning Visible.

The next step was to develop a second, quantitative way of making learning visible.

In our third exam-prep course in 2016 we validated our method of collecting performance data. In 2017 we started working with a light version of Make Learning Visible with no graphical interface. In 2018 we will be offering a working prototype to our teaching staff with a graphical user interface that supports decision making and finally makes learning truly visible.

Over the course of these four years it became apparent that students, teachers, parents and institutions alike can improve their work significantly when they use qualitative and quantitative means to measure student, teacher and institution performance.

The school never sought any form of investment or financial support and is self-sufficient to this day.



Years of experience


Lessons since 2014



Feedbacks since 2014


Teachers employed


the method –
An example of practical research in education


Validating our approach of "Make Learning Visible" took three years. We were able to gather data only once a year, over the course of six months each time.

We restricted our field of research to a test-prep course that we conduct once a year for a standardised exam in Zurich. Students are between 14 and 16 years old and take the exam in Math, German and French.

As part of our research we conducted customer surveys after each course cycle in order to measure the impact of our work. These customer surveys were conducted with the parents who sent their children to our exam preparation course.

We wanted to assess how "Make Learning Visible" impacts perceived student performance as well as overall satisfaction with the course. The study has a dynamic character in the sense that the test environment was not static: each year since 2015, "Make Learning Visible" evolved and thus had a different impact on the course.

In 2015 we ran a course without collecting data and without the MLV-Approach in mind. We conducted the first standardised survey with this test group.

In 2016 we data-enabled the course by having students write mock exams. In this course we didn't provide teachers, parents or students with any data; the focus was to test the process of data collection. That alone improved the ratings significantly.

In 2017 we ran the second data-enabled course and were already changing the way we teach accordingly. We used the intelligence from the data collected and compiled it into PDF sheets in an early attempt to Make Learning Visible.

In 2018 we will be able to offer a dashboard for students, parents and teachers that gives a clear picture of student development. We are also capable of making predictions about the exam outcome with our statistical model. This is made possible by our continuous method of data collection.
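The source does not describe the statistical model behind these predictions. As a minimal illustrative sketch, a prediction of this kind could be as simple as an ordinary least-squares fit of final exam results on mock-exam averages; the feature, the grading scale, and all numbers below are hypothetical.

```python
# Minimal sketch: predict a final exam grade from mock-exam averages
# via ordinary least squares. The actual model and data used by the
# school are not described in the source; everything here is illustrative.

def fit_line(xs, ys):
    """Least-squares slope and intercept for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

# Hypothetical mock-exam averages and final exam grades (Swiss 1-6 scale)
mock = [3.5, 4.0, 4.5, 5.0, 5.5]
final = [3.6, 4.1, 4.4, 5.1, 5.4]

a, b = fit_line(mock, final)
predicted = a * 4.8 + b  # predicted final grade for a mock-exam average of 4.8
```

Continuous data collection is what makes such a fit possible at all: without repeated mock-exam measurements per student, there is nothing to extrapolate from.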


Results of the ZN customer survey (parents)

"On a scale from 1 to 10, how likely ist it that you recommend this service to friends and relatives?"

2015 – Average: 8.3 | NPS: 37
2016 – Average: 9.2 | NPS: 56
2017 – Average: 9.0 | NPS: 62

"On a scale from 1 to 10, how would you rate the progress of your child during the course?"

2015 – Average: 6.8
2016 – Average: 7.8
2017 – Average: 7.3
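The NPS figures above can be reproduced from raw ratings with the conventional Net Promoter Score formula. The survey's exact categorisation is not stated in the source, so this sketch assumes the standard cut-offs (promoters rate 9-10, detractors 0-6), and the sample ratings are hypothetical.

```python
def nps(ratings):
    """Net Promoter Score: percentage of promoters (9-10)
    minus percentage of detractors (0-6)."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Hypothetical batch of parent ratings
sample = [10, 9, 9, 8, 10, 7, 9, 6, 10, 9]
print(nps(sample))  # 7 promoters, 1 detractor out of 10 -> 60
```

This also explains how the 2017 average (9.0) can be lower than 2016's (9.2) while the NPS is higher: the score depends only on the share of promoters and detractors, not on the mean.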


the results –
Insights not only through analysis of datasets. 


Student attendance as well as satisfaction ratings continuously improved over the years. Whether this is to be attributed to our method of Making Learning Visible, or to the fact that we simply got better at running a private school, cannot be clearly determined.

However, running a private tutoring school gave us the most direct insight into the needs of parents, students and teachers. The needs we identified allowed us to steer our approach in the direction it has taken.

In our 2017 survey with our students we asked them to finish the following sentence: "If I had to explain the method of Mock-Up Exams to a friend of mine, I would say ...."

The results were very clear and showed that the students understood what Make Learning Visible is about and how it helps them. Most answers described how the method gave them insight into how their study progress developed or how it helped them identify issues.

Analysing the data sets themselves helped us understand why certain teachers we employed were better than others.

We were also able to discover a strong positive correlation between German-grammar proficiency and the overall success rate at the exam. This finding provides a most interesting starting point for further data-based research.
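A correlation like the one reported here is typically quantified with the Pearson coefficient. The study's underlying data is not public, so the scores below are invented purely to show the computation; a coefficient close to +1 would correspond to the strong positive relationship described above.

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical scores: German-grammar proficiency vs. overall exam result (%)
grammar = [55, 62, 70, 74, 81, 88]
overall = [48, 60, 66, 75, 80, 90]

r = pearson(grammar, overall)  # close to +1 -> strong positive correlation
```

Note that even a strong correlation of this kind says nothing about causation; it only marks grammar proficiency as a promising variable for follow-up research.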

But is it the fancy tech that makes learning more effective? No. It is still the people: teachers are empowered to deliver a more positive experience to students when they are assisted by these systems. It is this positive experience that makes the impact; the technology behind it merely helps make it happen.


