Returning to full-time studying presented me with an incredible opportunity to make a personal evaluation of our education system and academic institutions. After all, there are few of us aged 40 attending courses full-time.
Now, let’s make one thing crystal clear up-front: I’m not going to claim that it is a stupid idea to go to university - or any other form of classical higher education for that matter. I don’t share the view of the loud online voices that say education is dead and must be superseded by online courses or some other anarchistic format. Even after returning and seeing a lot of problems with today’s academic ways, I believe more than ever that the university or any other form of higher education is the right place to go and prepare yourself for the future …
… IF it is ready to make some drastic changes … IMHO
Having worked in the industry for over 12 years, I’ve seen young colleagues come and go. Almost every single one of them - including myself back in 2000 - had to go through a five-year period of on-the-job training. This junior-ship is a phenomenon recognized not only by me but by most of my former colleagues. It seems to be one of those golden truths.
This means that companies need to invest large amounts of time and money before a new recruit actually becomes profitable. This is a shame, because these youngsters also hold the incredible power of flexibility and can come up with very innovative ideas, still being unbiased and naive in the good sense. Often these benefits go down the drain due to their inexperience at the basic operational level. A lack of communication skills and unfamiliarity with common organizational tasks - like conducting meetings, analysis,… - or practices - like customer- and goal-orientation - pull them into a whirlpool while they adopt these virtues over time.
It’s not hard to see that this is a pity not only for the companies but also for the youngsters themselves. Add to this the typical pattern where they often leave their first company within that same period - not only because of financial opportunities but also for the very valid reason of discovering more interesting topics - and it boils down to a very negative experience for all parties.
This search for more interesting topics is in fact a symptom of another problem, one that originates from the sociocultural obligation for teenagers to make the most important choice of their lives at the age of 18. Maybe the age isn’t what matters here, but rather the fact that they have little to no experience to base their decision on at that time.
This problem is amplified even further by the sociocultural idea that grades in high school directly determine the higher education one should pursue. Although I have no empirical evidence to back this up, I tend to believe that a large number of students have talents that may well lie far outside the academic world, but are never discovered and are lost during the years they try to blend into their expected educational choice.
From a completely different angle, the higher educational institutes also suffer from this situation. Once they release their students into the wide, wild world, they completely lose touch. On one hand, the students become employees and feel freed from the educational chains. On the other hand, being submerged in this five-year period of industrial adolescence, they are unable to get a good overview of the world they enter and are incapable of giving valuable feedback to their origins.
The higher educational institutes are therefore deprived of very important feedback on how their work percolates into the real world. Now I know for a fact that they try very hard to keep tight connections to the industry, but the level at which this is done is often not that of the actual education. Both parties try to leverage each other’s merits in a rather direct way through one-off research projects and temporary placements of doctoral researchers. But the gap to the actual courses is still very wide.
So students aren’t getting the best support, companies are required to invest in another five-year period of on-the-job training, and the higher educational institutes lose touch with their target.
These thoughts came together while I was preparing for my psychology of perception exam. The final chapter of that course dealt with the acquisition of skills. Although I already knew about the 10,000-hour rule of expertise, it once again struck me that to really master a skill, one needs about 10 years of dedicated effort, resources and motivation.
Let’s compare this to the five years that my colleagues have to go through: during these five years they need to accumulate 300 credits - 180 during their bachelor and another 120 during their master. Each credit represents about 25 hours of effort. A simple multiplication shows that during these five years they can dedicate about 7,500 hours to becoming a master. So on the effort scale alone, they hardly reach 75% of the amount required to reach an expert level.
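For those who like to see the arithmetic spelled out, here is a minimal back-of-the-envelope sketch (assuming the common convention of about 25 hours per credit and the 10,000-hour figure from the expertise literature):

```python
# Back-of-the-envelope calculation: study hours versus the 10,000-hour mark.
# Assumes 25 hours per credit and a 180 + 120 credit bachelor/master split.
bachelor_credits = 180
master_credits = 120
hours_per_credit = 25
expertise_hours = 10_000

total_hours = (bachelor_credits + master_credits) * hours_per_credit
print(total_hours)                          # 7500
print(100 * total_hours / expertise_hours)  # 75.0 (percent of the mastery mark)
```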
Now take into account that during these five years these students are confronted with courses that hardly ever contribute anything to their future expertise, and this effort is reduced even further; in the end they may only be halfway on their journey.
And that’s a pity if you consider that during this same time, they get about 100% of the resources and motivation to reach that goal. Once this period is over, the resources and motivation drop drastically. Suddenly they are required to enter the ranks of the workforce and make a living, start thinking about their own home, get married, have kids. The luxurious student life where everything is - or rather can be - focused around their studies ends abruptly, and the motivation is reduced to classic best-effort and good-enough levels.
So just at the point where their efforts could finally be focused 100%, the resources and motivation are reduced, creating once again an imbalance and extending the time to reach real mastery once more.
Also on a practical level, higher education tends to miss out on opportunities to really support its students. Too often projects are submitted and graded. Full stop. Hardly any feedback is given. So I got a 14 out of 20 on that submitted work. What I did to get that 14 is absolutely irrelevant, because that part was OK. But what did I do wrong to lose 6 points? That is where the actual learning experience lies. Things sometimes get even worse: take a test and do two projects during the semester, receive no feedback, and get related questions at the exam. You might just have made the same mistake twice, without even knowing it.
As I learned once more from my psychology course, education or training in general is far more effective in a one-on-one setting with direct feedback. During this same semester I also took robotics. That course follows exactly this idea. Students are required to propose a project and schedule 4 to 5 meetings with the professor during the semester, where they talk about their project and need to show progress at every meeting. At first I found this very weird - I even started to rebel against it - but now, after thinking about it for some more time, I believe this is actually how it should be done.
First of all, it allows students to explore their interests within the domain of the course. This in itself is an enormous motivator, puts the course directly into perspective and connects it to the actual knowledge of the student. As an expert, the professor can guide the students towards goals that validate their new-found knowledge. But the benefit goes both ways. The professor gets access to a rather large group of unbiased people who go looking for interesting new ideas and read papers that might introduce him to material he simply cannot cover himself due to his own resource constraints.
So let’s start from this interaction and expand this idea. Let’s say that we change the way higher education is structured and make it an open (minimal) five-year period where students pursue two tracks: 50% of their time (let’s say literally half a day) they join an actual company and work there like any other part-time employee. The remaining half day is dedicated to taking courses - in the way described above, in a one-on-one interactive relationship with the professor of their selected courses.
Selected courses … let’s also revisit the idea of a curriculum and allow students to choose their courses - not all of them, I agree that some should still be mandatory. Let them organize these courses in a way that fits them and spread them over these five years. At the end of these five years, they should have accumulated a predefined amount of credits to actually graduate.
An important side-note here is that they don’t take on one specific curriculum. During these five years they may discover that their actual interest lies somewhere else - maybe they encountered an interesting company, or a course suddenly rang a different bell, or they simply had some time to explore and now start to zoom in on one specific topic of interest. The only requirement is that at the end of their studies, they should have accumulated enough credits to comply with the minimal regulations of the discipline they want to graduate in at that time.
As I see it, this form of education would be a win-win-win situation, where students, industry and the higher educational institutes would all benefit.
Students would be able to postpone their actual decision and have more time to truly experience their talents and interests. They would also learn the basic corporate ways and be much better prepared for what’s to come, allowing them to contribute their potential to the fullest.
Companies would of course benefit from this situation in two ways: first of all, they get access to an incredible workforce at very low cost. During the five years of part-time internship, today’s cost of training a junior would be amortized. Secondly, companies hiring graduates would no longer have to wait five years to see these juniors become productive. They would even see a much improved intake process, where the graduate actually knows the company or at least has enough experience to know whether this is their primary interest. Job-hopping rates would probably be a lot lower, as would the fear of losing someone after the initial period.
Finally, higher education would benefit from this very tight integration with the actual industry. Seeing students go back and forth between workplace and class inherently brings back enormous amounts of real-world situations and problems that can be integrated into the courses. Professors would get first-hand feedback about the application of their courses in real-world situations. This way, problems encountered at work could be discussed in class, and perhaps new and great ideas could emerge, resulting in another win for the industrial partners, who now get an entire research group along with the employed student.
Now, to avoid the obvious objection, I need to add one special case: the academic world itself also needs researchers, and they often need to be trained in science that isn’t readily applied in the industry yet. The solution, however, is also obvious: the universities themselves can be on the employers’ side here and take on students, allowing them to work on research-oriented projects for the same periods. Letting future doctoral students work alongside current researchers from their first year onward allows them to grow into this world, acquiring the skills needed to write papers and conduct research in general, while receiving direct feedback.
Maybe I’m knocking at an open door. Maybe there are valid reasons why higher education hasn’t been able to go down this road. But at this point I don’t see any valid reason why this shouldn’t be a path to pursue. It is done in other education systems, resulting in pretty good craftsmen leaving school.
Now I don’t want to leave on a negative note, so I have to give at least two examples of efforts that try to implement these ideas: Academics For Companies is a non-profit organization that aims to incorporate actual experience into students’ educational careers. It’s still a voluntary effort, but the ideas are a big step in the right direction. If this effort were tightly integrated into the entire curriculum, it could probably come close to the vision depicted above.
Next, I have to stress that the flexibility to choose courses already exists to a great extent at my faculty of computer sciences. After selecting one of the seven master disciplines, my master curriculum consists of 10 mandatory courses and 10 courses I could select myself - given some discipline-related pooling constraints. It may not yet be the long-term flexibility I’m after, but it already allows students to fill out their curriculum themselves to a reasonable extent. I can only hope that this will be expanded both in quantity and in span over the entire education.
Maybe I have to let history take its course and not expect a revolution. Maybe this is actually the beginning of where we need to get.
I really hope so.