One of my friends is in college and is currently feeling the full idiocy of a system that was only beginning to be rolled out as I left. Let me explain how it works.
Essentially, the system is meant to test the students’ solutions to homework problems. This is done by providing a solid definition of what the input and output of the application are supposed to be on the standard in/out channels, and setting up a whole bunch of test cases, including a memory limit and a CPU time limit. Students submit their source code to the system, which compiles it and runs all the test cases against the application in a black box test. So far, so good.
Seeing these guys at work, compared to my colleagues and me, makes a few things very apparent: even with a fairly solid grasp of algorithms and data structures, their number one problem is code. Where professional programmers swim through code like sharks in the sea, the students appear to be more or less drowning. Theoretical learning aside, their education lacks practical programming, debugging, practical programming and some more practical programming.
It would seem that these programming exercises would be the perfect opportunity to get that kind of experience, if it weren’t for the fact that the test system is itself a black box. You put in your code, and it tells you yes or no. It’s not quite a boolean pass/fail answer, but close enough: you get told one result from the set: Didn’t Compile, Passed, Failed, Crashed, Time Limit Exceeded. When I first heard of the system, it was justified by the claim that sometimes, in professional programming, that’s all you get.
I agree. Sometimes you get gnarly bugs that give you less information than a world-class pro’s poker face. I’ve sometimes spent weeks tracking bugs like that, using every tool at my disposal to wring more information out of the error, until finally the knot was untied. But for all the bugs like that I’ve been through, not one of them was eventually solved by guessing what was wrong and how to fix it.
Supposedly, the tool is meant to teach the students to debug their code… which it somehow does by disallowing all normal debugging tools. You can’t run a debugger on it, you can’t print traces, you’re not allowed to log to a file or socket, you’re not even allowed to know what input caused the error. The only tools at your disposal are your wits: coming up with your own test cases, and code review.
Any attempt at normal debugging would be classified as cheating. If I were faced with a bug under those circumstances, I would do whatever I could to get more information out of it. Hey, I can crash it with different signals: that’s a few bits of information I could get back from it. All those tricks of the trade that real programmers use to, you know, solve problems… would be cheating.
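To make the signal trick concrete (purely as an illustration of what the system would call cheating, and entirely my own sketch, not code from the course): a process killed by a signal reports that signal in its exit status, so *choosing* which signal to crash with smuggles information out of an otherwise silent black box.

```python
import signal
import subprocess
import sys

# Each distinct signal the judge can report is a distinct symbol you can
# transmit. Here a child process crashes with SIGSEGV or SIGFPE depending on
# a condition, and the parent decodes the bit from the exit status (on POSIX,
# a negative returncode is the negated signal number).
def leak_bit(condition):
    child = (
        "import os, signal, sys\n"
        "sig = signal.SIGSEGV if sys.argv[1] == '1' else signal.SIGFPE\n"
        "os.kill(os.getpid(), sig)\n"
    )
    proc = subprocess.run(
        [sys.executable, "-c", child, "1" if condition else "0"]
    )
    return proc.returncode == -signal.SIGSEGV
```

Two signals give you one bit per run; a judge that distinguishes more signals gives you correspondingly more. It’s a crude channel, but it is exactly the kind of resourcefulness the system forbids.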
This skews the results: very simple bugs turn into monster problems, because you can’t identify and fix them. What the students are learning is not how to debug their programs but how to painstakingly solve the very specific problem of pleasing the system. By artificially making easy things hard, the system has effectively found a way to avoid teaching the students essential programming skills: simple debugging techniques like tracing and breaking into a debugger. Instead, they learn programming by coincidence: poke something until you (hopefully, eventually) get a green light.
That’s not a lesson to learn.
The only way forward, faced with the obstacle this system presents, is to learn a different skill: testing. More on that later.