Students in high school and college (10th, 12th, BSc) who take programming courses end up doing programming projects as part of the course. These projects last anywhere from a fortnight to six months, depending on the course and the level of study. The goal of these projects is to give students a taste of real-world development. Invariably, students end up choosing something banal: another email client in Java, or a web server. While there is some benefit to writing those programs as educational exercises, I think that framed as real-world projects, they do more harm than good.
For one, most real-world programming is incremental. You have to add to existing code, or maintain existing code, and the opportunities to rewrite a system from scratch are rare. If there is a bug in the communication library, you cannot propose to rewrite the entire library: you are going to have to find the bug and fix it. You have to learn how to read existing code, perhaps in the absence of the original author to explain it to you. You're lucky if you find the code well documented. Writing yet another small program in the hope that it resembles a real project is a delusion. This explains why so many new programmers are bad at finding bugs: they've never had to do it before. If their system had a bug, they had the luxury of rewriting it. But teachers at that level have rarely held programming jobs themselves, so they don't know this. And students, having only read about blazing mavericks like Steve Wozniak and Linus Torvalds, are too young to know it. Good programming is acquired not just by writing good programs, but also by reading good ones: just like poetry or art.
Another problem is that students forget that programs exist to solve a problem. If a developer scratches an itch, it is because the itch is so damn annoying in the first place. Linus certainly felt so when he wrote his kernel. Most student projects end up being completely useless, even to their authors. This explains why students are so unmotivated, and why the final product is so unpalatable.
To be fair, all these issues are new, and I doubt much thought has been given to them. Most people are only beginning to become aware of the wealth of good code available thanks to Free and Open Source Software.
I think a marvelous project would be to take existing software and modify it. You could track down and eliminate a particular Apache bug, for instance. That would teach a lot more about the workings of a complex system. The code is all out there for the taking, so you don't have to ask anyone's permission. And when you fix a bug, nobody asks you how old you are, or which country you come from.
Or students could work on open source bounties: bite-sized projects that focus on a particular problem people need solved, and that put the work to good use. Bounties expose you to real-world programming, let you interact with real developers, and give you a flavor of product development. And you get paid to do it!
Aside from getting recognition for your work, you learn how important it is to collaborate with others, to see their point of view, and to defend your idea when you are right. And your resume shines when companies see that you patched a severe bug in Apache (when you were only 16!).
This would be much harder than writing that make-believe web server, for sure. The hardest part would be getting to know the existing codebase, understanding the domain, and learning how to solve a real problem. But once you have those down, the programming itself isn't much different from the make-believe examples. Still, the rewards are well worth the additional effort.
I did my fair share of make-believe projects when I was young, which is why I know all about them. If my teachers had had the insight to point me to Sourceforge or Berlios, I would have learnt a few important lessons earlier.