So what do they end up doing? I worked in industry for a while, and it was all project work. It's largely project work in academia as well, though sometimes the students suffer under tenured professors who were hired exclusively for their research skills, but that's another conversation. Having project work as part of the curriculum should absolutely be more widely adopted. Most engineering programs have it, as far as I know, and it seems weird that computer science wouldn't.

Edit: Actually, now that you mention it, I never had a project in undergrad either. The only fun I really had this semester was a project that I had to code for. Ran Fortran (77, lol) code and manipulated the outputs in IDL. Have to learn Python next semester. :)
I have no idea what they're doing in their classes. In industry, most software projects are technically trivial. If you have a computer science degree from a school that doesn't advertise on TV, you're probably overqualified for most programming jobs.
You should know what your library is doing, and if you can't implement an algorithm, you don't understand it. But yes, you're rarely required to implement basic data structures in applications. Knowing enough about computation to find the structure in the logorrhea your users give you, and to implement that rather than all the cases they think are special as they describe them, can save you a lot of trouble. In my experience, if your education was in software engineering, your solution to that problem is to pile layers of architecture on top of it, sweeping the unnecessary complexity under the rug rather than eliminating it. That way madness, misery, and lucrative consulting contracts lie.
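To make the "know what your library is doing" point concrete, here's a minimal sketch (in Python, purely my own illustration): hand-implementing the binary search that the stdlib's `bisect.bisect_left` does for you, and checking it against the real thing. If you can write this, you know what the library call costs and what it assumes.

```python
import bisect

def my_bisect_left(a, x):
    """Leftmost index where x can be inserted to keep a sorted. O(log n).
    Assumes a is already sorted -- the same assumption bisect makes."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2
        if a[mid] < x:
            lo = mid + 1  # x belongs strictly to the right of mid
        else:
            hi = mid      # x belongs at mid or to its left
    return lo

# Sanity check against the stdlib on a sorted list with duplicates.
data = [1, 3, 3, 5, 8, 13]
for x in (0, 3, 6, 13, 99):
    assert my_bisect_left(data, x) == bisect.bisect_left(data, x)
```

Ten lines, and now "I understand binary search" is a claim you can actually back up.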
That's sad, but believable. My previous employer had this one young guy who wrote some incredible code for a calibration system: totally custom, one-of-a-kind programs that did some very technical things. I had the pleasure of using the system. He wasn't around anymore because he had already left to get paid closer to what he was worth, and now he's a legend among many of my ex-coworkers. In general, I think if a programmer is talented and ambitious enough, they can rise through the ranks or seek other opportunities. But luck is a thing, too.

These overly simple programming jobs will someday face an elimination similar to the kind we're going to see with truckers, cab drivers, etc. when driverless cars are widely adopted in the near future. Machine-learning algorithms, and eventually true AI, will replace low-level programmers and (probably) quickly progress to skill sets beyond those of individual humans, and then even large teams of humans. So to some extent I do have a fear of AI, but I think that's healthy. Since no one's asking, I won't give an estimate for the dates of any of my predictions, but they're not as soon as /r/fyoochurology's.

P.S. This is not my domain, so feel free to school me; that's what Hubski's best for.

P.P.S. Is it paranoia (besides being obviously futile) to require by law that all AI development be conducted on networks entirely isolated from the internet?
One of the aphorisms you'll hear a lot if you look into machine learning beyond what the futurologists bleat is "learning requires bias." What it's saying is that you need to build into a learning algorithm an assumption about the type of function being learned and what constitutes having learned it; you can't just say "learn to recognize faces from this set of examples," you have to say "learn to recognize faces by finding a hyperplane that divides the space of n-dimensional vectors into images of faces and non-images-of-faces, maximizing the distance between the set of faces and non-faces in this set of examples." There is a reason for that, and it applies to all optimization problems.

We cannot now, nor will we ever, be able to write programs that can write any sort of program without them really being (possibly very clever) compilers or interpreters. You can find a lot of old papers talking about "automatic programming," but what they're really talking about is compilers; to people used to all programming being done in machine language, a compiler looked like it was getting a specification and writing a program in much the same way as programmers get specifications and write programs.

That is not to say that we can't use AI in programming language implementations. For example, PostgreSQL's query planner uses a genetic algorithm, essentially using AI to write an efficient program to satisfy a query. Code generators in compilers often use techniques from AI as well.

AI, or algorithms that started out in AI, are everywhere. Your search results, recommendations from Amazon and YouTube, the spam filters that keep your inbox from filling up with penis enlargement pumps and horny teenagers... You interact with AI all the time; you just don't notice, because AI the actually existing set of technologies is much more mundane than AI the science fiction futurologists like to breathlessly write about being just around the corner.
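To make "learning requires bias" concrete, here's a toy sketch in Python (my own illustration, not code from any system mentioned above): a perceptron. Notice how much bias is baked in before any "learning" happens: we commit to hyperplanes as the hypothesis class and to a specific update rule for picking one. This finds *a* separating hyperplane; a max-margin method like an SVM would add the maximize-the-distance criterion described above.

```python
def perceptron(points, labels, epochs=100):
    """points: list of (x, y) pairs; labels: +1 or -1 for each point.
    Returns hyperplane weights (w1, w2, b) such that the sign of
    w1*x + w2*y + b predicts the label. Only converges if the data
    is linearly separable -- that assumption IS the bias."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        updated = False
        for (x, y), label in zip(points, labels):
            if label * (w1 * x + w2 * y + b) <= 0:  # misclassified point
                w1 += label * x   # nudge the hyperplane toward
                w2 += label * y   # classifying this point correctly
                b += label
                updated = True
        if not updated:           # a full pass with no mistakes: done
            break
    return w1, w2, b

# Toy separable data: +1 above the line y = x, -1 below it.
pts = [(0, 1), (1, 2), (2, 3), (1, 0), (2, 1), (3, 2)]
lbls = [1, 1, 1, -1, -1, -1]
w1, w2, b = perceptron(pts, lbls)
preds = [1 if w1 * x + w2 * y + b > 0 else -1 for x, y in pts]
```

The point isn't that perceptrons are state of the art (they aren't); it's that even the simplest learner only works because we told it, up front, what kind of structure to look for.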
Yes, requiring AI to be isolated from the Internet would be overly paranoid, and you would lose a lot of features of the sites you use regularly.