user-inactivated  ·  3256 days ago  ·  link  ·    ·  parent  ·  post: The science myths that will not die.

I used to think along the same lines about computing and starting at the lowest level, but then I started meeting people coming out of computer science programs who had never designed a CPU, programmed in assembly language, written a compiler or basic operating system or even programmed in C and implemented basic data structures themselves, and it's like everything below the level of what their programming language of choice and its runtime expose is magical to them. Now I think it's better to make students get their hands dirty, even if most of them will never get to do any of that after they graduate.





dingus  ·  3253 days ago  ·  link  ·  

    I started meeting people coming out of computer science programs who had never designed a CPU, programmed in assembly language, written a compiler or basic operating system or even programmed in C

Why do you expect students who are essentially going through a math program to be able to do that?

    implemented basic data structures themselves, and it's like everything below the level of what their programming language of choice and its runtime expose is magical to them

Now that is reason for concern.

user-inactivated  ·  3253 days ago  ·  link  ·  

    Why do you expect students who are essentially going through a math program to be able to do that?

Yeah, I've read Dijkstra too. He was brilliant and his aloofness was admirable but he still wrote programs. The great thing about computing is that the machine is the best teacher anyone could ask for. Sure, everything we do is applied logic and combinatorics, but we can render our objects of study concrete and poke at them, we can see where our understanding is flawed because the machine throws it in our faces. That is what is distinct about computing, that we get to work halfway between the ideal and the real. If you don't want to do that then you're better off studying logic in the philosophy department or combinatorics in the math department.

dingus  ·  3252 days ago  ·  link  ·  

    If you don't want to do that then you're better off studying logic in the philosophy department or combinatorics in the math department.

Well, that's what I'm doing. It's pretty fun.

Anyway, I think what you're doing is claiming that being "close to the metal" is somehow purer. This isn't really the case. I've done plenty of work in assembler, so I know that it's messy, prone to human error, and boring as hell. The only reason anyone should ever program in assembly rather than C is that they are making an application that needs to be extremely high-performance, or that a decent C compiler isn't available for their platform. Designing a CPU is only tangentially related to programming; it's more of an engineering problem than a programming one. If you mean understanding a CPU, that's probably worthwhile, but actually making one isn't something I'd ever expect a CS or SE grad to know how to do.

I will agree with you that CS grads should know the realities of how their interpreters work, how to write C/C++, and how to make data structures. But CS is a very wide field, even if you take the view that it's just math, and you can't expect new graduates to have experience in the lowest levels of computers. It's like telling me I must be a shitty math student because I don't derive everything from first principles, or telling a bio grad they must have been in a terrible program because they don't know how to synthesize proteins. There's so much more than the lowest levels, and all of it is interesting and all of it is worthwhile, and there's only so much a single person can really be good at.

user-inactivated  ·  3251 days ago  ·  link  ·  

Designing a basic CPU was a week-long project at the end of my first undergraduate logic design class. Logic design is computer science as much as it is engineering; below that, when you start to talk about the physical implementation of your gates, is another matter. Logic gates are a model of computation, just like Turing machines, the lambda calculus and your favorite programming language. At that level designing a CPU is just a funny way of writing an interpreter. You're imagining it to be harder than it is because you've never gotten down that far, but it's turtles all the way down (and all the way up). Only paying attention to the high level is a mistake; you miss out on the unity, and the low level looks magical to you. Only paying attention to the low-level stuff is also a mistake; you miss out on the high level, and more abstract models look magical to you. Learn category theory and logic design, artificial intelligence and operating systems, the theory of computation and computer architecture. And actually write programs, because you can fake it to your professors but you can't fake it to your computer, and because you're missing out on the fun part if you don't.
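To make the "logic gates are a model of computation" point concrete, here's a sketch (mine, not the commenter's, with Python standing in for a hardware description language): everything is built from a single NAND primitive, and a ripple-carry adder falls out of composing gates, the same way a first logic-design course builds up to a simple datapath.

```python
# Everything below is derived from one NAND primitive.

def NAND(a, b):
    return 0 if (a and b) else 1

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

def half_adder(a, b):
    """Returns (sum, carry) for two 1-bit inputs."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, cin):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, cin)
    return s2, OR(c1, c2)

def add4(x, y):
    """Ripple-carry addition of two 4-bit numbers (result is modulo 16)."""
    carry, out = 0, 0
    for i in range(4):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        out |= s << i
    return out

print(add4(6, 7))  # 13
```

The Python functions are, in effect, an interpreter for a tiny circuit description, which is exactly the "designing a CPU is a funny way of writing an interpreter" observation.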

am_Unition  ·  3256 days ago  ·  link  ·  

    even if most of them will never get to do any of that after they graduate.

So what do they end up doing? I worked in industry for a while, and it was all project work. It's largely project work in academia as well, and sometimes the students suffer from tenured professors who were hired exclusively for their research skills, but that's another conversation. Having project work as part of the curriculum should absolutely be more widely adopted. Most engineering programs have them, as far as I know, and it seems weird that computer science wouldn't. Edit: Actually, now that you mention it, I never had a project in undergrad either.

The only fun I really had this semester was a project that I had to code for. Ran Fortran (77, lol) code and manipulated the outputs in IDL. Have to learn Python next semester. :)

user-inactivated  ·  3256 days ago  ·  link  ·  

I have no idea what they're doing in their classes. In industry, most software projects are technically trivial. If you have a computer science degree from a school that doesn't advertise on TV you're probably overqualified for most programming jobs.

whanhee  ·  3253 days ago  ·  link  ·  
This comment has been deleted.
user-inactivated  ·  3253 days ago  ·  link  ·  

You should know what your library is doing, and if you can't implement an algorithm you don't understand it. But yes, rarely are you required to implement basic data structures in applications. Knowing enough about computation to find the structure in the logorrhea your users give you, and implement that rather than all the cases they think are special as they describe them, can save you a lot of trouble. In my experience, if your education was as a software engineer your solution to that problem is to pile layers of architecture on top of it to sweep the unnecessary complexity under the rug rather than eliminate it. That way madness, misery and lucrative consulting contracts lie.

whanhee  ·  3251 days ago  ·  link  ·  
This comment has been deleted.
am_Unition  ·  3255 days ago  ·  link  ·  

That's sad, but believable.

My previous employer had this one young guy who wrote some incredible code for a calibration system. Totally custom, one-of-a-kind programs that did some very technical things. I had the pleasure of using the system. He wasn't around anymore because he had already left to get paid something closer to what he was worth, and now he's a legend amongst many of my ex-coworkers. In general, I think if a programmer is talented and ambitious enough, they can rise through the ranks, or seek other opportunities. But luck is a thing, too.

These overly simple programming jobs will someday face elimination similar to what truckers, cab drivers, etc. will see when driverless cars are widely adopted in the near future. Machine-learning algorithms, and eventually true AI, will replace low-level programmers and (probably) quickly progress to skill sets beyond those of individual humans, and then even large teams of humans.

So to some extent I do have a fear of AI, but I think that's healthy. Since no one's asking, I won't give an estimate for the dates of any of my predictions, but they're not as soon as /r/fyoochurology's.

P.S. this is not my domain so feel free to school me, that's what Hubski's best for.

P.P.S. is it paranoia (besides being obviously futile) to require by law that all AI developments are conducted on networks entirely isolated from the internet?

user-inactivated  ·  3255 days ago  ·  link  ·  

One of the aphorisms you'll hear a lot if you look into machine learning beyond what the futurologists bleat is "learning requires bias." What it's saying is that you need to build into a learning algorithm an assumption about the type of function being learned and what constitutes having learned it; you can't just say "learn to recognize faces from this set of examples", you have to say "learn to recognize faces by finding a hyperplane that divides the space of n-dimensional vectors into images of faces and non-images-of-faces, maximizing the distance between the sets of faces and non-faces in this set of examples." There is a reason for that, and it applies to all optimization problems. We cannot now, nor will we ever, be able to write programs that can write any sort of program without them really being (possibly very clever) compilers or interpreters. You can find a lot of old papers talking about "automatic programming," but what they're really talking about is compilers; to people used to all programming being done in machine language, a compiler looked like it was getting a specification and writing a program in much the same way as programmers get specifications and write programs.
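A tiny illustration of "learning requires bias" (my sketch, with made-up toy data): a perceptron can only ever learn a linear separator, and that restriction IS its inductive bias. Hand it data a line can split, and it learns; hand it anything else, and no amount of examples will help.

```python
# Perceptron: the hypothesis space is restricted to linear separators.
# That built-in assumption is the "bias" that makes learning possible.

def train_perceptron(data, epochs=20):
    """data: list of ((x1, x2), label) with label in {-1, +1}."""
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            if y * (w1 * x1 + w2 * x2 + b) <= 0:  # misclassified point
                w1 += y * x1                      # nudge the separator
                w2 += y * x2
                b  += y
    return w1, w2, b

def predict(model, point):
    w1, w2, b = model
    x1, x2 = point
    return 1 if w1 * x1 + w2 * x2 + b > 0 else -1

# Linearly separable toy set: label is +1 iff x1 + x2 > 3 (made up).
data = [((1, 1), -1), ((2, 0), -1), ((0, 2), -1),
        ((3, 2), 1), ((2, 3), 1), ((4, 1), 1)]
model = train_perceptron(data)
print(all(predict(model, p) == y for p, y in data))  # True
```

The SVM the comment alludes to adds a further bias on top of this one: among all separating hyperplanes, prefer the one maximizing the margin.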

That is not to say that we can't use AI in programming language implementations. For example, Postgresql's query planner uses a genetic algorithm, essentially using AI to write an efficient program to satisfy a query. Code generators in compilers often use techniques from AI as well.
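To give a flavor of the genetic-algorithm idea (this is a hypothetical toy of mine, not PostgreSQL's actual GEQO code, and the table sizes and selectivity are made up): evolve a population of join orders, keep the cheap ones, and mutate survivors by swapping tables.

```python
# Toy genetic algorithm searching for a cheap join order.
# Cost model: join tables left to right, pay the size of every
# intermediate result along the way.

import random
random.seed(0)  # deterministic for the example

SIZES = {"a": 1000, "b": 10, "c": 500, "d": 5}  # made-up row counts
SELECTIVITY = 0.01                              # made-up join selectivity

def cost(order):
    rows, total = SIZES[order[0]], 0.0
    for t in order[1:]:
        rows = rows * SIZES[t] * SELECTIVITY  # estimated join output size
        total += rows
    return total

def mutate(order):
    i, j = random.sample(range(len(order)), 2)  # swap two tables
    child = list(order)
    child[i], child[j] = child[j], child[i]
    return child

def evolve(tables, pop_size=24, generations=60):
    pop = [random.sample(tables, len(tables)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]  # selection: keep the cheap half
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=cost)

best = evolve(list(SIZES))
print(best, cost(best))
```

With four tables you could just try all 24 orders, but join-order search grows factorially, which is exactly why GEQO falls back to this kind of stochastic search for queries with many relations.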

    P.P.S. is it paranoia (besides being obviously futile) to require by law that all AI developments are conducted on networks entirely isolated from the internet?

AI, or algorithms that started out in AI, are everywhere. Your search results, recommendations from Amazon and Youtube, the spam filters that keep your inbox from filling up with penis enlargement pumps and horny teenagers... You interact with AI all the time; you just don't notice because AI the actually existing set of technologies is much more mundane than AI the science fiction futurologists like to breathlessly write about being just around the corner. Yes, requiring that AI be isolated from the Internet would be overly paranoid, and you would lose a lot of features of the sites you use regularly.