False beliefs and wishful thinking about the human experience are common. They are hurting people — and holding back science.
Just helped my girlfriend through a semester of non-calculus E&M. She said, "I've noticed that the average power in an AC circuit is 1/2 of what it is for a DC circuit when you have the same voltage and current. Why?" So I told her, "Because the average value of sin^2(x) over a whole number of cycles is 1/2. Now, if you want to know why that's true... well, do you have 5 spare hours before your final in the morning?" So non-calculus physics just revolves around memorizing formulas, which is no way to truly comprehend the concepts of physics. In fact, a disdain for rote memorization is what attracted me to physics in the first place.
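For anyone curious, here's a quick numerical check of that claim: a NumPy sketch of my own, with made-up peak values V0 and I0 for an in-phase voltage and current.

```python
import numpy as np

# Sample one full cycle of sin^2; its mean should come out to ~1/2.
t = np.linspace(0.0, 2.0 * np.pi, 100_000)
print(np.mean(np.sin(t) ** 2))           # ~0.5

# Average AC power for an in-phase voltage and current with peak values
# V0 and I0 (made-up numbers, purely for illustration).
V0, I0 = 10.0, 2.0
p = (V0 * np.sin(t)) * (I0 * np.sin(t))  # instantaneous power
print(np.mean(p), 0.5 * V0 * I0)         # both ~10.0, i.e. half the "DC" V0*I0
```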
I think about that ranking on average every 3 to 4 months, and usually agree. Edit: ...that it's written by someone who likes math and physics like myself. Oh and also, my sister has a degree in sociology, and she's accomplished some things that put me to shame.
I took physics for the first time my junior year of high school, at one of the best physics programs in the entire world for that level, according to past test scores. The first thing the teacher did was ask us if we knew calculus. Literally, he read us a bit of the syllabus and then began teaching integrals and derivatives. Junior year was also the first year we were allowed to take calc, so it was concurrent, but he knew it would take the math teachers a month to introduce us to our first derivative, and he refused to teach physics without derivatives. He was and is the best hard science teacher I or anyone I know has ever had. So I learned calc from graphs of distance, velocity and acceleration.
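Purely as an illustration of that approach (not anything from the class, and the free-fall numbers are made up), here's how velocity and acceleration fall out as numerical slopes of a position curve in a quick NumPy sketch:

```python
import numpy as np

# Made-up free-fall example: position x(t) = 0.5 * g * t^2.
g = 9.8
t = np.linspace(0.0, 3.0, 301)
x = 0.5 * g * t ** 2

# Velocity is the slope of the position graph; acceleration is the slope
# of the velocity graph. numpy.gradient takes those slopes numerically.
v = np.gradient(x, t)   # approximately g * t
a = np.gradient(v, t)   # approximately g, a constant

print(v[150], g * t[150])   # roughly equal
print(a[150])               # roughly 9.8
```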
> The best predictor for success in physics is still how much math knowledge a student has.

Wholeheartedly agree with that one. But I can't decide if it's time to evolve the teaching style a bit. By and large, the current format still consists of timed testing at a desk with only a pencil and paper. I've gotten serious test anxiety from this method, and it corresponds to few (if any) work scenarios after receiving a diploma. On the other hand, knowing that I have to do physics with a gun to my head sure does motivate me to hit the books.

I have been seeing more open-book and take-home tests as I've progressed through my education, though. But the level of the curriculum has to be sufficiently advanced such that the answers aren't google-able. Had one prof this semester give an exam that was impossible to finish more than 60% of even if you knew the curriculum backwards and forwards. Yeah, the grade distribution will be re-scaled, but I've already been struggling with feelings of inadequacy. I'm actually under the impression there is a national culture of inducing existential crises within grad students.
I used to think along the same lines about computing and starting at the lowest level, but then I started meeting people coming out of computer science programs who had never designed a CPU, programmed in assembly language, written a compiler or basic operating system or even programmed in C and implemented basic data structures themselves, and it's like everything below the level of what their programming language of choice and its runtime expose is magical to them. Now I think it's better to make students get their hands dirty, even if most of them will never get to do any of that after they graduate.
> I started meeting people coming out of computer science programs who had never designed a CPU, programmed in assembly language, written a compiler or basic operating system or even programmed in C

Why do you expect students who are essentially going through a math program to be able to do that?

> implemented basic data structures themselves, and it's like everything below the level of what their programming language of choice and its runtime expose is magical to them

Now that is reason for concern.
> Why do you expect students who are essentially going through a math program to be able to do that?

Yeah, I've read Dijkstra too. He was brilliant and his aloofness was admirable, but he still wrote programs. The great thing about computing is that the machine is the best teacher anyone could ask for. Sure, everything we do is applied logic and combinatorics, but we can render our objects of study concrete and poke at them; we can see where our understanding is flawed because the machine throws it in our faces. That is what is distinct about computing: we get to work halfway between the ideal and the real. If you don't want to do that then you're better off studying logic in the philosophy department or combinatorics in the math department.
> If you don't want to do that then you're better off studying logic in the philosophy department or combinatorics in the math department.

Well, that's what I'm doing. It's pretty fun. Anyway, I think what you're doing is claiming that being "close to the metal" is somehow purer. This isn't really the case. I've done plenty of work in assembler, so I know that it's messy, prone to human error, and boring as hell. The only reason anyone should ever program in assembly rather than C is that they are making an application that needs to be extremely high-performance, or that a decent C compiler isn't available for their platform.

Designing a CPU is only tangentially related to programming; it's more of an engineering problem than a programming one. If you mean understanding a CPU, that's probably worthwhile, but actually making one isn't something I'd ever expect a CS or SE grad to know how to do. I will agree with you that CS grads should know the realities of how their interpreters work, how to write C/C++, and how to make data structures. But CS is a very wide field, even if you take the view that it's just math, and you can't expect new graduates to have experience in the lowest levels of computers.

It's like telling me I must be a shitty math student because I don't derive everything from first principles, or telling a bio grad they must have been in a terrible program because they don't know how to synthesize proteins. There's so much more than the lowest levels, and all of it is interesting and all of it is worthwhile, and there's only so much a single person can really be good at.
Designing a basic CPU was a week-long project at the end of my first undergraduate logic design class. Logic design is computer science as much as it is engineering; below that, when you start to talk about the physical implementation of your gates, is another matter. Logic gates are a model of computation, just like Turing machines, the lambda calculus and your favorite programming language. At that level, designing a CPU is just a funny way of writing an interpreter. You're imagining it to be harder than it is because you've never gotten down that far, but it's turtles all the way down (and all the way up).

Only paying attention to the high level is a mistake; you miss out on the unity, and the low level looks magical to you. Only paying attention to the low-level stuff is also a mistake; you miss out on the high level, and the more abstract models look magical to you. Learn category theory and logic design, artificial intelligence and operating systems, the theory of computation and computer architecture. And actually write programs, because you can fake it to your professors but you can't fake it to your computer, and because you're missing out on the fun part if you don't.
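To make the "logic gates are a model of computation" point concrete, here's a toy Python sketch of my own (not anything from that class): gates built from NAND alone, wired into a 1-bit full adder and then a small ripple-carry adder, which gets you most of the way to the arithmetic side of that week-long CPU project.

```python
# Everything below is built from NAND alone, ending in the ripple-carry adder
# at the heart of a toy ALU.

def nand(a: int, b: int) -> int:
    return 0 if (a and b) else 1

def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor(a, b):  return and_(or_(a, b), nand(a, b))

def full_adder(a, b, carry_in):
    """Add three bits; return (sum_bit, carry_out)."""
    s1 = xor(a, b)
    return xor(s1, carry_in), or_(and_(a, b), and_(s1, carry_in))

def add_4bit(a_bits, b_bits):
    """Ripple-carry adder: chain full adders, least-significant bit first."""
    carry, out = 0, []
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    return out, carry

# 5 + 3 = 8, with bits written least-significant first.
print(add_4bit([1, 0, 1, 0], [1, 1, 0, 0]))   # ([0, 0, 0, 1], 0)
```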
> even if most of them will never get to do any of that after they graduate.

So what do they end up doing? I worked in industry for a while, and it was all project work. It's largely project work in academia as well, and sometimes the students suffer from tenured professors who were hired for their research skills exclusively, but that's another conversation. Having project work as part of the curriculum should absolutely be more widely adopted. Most engineering programs have them, as far as I know, and it seems weird that computer science wouldn't.

Edit: Actually, now that you mention it, I never had a project in undergrad either. The only fun I really had this semester was a project that I had to code for. Ran Fortran (77, lol) code and manipulated the outputs in IDL. Have to learn Python next semester. :)
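(For what it's worth, the Python replacement for that IDL post-processing step tends to look something like the sketch below. The file name "output.dat" and the columns are hypothetical, just to show the shape of it.)

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical output file and columns: suppose the Fortran code dumps
# whitespace-separated columns of time and temperature to "output.dat".
t, temp = np.loadtxt("output.dat", unpack=True)

plt.plot(t, temp)
plt.xlabel("time")
plt.ylabel("temperature")
plt.show()
```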
I have no idea what they're doing in their classes. In industry, most software projects are technically trivial. If you have a computer science degree from a school that doesn't advertise on TV you're probably overqualified for most programming jobs.
You should know what your library is doing, and if you can't implement an algorithm you don't understand it. But yes, rarely are you required to implement basic data structures in applications. Knowing enough about computation to find the structure in the logorrhea your users give you, and to implement that rather than all the cases they think are special as they describe them, can save you a lot of trouble. In my experience, if your education was as a software engineer, your solution to that problem is to pile layers of architecture on top of it to sweep the unnecessary complexity under the rug rather than eliminate it. That way madness, misery and lucrative consulting contracts lie.
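For the curious, here's roughly what "implement it yourself" looks like for one of those basic data structures: a minimal sketch of a hash table with separate chaining. This is illustrative only; the class name and details are mine, and a real library version does much more (resizing, deletion, and so on).

```python
# A toy hash table with separate chaining: a sketch of the kind of thing
# your language's dict/map is doing under the hood, ignoring resizing,
# deletion and all the real-world details.

class HashTable:
    def __init__(self, n_buckets=16):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # key already present: overwrite
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # new key: append to the chain

    def get(self, key):
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)

table = HashTable()
table.put("answer", 42)
print(table.get("answer"))   # 42
```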
That's sad, but believable. My previous employer had this one young guy who wrote some incredible code for a calibration system. Totally custom, one-of-a-kind programs that did some very technical things. I had the pleasure of using the system. He wasn't around anymore because he had already left to go get paid more of what he was worth, and now he's a legend amongst many of my ex-coworkers. In general, I think if a programmer is talented and ambitious enough, they can rise through the ranks or seek other opportunities. But luck is a thing, too.

These overly simple programming jobs will someday face an elimination similar to what we are going to see with truckers, cab drivers, etc. when driverless cars are widely adopted in the near future. Machine-learning algorithms, and eventually true AI, will replace low-level programmers and (probably) quickly progress to skillsets beyond that of humans, and then even large teams of humans. So to some extent I do have a fear of AI, but I think that's healthy. Since no one's asking, I won't give an estimate for the dates of any of my predictions, but they're not as soon as /r/fyoochurology's.

P.S. this is not my domain so feel free to school me, that's what Hubski's best for.

P.P.S. is it paranoia (besides being obviously futile) to require by law that all AI developments are conducted on networks entirely isolated from the internet?
> P.P.S. is it paranoia (besides being obviously futile) to require by law that all AI developments are conducted on networks entirely isolated from the internet?

One of the aphorisms you'll hear a lot if you look into machine learning beyond what the futurologists bleat is "learning requires bias." What it's saying is that you need to build into a learning algorithm an assumption about the type of function being learned and what constitutes having learned it; you can't just say "learn to recognize faces from this set of examples", you have to say "learn to recognize faces by finding a hyperplane that divides the space of n-dimensional vectors into images of faces and non-images-of-faces by maximizing the distance between the set of faces and non-faces in this set of examples." There is a reason for that, and it applies to all optimization problems.

We cannot now, nor will we ever, be able to write programs that can write any sort of program without them really being (possibly very clever) compilers or interpreters. You can find a lot of old papers talking about "automatic programming," but what they're really talking about is compilers; to people used to all programming being done in machine language, a compiler looked like it was getting a specification and writing a program in much the same way as programmers get specifications and write programs.

That is not to say that we can't use AI in programming language implementations. For example, PostgreSQL's query planner uses a genetic algorithm, essentially using AI to write an efficient program to satisfy a query. Code generators in compilers often use techniques from AI as well. AI, or algorithms that started out in AI, are everywhere. Your search results, recommendations from Amazon and YouTube, the spam filters that keep your inbox from filling up with penis enlargement pumps and horny teenagers... You interact with AI all the time; you just don't notice, because AI the actually existing set of technologies is much more mundane than the AI that science-fiction futurologists like to breathlessly write about being just around the corner.

Yes, requiring AI to be isolated from the Internet would be overly paranoid, and you would lose a lot of the features of the sites you use regularly.
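To make "learning requires bias" concrete, here's a small scikit-learn sketch of roughly the face/non-face setup described above (synthetic 2-D data, names made up for illustration): the learner is handed its hypothesis class, a maximum-margin separating hyperplane, before it ever sees an example.

```python
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Two synthetic classes of 2-D points, standing in for "faces" and "non-faces".
faces     = rng.normal(loc=[ 2.0,  2.0], scale=0.5, size=(50, 2))
non_faces = rng.normal(loc=[-2.0, -2.0], scale=0.5, size=(50, 2))

X = np.vstack([faces, non_faces])
y = np.array([1] * 50 + [0] * 50)

# The bias lives here: LinearSVC will only ever learn a separating hyperplane,
# chosen to maximize the margin between the two classes in the training set.
clf = LinearSVC(C=1.0).fit(X, y)

print(clf.coef_, clf.intercept_)                 # the learned hyperplane
print(clf.predict([[1.5, 2.5], [-1.0, -3.0]]))   # [1 0]
```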
It's been more than a week (sorry), but I'm gonna deem this post the #science #tagoftheweek winner, based on it being the fully-circle-dotted post with the most comments. Mindwolf, I think that means you get to pick the next #tagoftheweek. [edit to try to reinstate the shoutout]
I think the most interesting part of the article is the end:

> Psychological studies suggest that the very act of attempting to dispel a myth leads to stronger attachment to it. In one experiment, exposure to pro-vaccination messages reduced parents' intention to vaccinate their children in the United States. In another, correcting misleading claims from politicians increased false beliefs among those who already held them. “Myths are almost impossible to eradicate,” says Kirschner. “The more you disprove it, often the more hard core it becomes.”