You just described the ideal use case for LaTeX!
For really long documents it's especially good, because you can compile your chapters/sections/etc individually, if you set it up correctly. That way you only work with small documents most of the time, and don't kill your memory.
Of course, LaTeX is a markup language, so you'll have to learn a new language to use it, but if you can get over that learning curve it is so worth it.
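To sketch what that chapter-by-chapter setup looks like (the file names here are just placeholders): the main file \include{}s each chapter, and \includeonly{} tells LaTeX to recompile only the chapter you're working on while keeping page numbers and cross-references from the last full build.

```latex
% main.tex -- a minimal sketch; chapter file names are made up
\documentclass{book}

% Comment this line out for a full build. With it, only chapter2.tex
% is recompiled, but numbering/references for the others are preserved.
\includeonly{chapter2}

\begin{document}
\include{chapter1}
\include{chapter2}
\include{chapter3}
\end{document}
```

Each chapter file is then just plain content (no \documentclass or \begin{document} of its own), so each one stays small.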
While very few people in physics use Java for anything serious, it is still a good idea to learn the fundamental concepts behind Java and object-oriented programming. Learning different paradigms of programming will make programming more natural, and OOP is the most widely used paradigm today (though it's rather lacking in the sciences).
On the other hand, if you're already proficient in multiple languages, intro java classes might be completely below you, and you'll be extremely bored. You have to really want to learn the concepts, not just the execution, for it to be interesting.
It wasn't entirely an act of faith to say that GWs existed. There is a binary neutron star system, discovered decades ago, whose orbit has been decaying exactly as predicted for gravitational radiation. That was an indirect detection, though, and therefore far less direct than what we have now.
This is practical. We've never seen black holes collide, ever. This also gives us a powerful way to test general relativity in extreme gravity.
LIGO is limited to a certain frequency range, which I believe includes: stellar-mass black hole mergers (like this one), neutron star mergers, neutron star - black hole mergers, and (maybe) spinning neutron stars. There is also the Pulsar Timing Array, which has been running for some time and has its own frequency range. Finally there's the proposed eLISA observatory, which would be placed in orbit around the Sun and be able to measure the most massive black hole mergers, which we expect to happen some time after two galaxies merge. That's actually a key prediction of modern cosmology that we have yet to observe happening. Our best evidence is "yup, we don't see many BH pairs at the centers of galaxies, so they must've merged already".
However, don't expect to use GWs for most traditional astronomy tasks. They require two or more massive objects orbiting each other, or one asymmetric object spinning, so anything outside of that is off the table. For some events there is the possibility of observing both in GWs and with light, which would let us determine more about the system. Gamma-ray bursts are a good example, though they're so short it's hard to point a traditional telescope in time.
Doomsday predictions from a site called "world war 3 is coming .com"? Who woulda thought?
Seems this story has been spread earlier this year as well.
Edit: misread your post, guess we're both saying the same thing
It's an interpretation of the observation that, at the quantum scale (really really small), nature appears to be intrinsically uncertain. Even if we knew as much as possible about a system (there's actually a limit), we would only be able to say how probable each possible outcome is. We cannot predict one certain outcome.
This is in stark contrast to objects at the scale of our everyday lives. Sure, some things are just too complicated to keep track of (like grains of sand in a sandstorm). Let's instead isolate things to, say, two billiard balls. If we precisely measured the positions of the balls, their velocities, the friction of the table, etc, we could very accurately predict the outcome. If we reset the balls, and repeated the experiment, we would get almost the same results.
If we then replaced the billiard balls with atoms, things would be very different. Even if we reproduced the starting conditions exactly, the outcome would not be the same every time. This oddity has given rise to many contradictory interpretations, none of which can really be taken as fact.
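The contrast between the billiard balls and the atoms can be shown with a toy sketch (this is a cartoon, not real physics: the function names, the two-second evolution, and the 50/50 "spin" outcome are all made up for illustration):

```python
import random

def classical_outcome(position, velocity):
    # Deterministic: identical inputs always give the identical result,
    # like billiard balls with perfectly known starting conditions.
    return position + velocity * 2.0  # evolve for 2 seconds, no friction

def quantum_outcome():
    # Quantum-like: identical preparation every time, yet the outcome is
    # only predictable as a probability distribution (here, 50/50).
    return random.choice(["spin up", "spin down"])

# "Reset the billiard balls" 100 times: one and only one outcome.
print({classical_outcome(1.0, 3.0) for _ in range(100)})  # {7.0}

# "Reset the atom" 100 times: multiple outcomes show up.
print({quantum_outcome() for _ in range(100)})
```

The point of the sketch: in the classical case the spread of outcomes comes only from our ignorance of the inputs, while in the quantum case the spread remains even with the inputs fixed.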
Probably the most widely accepted interpretation is the Copenhagen interpretation, which says that things really are random; the Universe just works that way. The most comforting interpretation is what I think you are referring to, the hidden-variables interpretation, which basically says the Universe isn't random ("God does not play dice") and we're just missing something. Another interpretation, which is perhaps both exciting and terrifying, is the many-worlds interpretation. It says that things aren't random: every single possibility happens all at once, just in separate universes.
The wise thing for now is to take all of these with a grain of salt.
Would we have modern machinery, and space travel without Newton's laws?
Modern electronics without the discoveries of Faraday, Maxwell, and others?
Computers without quantum mechanics?
GPS satellites without general relativity?
Unforeseen world changing invention X without the discovery of the Higgs boson?
If you thought the answer to any of these was yes, then you probably took this article seriously. While he makes a valid point that technology will continue to grow in the absence of scientific input, it will be severely limited. Science and technology create a positive feedback loop with each other, and if one ceases to grow, the other will follow suit after some time.
Note that I said UNIX/Linux, not Linux. Mac OS X is certified UNIX (though I just checked and apparently only v10.5-v10.10 are actually certified, maybe it takes time to get certified?)
The UNIX architecture is pretty damn secure, though not foolproof.
Also, I've anecdotally heard about people installing Linux for their grandparents, and them liking it equally/more than Windows. I installed Xubuntu on my little sister's laptop which she did just fine with, and she's a "normal" computer user.
Notice how many of the problems you listed were specifically about Microsoft and Windows. If everybody used UNIX/Linux then those problems would mostly go away.