This is similar to what I do.
I use a Mac only because I'm expected to do things with Office (OpenOffice/LibreOffice just doesn't cut it), and Outlook is great. I do all of my development/research on a Linux VM or a Linux cluster, though; I only have OS X because of Office.
I would drop OS X in a heartbeat if I could have Office on Linux, but it would have to include a flawless Outlook, too.
These seem to be cynical job descriptions...
If I play along with the cynicism: Researching methods for visualizing and analyzing large-scale scientific data that will never get used because "real scientists" don't trust computer scientists (computer science is easy) and will just write their own shitty, serial, ad-hoc code, anyways.
If I go with my "honest" job description, just truncate it: Researching methods for visualizing and analyzing large-scale scientific data.
I don't know if the author is claiming to have invented something new, but springs for graph/label placement have been used for a while in the infovis community. There are a lot of other graph layout algorithms if you're interested.
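For the curious, the basic spring idea is simple to sketch: treat edges as springs that pull connected nodes together, and add pairwise repulsion so unconnected nodes spread apart. Here's a toy Fruchterman-Reingold-style layout in plain Python; the function name, constants, and structure are mine, purely illustrative, not from any particular paper or library:

```python
import math
import random

def spring_layout(nodes, edges, iterations=100, k=1.0, step=0.05):
    """Toy force-directed (spring) layout in 2D.

    nodes: list of hashable node ids
    edges: list of (u, v) pairs
    Returns a dict mapping each node to an [x, y] position.
    """
    pos = {v: [random.random(), random.random()] for v in nodes}
    for _ in range(iterations):
        disp = {v: [0.0, 0.0] for v in nodes}
        # Repulsion between every pair of nodes (keeps the graph spread out)
        for i, u in enumerate(nodes):
            for v in nodes[i + 1:]:
                dx = pos[u][0] - pos[v][0]
                dy = pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d
                disp[u][0] += f * dx / d; disp[u][1] += f * dy / d
                disp[v][0] -= f * dx / d; disp[v][1] -= f * dy / d
        # Spring attraction along edges (pulls neighbors together)
        for u, v in edges:
            dx = pos[u][0] - pos[v][0]
            dy = pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k
            disp[u][0] -= f * dx / d; disp[u][1] -= f * dy / d
            disp[v][0] += f * dx / d; disp[v][1] += f * dy / d
        # Move each node a small step along its net force
        for v in nodes:
            pos[v][0] += step * disp[v][0]
            pos[v][1] += step * disp[v][1]
    return pos

layout = spring_layout(["a", "b", "c", "d"],
                       [("a", "b"), ("b", "c"), ("c", "d"), ("d", "a")])
```

Real implementations (e.g. NetworkX's `spring_layout`) add cooling schedules and better termination, but the spring-plus-repulsion core is the same.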
Same here, especially with something like git.
No one has to see my changes until I push them, and if I really cared that I make too many local commits (I don't), I could clean them up with git rebase.
My favorite is the "standard" VESA font on most PCs when they boot and the linux terminal before you go to runlevel 6. I'm sure there are slight differences between graphics cards, but they all look similar to me.
I've ripped it before and converted it to a TTF for use in gnome-terminal. Right now I'm just using one of the standard monospace fonts available in X.
There are several examples I know of where one-off prototype code written by physicists turned into production code. It was used for years but would take hours to run.
We look at it and fix it for them, which usually takes less than a week. The codes then run in minutes instead of hours. That's a lot of wasted scientist time.
Though, I get that it's hard to get software engineers or computer scientists onto the grants in the first place, because no one wants to pay for them, even when there's a large computational portion to the research.
At any rate, there's a lot of crappy code written in all of the sciences. It's not just bioinformatics.
My personal feeling is there is hubris across many of the sciences that writing good code is "easy." I've offended many physicists when I have suggested that they might want a computer scientist or software engineer help write their code.
I am a computer scientist. While many people wouldn't consider it a "science" (I doubt it myself most of the time; theoretical CS, IMNSHO, is just applied math), I help with a lot of fundamental basic (non-CS) science research.
My specialization is high-performance computing (supercomputers) and scientific visualization and analysis. This translates to helping scientists run simulations, doing large-scale data wrangling, and helping to analyze scientific and simulation data. I'm more on the application side than the hardware side, so I act as a facilitator between the domain scientists and the guys building the supercomputers, because I speak both "languages," or at least I try to.
I do like it, and as I mentioned before, the only parts that aren't fun are writing papers (but that needs to be done so others can see your research) and getting funding. I'm not completely on "soft money," so I do have some stable funding I can rely on. Also, I'm currently overfunded, so I am doing just fine in terms of paying myself.
I tend to agree with most other scientists that there isn't enough funding. There are scientists at my lab that are unfunded and go onto overhead.
Though, I'm not sure the solution is to guarantee funding for everybody. There's an incentive to do well on each soft-money research project to show that you are competent and can successfully land the next research grant. So I speculate that with guaranteed funding we might not be as productive, but that's probably wrong.
More likely, I would be just as productive with steady funding, and it would benefit society as a whole to guarantee funding for research rather than making everyone constantly sing for their dinner. So much research time is spent just looking for the next dollar. Most people like being useful and productive even when funding is guaranteed, and there would likely be more scientists if research funding were steady.
We use a lot of software rendering (Mesa OpenGL or the Manta ray tracer). For our purposes (large models with far more primitives than screen pixels), ray tracing is faster for us because its cost scales with O(pixels) rather than O(primitives). Plus, a multicore CPU handles large models better than a GPU.
Of course, there are lots of caveats and different use cases (gaming is totally different from our needs). For example, we still use a lot of Mesa OpenGL anyways, just because it's easier to use most of the time than a raytracer.
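To make the back-of-the-envelope comparison concrete, here's a toy cost model. All the numbers and the log factor are illustrative assumptions (a BVH-style acceleration structure giving roughly log-cost per ray), not measurements from our setup:

```python
import math

def raster_cost(primitives, pixels):
    # Rasterization has to touch every primitive at least once,
    # so the dominant term scales with primitive count.
    return primitives

def raytrace_cost(primitives, pixels):
    # With an acceleration structure (e.g. a BVH), each ray costs
    # roughly log2(primitives), and we shoot one ray per pixel.
    return pixels * math.log2(primitives)

pixels = 2_000_000              # roughly a 1080p framebuffer
small, huge = 10**6, 10**10     # hypothetical model sizes

# For a modest model, rasterization is cheaper; once the model has
# far more primitives than the screen has pixels, ray tracing wins.
assert raster_cost(small, pixels) < raytrace_cost(small, pixels)
assert raster_cost(huge, pixels) > raytrace_cost(huge, pixels)
```

The crossover point in practice depends heavily on memory bandwidth, the acceleration structure, and how coherent the rays are, which is part of why gaming workloads land on such a different answer.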