I just spent a fair bit of time on this question last week, and here is my new setup. (Previously I used a mixture of Mendeley and GoodNotes on an iPad Pro with Apple Pencil.)
I have now moved to using Zotero on the desktop as the master store. This works well with the Chrome and Firefox plugins to auto-add papers.
I use Feedly to monitor RSS feeds for each journal from my phone, then pull them up on the desktop every so often to add to Zotero. On the iPad Pro with Apple Pencil I use PaperShip to interface with the Zotero library. Its annotations are usually good enough, but if I really need to do something fancy, I'll export the PDF to PDF Expert, annotate it there, and move it back in.
I disagree. Yes, most of what each person says will not apply in any given case, but discussions like this, where everyone is just throwing ideas at the wall, can be useful for picking out larger trends. Plus, as people discuss the specifics of their workflows, individual threads can become more interesting.
It can also be interesting to hear people describe the more esoteric workflows, if only to put our own systems into context.
While there are some very real issues at Hanford, this week's issue has been vastly blown out of proportion in the media (I work fairly near the location). A massive amount of work has gone into detecting any kind of nuclear leak around here.
This seems to be universal with anything involving radioactivity.
The local media (i.e., Seattle) seems to have gotten the level of actual danger correct: it was a 30-second blurb on NPR with a statement indicating no radioactivity had been released.
The media outrage exists because the federal government has failed to clean up a project started ~30 years ago. Every time something bad happens (a tunnel collapse, workers speaking out when the federal government shafts them on health issues), it's like re-opening the wound.
I mean, how many times has Washington sued the federal government over this thing? Two? Three?
While this particular incident may not change much, the overall situation still seems like one worth drawing attention to at every opportunity, to encourage the federal government to stop putting off the cleanup.
I think it may depend a lot on the candidate. I personally feel like I make a pretty good salary, and for many jobs I would otherwise consider applying to, it isn't worth going through the interview process just to find out that the maximum salary they can offer would be a pay cut for me. Conversely, if a job looks interesting to me and advertises a range in the ballpark of what I want, I am MUCH more likely to apply.
It also conveys a lot about what you will be expected to do in the job. In my field, people are not going to pay the salary I want and then waste my time on menial work when they could get someone much cheaper to do those tasks. So by limiting my interest to higher-salary jobs, I can focus on the jobs that hold more interest for me personally.
If you Google "matplotlib viridis" there is a discussion of the issue. Using perceptually uniform colormaps (not to mention ones that translate to grayscale correctly) is normally the right thing to do, but not always. Sometimes you want to highlight particular conditions. For instance, jet with radar data tends to separate different atmospheric conditions nicely if set up right; the same plot with a perceptually uniform colormap can feel somehow lacking.
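To make the grayscale point concrete, here is a minimal sketch (assuming matplotlib and NumPy are installed) comparing the approximate luminance profile of viridis and jet. A colormap that translates to grayscale correctly should have luminance increasing steadily from one end to the other, with few or no reversals:

```python
# Minimal sketch: compare approximate grayscale luminance of two
# colormaps. A perceptually well-behaved colormap has luminance
# increasing smoothly end to end; jet famously does not.
import numpy as np
import matplotlib.pyplot as plt

def luminance(cmap_name, n=256):
    """Approximate perceived luminance of a colormap's RGB samples."""
    rgb = plt.get_cmap(cmap_name)(np.linspace(0, 1, n))[:, :3]
    return rgb @ np.array([0.299, 0.587, 0.114])  # Rec. 601 weights

for name in ("viridis", "jet"):
    lum = luminance(name)
    reversals = int(np.sum(np.diff(lum) < 0))
    print(f"{name}: luminance {lum[0]:.2f} -> {lum[-1]:.2f}, "
          f"{reversals} reversals")
```

Running this shows viridis climbing monotonically from dark to light, while jet's luminance rises and falls repeatedly, which is exactly why it prints so badly in grayscale.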
When you watch an English-language TV program in Sweden or Finland, it is often shown in English with subtitles in the local language. Southern European countries instead normally dub the program into the local language. It is as simple as that. Each system has its own advantages and drawbacks.
This is interesting, but not as groundbreaking as they make it out to be. People are already using GPS signals to estimate things like available liquid water. For instance, see http://www.suominet.ucar.edu/ .
One of the main problems with something like this is that you get an integrated quantity, with no easy way of backing out the exact elevation at which it occurred. With a dense enough network you may be able to work it back out, but you still don't really get a sense of the drop shape.
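For context on what "integrated quantity" means here: in GPS meteorology (the SuomiNet approach), the wet part of the signal delay integrates water vapor along the whole path, and the zenith wet delay (ZWD) converts to total precipitable water (PW) via a temperature-dependent factor. A minimal sketch using the commonly cited Bevis-style refractivity constants; the constants and example values are illustrative, not operational:

```python
# Rough sketch of the GPS-meteorology conversion: zenith wet delay
# (ZWD, meters) -> precipitable water (PW, meters). Constants follow
# the widely used Bevis et al. formulation (approximate values).
RHO_W = 1000.0     # density of liquid water, kg/m^3
R_V = 461.5        # specific gas constant of water vapor, J/(kg K)
K2_PRIME = 0.221   # K/Pa, refractivity constant (approximate)
K3 = 3.739e3       # K^2/Pa, refractivity constant (approximate)

def pw_from_zwd(zwd_m, tm_kelvin):
    """Convert zenith wet delay to precipitable water.

    tm_kelvin is the water-vapor-weighted mean atmospheric temperature.
    """
    pi_factor = 1e6 / (RHO_W * R_V * (K3 / tm_kelvin + K2_PRIME))
    return pi_factor * zwd_m

# A 10 cm wet delay at Tm = 270 K maps to roughly 15 mm of water.
print(round(pw_from_zwd(0.10, 270.0) * 1000, 1), "mm")
```

The key point is that PW is a single column total: the same 15 mm could come from vapor near the surface or spread through the whole troposphere, which is the "no easy way of backing out the elevation" problem.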
For comparison, the most advanced satellite weather system we have is the Global Precipitation Measurement (GPM) constellation. It uses a series of radiometers and radars to provide a much more detailed picture that also includes 3D data. See http://www.nasa.gov/mission_pages/GPM/main/ for more information.
If you stick to ground radar, you can get even more impressive results by using polarization. By transmitting dual-polarized signals, we can actually get a measure of the shape of the raindrops, based on the difference in power returned by the two polarizations. The US NEXRAD network recently upgraded its radars to be dual-polarization capable (finally). This also lets us differentiate between ice and rain (something we couldn't, strictly speaking, do before), and separate out signals caused by bugs (which can be a huge issue) as well as other sources of signal contamination.
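As a toy illustration of that shape measurement (not real radar code): differential reflectivity, ZDR, is the dB ratio of the power returned at horizontal versus vertical polarization. Large falling raindrops flatten into oblate shapes and return more horizontal power, while tumbling ice looks roughly spherical on average. The reflectivity values below are made up purely for illustration:

```python
# Toy sketch of differential reflectivity (ZDR): the dB ratio of
# horizontally to vertically polarized returned power. Oblate drops
# give ZDR well above 0 dB; near-spherical ice gives ZDR near 0 dB.
import math

def zdr_db(z_h, z_v):
    """Differential reflectivity in dB from linear-unit reflectivities."""
    return 10.0 * math.log10(z_h / z_v)

print(round(zdr_db(2000.0, 1000.0), 1))  # oblate raindrops: 3.0 dB
print(round(zdr_db(1000.0, 1000.0), 1))  # spherical targets: 0.0 dB
```

Real systems combine ZDR with other dual-pol variables (correlation coefficient, specific differential phase) to classify rain, hail, and biological targets, but the power-ratio idea above is the core of it.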
Maybe not the most sophisticated of pages, but sometimes it is good to get a brief reminder of what might be asked, especially for those of us who are not yet considered senior.