C. Titus Brown wrote an informative post on using his digital normalization tool (khmer) with the Trinity transcriptome assembler. It compares khmer with diginorm-replica code written by the Trinity developers and finds that Trinity's code does 'better' filtering. How?
I now understand why the Trinity algorithm discards so much more data than digital normalization: it uses a pretty hard-core heuristic guess about what relative k-mer abundances within a read should look like, and discards reads that look bad. We are already doing this with diginorm implicitly by using the median, but this is way more stringent. I’m still not sure how much this added stringency will matter for things like sensitivity to splice junctions. That, however, is something I’ll leave for future inquiry… because I’m done for tonight ;).
'Better' is in quotes, because his must-read post has a lot more than what the above paragraph covers. Hopefully, he will get angry reading this inaccurate summary, and provide us with a better plug :)
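The median-based filtering mentioned in the quote can be sketched in a few lines of Python. This is only a toy illustration of the diginorm idea, not khmer's actual implementation (khmer uses probabilistic counting rather than an exact hash table, and realistic k values are around 20); the function names, k, and coverage cutoff here are made up for the example.

```python
from collections import Counter

def kmers(read, k):
    """All overlapping k-mers of a read."""
    return [read[i:i + k] for i in range(len(read) - k + 1)]

def digital_normalization(reads, k=4, cutoff=3):
    """Toy diginorm: keep a read only if the median abundance of its
    k-mers, counted over the reads kept so far, is below the cutoff."""
    counts = Counter()
    kept = []
    for read in reads:
        kms = kmers(read, k)
        if not kms:
            continue
        abundances = sorted(counts[km] for km in kms)
        median = abundances[len(abundances) // 2]
        if median < cutoff:          # read adds new coverage: keep it
            kept.append(read)
            counts.update(kms)       # only kept reads update the counts
    return kept
```

A read whose median k-mer abundance already exceeds the cutoff is redundant coverage and gets discarded, while low-coverage reads survive. Trinity's heuristic, as described in the quote, is stricter: it also inspects the pattern of relative k-mer abundances within each read and throws out reads that look anomalous.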
2. A second-generation assembly of the Drosophila simulans genome provides new insights into patterns of lineage-specific divergence
Genome Research Paper
We create a new assembly of the Drosophila simulans genome using 142 million paired short-read sequences and previously published data for strain w501. Our assembly represents a higher-quality genomic sequence with greater coverage, fewer misassemblies, and, by several indexes, fewer sequence errors. Evolutionary analysis of this genome reference sequence reveals interesting patterns of lineage-specific divergence that are different from those previously reported. Specifically, we find that Drosophila melanogaster evolves faster than D. simulans at all annotated classes of sites, including putatively neutrally evolving sites found in minimal introns. While this may be partly explained by a higher mutation rate in D. melanogaster, we also find significant heterogeneity in rates of evolution across classes of sites, consistent with historical differences in the effective population size for the two species. Also contrary to previous findings, we find that the X chromosome is evolving significantly faster than autosomes for nonsynonymous and most noncoding DNA sites and significantly slower for synonymous sites. The absence of a X/A difference for putatively neutral sites and the robustness of the pattern to Gene Ontology and sex-biased expression suggest that partly recessive beneficial mutations may comprise a substantial fraction of noncoding DNA divergence observed between species. Our results have more general implications for the interpretation of evolutionary analyses of genomes of different quality.
Casey Bergman is angry again, because he can’t solve an ethical dilemma.
Is it ethical to suggest reviewers for a journal submission based on tweets about your arXiv preprint?
We believe it is far more ethical to suggest reviewers based on positive tweets than to use pseudo-scientific reasons to ask for an increase in NIH funding, rain or shine, because
a) The journal is not going to decide its list of reviewers solely based on the author's suggestions.
b) The suggested reviewer may refuse to review, if he thinks he cannot provide an objective review because of his prior interactions.
c) The suggested reviewer may still reject the paper, because ‘it is interesting as a blog post, but does not merit publication in Genome Research’.
An informative seqanswers thread on the hottest sequencing product that does not exist.
Funny remark -
Did Nate Silver produce a better prediction of the election than the pundits because he had better models or better technology? No, it's because he bothered to use data at all.
The article discusses how incorporating the laws of thermodynamics can reorient our thinking about economic processes.
The four laws of thermodynamics are neatly summarised in a simple ditty: 0th: You must play the game; 1st: You can't win; 2nd: You can't break even; 3rd: You can't leave the game.
Steve Keen, mentioned in the article, is undoubtedly among the best economists in the world, IMHO. He is also the most inconvenient academic to the mainstream gangs, having written a book titled 'Debunking Economics'.
Against that backdrop, it’s not terribly surprising to hear that Steve Keen and his entire economics department at the University of Western Sydney (UWS) are under threat of extinction. The Australian government has used a nice trick to achieve this. Under the guise of creating more competition, it destroys it. Anyone who can fog a mirror can now apply to the “top” universities in Australia, no matter what their grades are coming in. This is the sort of thing that ostensibly aims for fairness, in the same way that globalization and privatization do. All hail the lowest common denominator. The result is that everyone applies at the top uni’s, and only there.
Since the University of Western Sydney, where Steve teaches, has never been promoted to the top (though its economics programs may be far better than the others'), nobody applies for its programs anymore. And though this is an entirely new situation, the government has already proposed simply closing down the economics department at UWS. Even though the situation is volatile, and many students who won't get into the top schools will likely come to UWS later.
Old post - C. Titus Brown explains the value of having an automated test bench in software development. Also check his latest post on Software Carpentry at Scripps.
Speaking of things that just take time: don’t bother trying to teach people who don’t have any programming experience to program in a workshop! It takes weeks or months to do that. If they know some Perl or Ruby or Matlab, then I bet that you can usefully throw some Python at them.
That excludes us. We can program in Perl, Ruby and Matlab, but absolutely refuse to even 'try' to 'catch' any Python 'thrown' at us :)
Here are two interesting tidbits about the races to build the tallest building.
The building, between Nassau Street and William Street in Manhattan, New York City, was completed in 1930 after only 11 months of construction.
Construction of the Bank of Manhattan Building at 40 Wall Street began in 1928, with a planned height of 840 feet (260 m), making it 135 feet (41 m) taller than the nearby Woolworth Building, completed in 1913. More importantly, the plans were designed to be two feet taller than the Chrysler Building, which was in an ostensible competition to be the world's tallest building.
Uptown at 405 Lexington Avenue, the Chrysler Building developers were in the works to top 40 Wall Street. By October 1929, tycoon Walter Chrysler used his secret weapon to win the race to the top: a 125-foot (38 m) stainless steel spire was clandestinely assembled in the Chrysler Building's crown and hoisted into place, bringing it to a height of 77 stories, or 1,046 feet (319 m).
Such trivialities became a moot point when the Empire State Building was completed eleven months later in 1931, becoming the world's tallest building at 1,250 feet (380 m).
Bottom line - in the previous cycle (see chart), the race to build the tallest structure was centered in the USA, a country considered boorish by the economically, scientifically and culturally advanced Germans of that era.
Also remember the skyscraper index created by Andrew Lawrence in 1999.
In the historical context, it is interesting that all comments in the China story are saying that the building will not last until the next major storm.
One of the first skyscrapers was designed and built by Bradford Lee Gilbert in 1887. … His 160-foot structure was ridiculed in the press, with journalists hypothesizing that it might fall over in a strong wind. Friends, lawyers and even structural engineers firmly discouraged the idea, warning that if the building did fall over, the legal bills alone would ruin him. To overcome the skepticism of both the press and his advisors, Gilbert took the top two floors for his personal offices.