Today’s 2Y auction results and the Historical Auction Statistics.Calendar doc have been updated here [GDocs].
Seth Klarman is one of my value investing idols. Last I checked, Baupost has returned almost 20% annually since its inception in 1982, with only one or two down years, an incredible track record. I strongly recommend getting your hands on a copy of Klarman’s book, “Margin of Safety,” if you can find one; only about 900 copies exist (more if you count digitized copies). Either way, whether you’re a fundamental, technical, and/or quantitative trader or investor, this video from one of the greatest investors of all time is well worth watching.
A friend sent me this PDF a couple of weeks ago and it got buried in my inbox. Given all the attention on the most recent Japan quake, I found it an interesting read: a 1989 piece by Michael Lewis (of Liar’s Poker fame).
In 2001, Peter Orszag and Joseph Stiglitz published a short paper wherein they concluded:
…if anything, tax increases on higher-income families are the least damaging mechanism for closing state fiscal deficits in the short run. Reductions in government spending on goods and services, or reductions in transfer payments to lower-income families, are likely to be more damaging to the economy in the short run than tax increases focused on higher-income families.
This is all fine and dandy in the ivory tower (or the White House, as it would turn out to be less than a decade later), but in the realm of reality, theories only get us so far, no matter how seemingly reasonable their basis and underlying assumptions.
Today’s WSJ included an essay about the state of the State of California, specifically, its incredible reliance on the very rich (the top 1%+). Many, if not most, commentators, politicians, etc. seem to think that taxing “the rich” at a (vastly) higher rate than everyone else – especially “the poor” – not only makes perfect sense, but is more than “fair,” given the increasingly high share of income accruing to “the rich.” This is all fine and dandy, except:
I’ve been experimenting with a number of “R” packages in my work for quite a few years now and have recently stepped up testing ideas. I ran into quite an interesting problem while using the blotter package the other week, and the thought that popped into my head was: this is an “embarrassingly parallel” problem. Today’s “Geekfest” shows how to parallelize backtests from the blotter package using SNOW.
I’ve written lightly about SNOW in the past here; basically, it lets R farm work out to a cluster of workstations to speed up computing time. The previous performance enhancements were here (running VaR sensitivity and ARIMA forecasting) – today’s example will show you how to parallelize a simple trend-following system based on Mebane Faber’s work.
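To make the “embarrassingly parallel” point concrete, here is a minimal sketch of the pattern, assuming a hypothetical `run_backtest()` helper that stands in for one blotter backtest over a single parameter value; only the snow calls (`makeCluster`, `clusterEvalQ`, `clusterApplyLB`, `stopCluster`) are real package API, everything else is illustrative:

```r
library(snow)

# Parameter grid to sweep, e.g. SMA lookback lengths (illustrative values)
params <- seq(50, 300, by = 50)

# Hypothetical stand-in: in the real post this would build a blotter
# portfolio, apply an n-day moving-average rule, and return a P&L summary
run_backtest <- function(n) {
  # ... initPortf(), applyRule(), updatePortf(), etc. ...
  n  # placeholder return so the sketch runs
}

cl <- makeCluster(4, type = "SOCK")   # spin up 4 socket workers
clusterEvalQ(cl, library(blotter))    # load blotter on every worker
results <- clusterApplyLB(cl, params, run_backtest)  # load-balanced map
stopCluster(cl)                       # always tear the cluster down
```

Because each parameter’s backtest is independent of the others, `clusterApplyLB` can hand them out to whichever worker is free, which is exactly what makes the problem embarrassingly parallel.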
I was browsing titles in Amazon’s Kindle marketplace the other day for something interesting to read and ran across a book by Richard Swenson, MD, titled Margin: Restoring Emotional, Physical, Financial and Time Reserves to Overloaded Lives, and I quickly became engrossed in the topic. We are all acutely aware that modern society moves at breakneck speed; technology has not only made us more productive but has eliminated much of the “downtime” we used to have in order to “repair” ourselves.
I clicked on a link from Twitter to a NYT article about Walmart’s colossal failure to break into the NYC market, fully expecting every hyperlink to point to some nyt.com page. Imagine, then, my complete surprise to find that not only did they link to an outside page, but to several of them!
Is this a new practice at the Times? If so, consider me a fan!