I'm in Subaruland
Results of a count of all the cars in two rows of the parking lot in front of the REI store in Boulder, Colorado today:
- Total cars: 27 (100 percent)
- Subarus: 10 (37 percent)
- All others: 17 (63 percent)
The people who want to privatize Social Security say that it's an unsustainable Ponzi scheme, using payments from current workers to pay current retirees, and paying future retirees using contributions from future workers. They say that by allowing workers to invest part of their payroll contributions, President Bush's plan will aid Social Security by increasing the returns on that portion of contributions, enabling benefits to be lowered without any net loss to retirees. They say that raising taxes to fund future Social Security shortfalls is unacceptable. And they say that assuming a trillion dollars or more in debt to fund this partial transition wouldn't harm the economy, since the markets already assume that the government has obligated itself to pay out that money.
The people who are against privatizing Social Security say that it shouldn't be thought of as an investment vehicle, but rather as an expression of society's desire that no one be consigned to retire in poverty. They say that President Bush's plan does nothing to strengthen Social Security, since it reduces contributions into the system along with reducing payouts. They say that Social Security's future shortfall could be closed by raising taxes on only the wealthiest few. And they say that borrowing money in the financial markets will have a radically different effect on the economy than simply promising to pay future retirees a certain amount.
The problem as I see it is that both sides are mostly right. Social Security is a Ponzi scheme. We're paying the price for past generations' desire to begin paying retirees immediately, instead of paying retirees only in step with their contributions. But at the same time, we don't want to see anyone live out their retirement in abject poverty. The President's plan could help by increasing returns, but there are no guarantees that it will do so, and no way of accurately projecting by how much, if at all. Social Security taxes are high already, and it seems fundamentally unfair to point to a single segment of society to pay more in order to benefit everyone else. And yes, we do owe the money to future retirees one way or another, but it would be a huge economic risk to actually go out and borrow that money in the markets.
Social Security is a defined-benefit plan, like private pensions: you pay in a certain amount of money, and you're guaranteed to receive a specified benefit upon retirement. It's useful to note that defined-benefit plans are going the way of the dodo because of the risks they pose to the companies who operate them (who may find themselves with staggering bills down the road) and to their would-be recipients (who could find themselves retired but with no benefit if their plan collapses).
American firms have been replacing defined-benefit plans with defined-contribution plans: you contribute to your own retirement account (usually with assistance from your employer), but how you manage that account and the resulting returns are solely up to you. Defined-contribution plans lower risk dramatically for everyone. The firms operating them have no long-term risk, because they're not responsible for the accounts (and in fact can't touch them at all), and their workers have no risk that their employers' future problems will affect their retirement.
It would be great if we could wave a magic wand and simply transform Social Security from a defined-benefit to a defined-contribution plan. The problem is that if we switched over all in one stroke -- in other words, if workers began to invest all their payroll taxes in private accounts beginning tomorrow -- then we would have no incoming revenue to pay our obligations to current and future retirees who have paid into the system for decades expecting a retirement benefit. My guess is that those obligations would run into the tens of trillions of dollars.
President Bush's plan essentially says, "Whoa, that's a lot of money. We can't do that. Let's just do a little of it and see how that goes." But that sidesteps the real question. Whether you allow current workers to invest a fraction of their payroll taxes or the full amount, you still have to pay the costs of transitioning the system -- it's just a matter of scale.
I seem to remember (I'm being link-lazy this morning) opinion polls showing that more young workers believe in the existence of UFOs than believe they'll receive a Social Security benefit. I don't blame them. They sense that it's a fundamentally unsound system and they want out. And therein lies a potential solution to this problem -- one distinct from raising taxes (as per the Democrats) or borrowing money to fund private accounts (as per the Republicans).
Step 1: We mandate that all future workers enter into a defined-contribution system. All their payroll taxes would go into their own private accounts (with the appropriate checks and balances to prevent outlandish investments).
Step 2: We offer any current worker the choice to opt out of Social Security. They would continue to pay payroll taxes, but all such taxes would go into a private account, with a broad choice of low-risk investments. Beginning immediately, every penny they put away would begin building interest for their own retirement. But in doing so, they would relinquish any claim to future benefits based on their past contributions. In other words, for the privilege of opting out of Social Security, workers would give up the privilege of receiving retirement benefits from it.
What would be the result of this? Countless organizations -- from news outlets to advocacy groups -- would provide analyses showing workers whether it would make financial sense for them to convert. Despite this, some people wouldn't switch no matter what, even if they were very young and had built up only a trivial retirement benefit. That's fine. But many young people would wisely choose to give up their Social Security benefits knowing that they would easily make up the shortfall with their private accounts. And probably more than a few middle-aged people would opt out as well, especially if they had otherwise provided for their retirement.
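The kind of analysis those organizations would publish is easy to sketch. Here's a back-of-the-envelope version in Python -- every number in it (salary, payroll tax rate, years to retirement, rate of return) is a purely hypothetical assumption for illustration, not real Social Security data:

```python
# Hypothetical opt-out analysis. All parameters below are illustrative
# assumptions, not actual Social Security figures.

def private_account_at_retirement(annual_contribution, years, annual_return):
    """Future value of equal annual contributions compounding each year."""
    balance = 0.0
    for _ in range(years):
        balance = (balance + annual_contribution) * (1 + annual_return)
    return balance

# A hypothetical 25-year-old: $40,000 salary, the full 12.4% payroll tax
# redirected into a private account, 40 years to retirement, 4% real return.
contribution = 40_000 * 0.124
balance = private_account_at_retirement(contribution, years=40, annual_return=0.04)
print(f"Hypothetical private-account balance at retirement: ${balance:,.0f}")
```

For a young worker with decades of compounding ahead and almost no accrued benefit to give up, the numbers come out strongly in favor of opting out; for a 55-year-old with a large accrued benefit, the same arithmetic points the other way.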
I don't have access to the data to be able to run the numbers on this, but my hunch is that enough people would give up their retirement benefits to pay most of the unfunded mandate cost. And they would have chosen to do so. No one would be forced to do anything -- no current worker would be forced out of Social Security or have his or her taxes raised, and no current or future retiree would have his or her benefits cut.
As for the remainder of the unfunded mandate cost, we face a shortfall no matter what, and that will need to be dealt with at some point. But this plan could reduce that shortfall, and it solves the problem that no existing plan addresses: how to completely transition Social Security to a defined-contribution system so that future generations don't have to deal with the problems we're facing now.
Boeing announced the new 777-200LR this week, the world's longest-range commercial passenger aircraft. It carries 301 passengers and can fly 9,420 nautical miles... all well and good. So why would Boeing feel the need to exaggerate about it?
The twin-engine airplane, when equipped with three optional fuel tanks, will be capable of flying 9,420 nautical miles, enough to "connect any two cities in the world today," said Lars Andersen, Boeing's vice president in charge of the 777 program at Boeing Commercial Airplanes.

Any two cities in the world? How about Honolulu-Johannesburg? London-Auckland? New York-Perth? Or Tokyo-Rio de Janeiro?
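The claim is easy to check with great-circle arithmetic. A quick sketch (city coordinates are approximate):

```python
# Great-circle distances between city pairs, in nautical miles.
# One minute of arc along a great circle is one nautical mile,
# so one degree of arc is 60 nautical miles.
from math import radians, degrees, sin, cos, acos

def great_circle_nm(lat1, lon1, lat2, lon2):
    p1, p2 = radians(lat1), radians(lat2)
    central_angle = acos(
        sin(p1) * sin(p2) + cos(p1) * cos(p2) * cos(radians(lon2 - lon1))
    )
    return degrees(central_angle) * 60

# Approximate coordinates (degrees; south and west are negative).
london_auckland = great_circle_nm(51.51, -0.13, -36.85, 174.76)
honolulu_joburg = great_circle_nm(21.31, -157.86, -26.20, 28.05)
print(f"London-Auckland:       {london_auckland:,.0f} nm")
print(f"Honolulu-Johannesburg: {honolulu_joburg:,.0f} nm")
```

Both routes come out well over 9,420 nautical miles (London-Auckland alone is roughly 9,900 nm), so "any two cities" doesn't survive contact with a calculator.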
I don't understand it. Boeing has the undisputed longest-range airliner. Shouldn't that be enough on its own?
From a story in the Economist on the possible spread of congestion charging from London to other parts of Britain:
[Two] years after London introduced one of the most radical road-pricing schemes in the world, other parts of Britain are still hesitating about whether to follow suit. That is not because London's scheme hasn't worked: on most counts, it has been a triumph. Journey times in the central London zone covered by the scheme are down by a third and air pollution has fallen by 12%. Bus usage is up by more than a third, and 80% of the people who pay the £5 charge for entering the zone between 0700 and 1830 say they are happy with the way the scheme is run.

Let's just recap that, shall we?
I'm spending a week in France next month, and thought it might be a good idea to find some appropriate music to take along on my iPod -- something to listen to on the way over and while I'm there. As it happens, the Barnes and Noble I visited had an international music section, and within it, a small number of CDs devoted to France. Unfortunately, most of them were roughly of the "Maurice Chevalier sings about little girls" type, and though I don't have anything against Maurice Chevalier, that's not exactly what I had in mind. And that set me to wondering: are there French-language analogues of my favorite contemporary artists? Not the Barenaked Ladies -- foreign-language humor would be lost on me. But who is the Sheryl Crow of France? The Dido? The Moby? Who is the Sting of France?
Okay, scratch that last one. The Sting of France is Sting. But the rest of the question still holds.
The success of the U.S. has not come from one consistent cause, as far as I can make out. Instead the U.S. will find a way to succeed for a few decades based on one thing, then, when that peters out, move on to another. Sometimes there is trouble during the transitions. So, in the early-to-mid-19th century, it was all about expansion westward and a colossal growth in population. After the Civil War, it was about exploitation of the world's richest resource base: iron, steel, coal, the railways, and later oil.
For much of the 20th century it was about science and technology. The heyday was the Second World War, when we had not just the Manhattan Project but also the Radiation Lab at MIT and a large cryptology industry all cooking along at the same time. The war led into the nuclear arms race and the space race, which led in turn to the revolution in electronics, computers, the Internet, etc. If the emblematic figures of earlier eras were the pioneer with his Kentucky rifle, or the Gilded Age plutocrat, then for the era from, say, 1940 to 2000 it was the engineer, the geek, the scientist. It's no coincidence that this era is also when science fiction has flourished, and in which the whole idea of the Future became current. After all, if you're living in a technocratic society, it seems perfectly reasonable to try to predict the future by extrapolating trends in science and engineering.
It is quite obvious to me that the U.S. is turning away from all of this. It has been the case for quite a while that the cultural left distrusted geeks and their works; the depiction of technical sorts in popular culture has been overwhelmingly negative for at least a generation now. More recently, the cultural right has apparently decided that it doesn't care for some of what scientists have to say. So the technical class is caught in a pincer between these two wings of the so-called culture war. Of course the broad mass of people don't belong to one wing or the other. But science is all about diligence, hard sustained work over long stretches of time, sweating the details, and abstract thinking, none of which is really being fostered by mainstream culture.
Since our prosperity and our military security for the last three or four generations have been rooted in science and technology, it would therefore seem that we're coming to the end of one era and about to move into another. Whether it's going to be better or worse is difficult for me to say. The obvious guess would be "worse." If I really wanted to turn this into a jeremiad, I could hold forth on that for a while. But as mentioned before, this country has always found a new way to move forward and be prosperous. So maybe we'll get lucky again.
As is often the case, the most readable coverage of a technical subject can be found in the Economist -- in this case an article on the new Cell microprocessor from IBM, Sony, and Toshiba:
As its name suggests, the Cell chip is designed to be used in large numbers to do things that today's computers, most of which are primitive machines akin to unicellular life-forms, cannot. Each Cell has as its "nucleus" a microprocessor based on IBM's POWER architecture. This is the family of chips found inside Apple's Power Mac G5 computers and IBM's powerful business machines. The Cell's "cytoplasm" consists of eight "synergistic processing elements". These are independent processors that have a deliberately minimalist design in order, paradoxically, to maximise their performance.

If the PlayStation 3 does include four Cell processors, and if they run at 256 gigaflops, and if a PlayStation 3 were available today, it would place 387th on the Top 500 list. That's staggering.
A program running on a Cell consists of small chunks, each of which contains both programming instructions and associated data. These chunks can be assigned by the nucleus to particular synergistic processors inside its own Cell or, if it is deemed faster to do so, sent to another Cell instead. Software chunks running on one Cell can talk to chunks running on other Cells, and all have access to a shared main memory. Since chunks of software are able to roam around looking for the best place to be processed, the performance of a Cell-based machine can be increased by adding more Cells, or by connecting several Cell-based machines together.
All of this means that programs designed to run on Cell-based architecture should be able to fly along at blistering speeds—and will run ever faster as more Cells are made available. The prototype Cell being discussed this week runs at 256 gigaflops (a flop -- one "floating-point" operation per second -- is a measure of how fast a processor can perform the individual operations of digital arithmetic that all computing ultimately boils down to). A speed of 256 gigaflops is around ten times the performance of the chips found in the fastest desktop PCs today; the Cell is thus widely referred to as a "supercomputer on a chip", which is an exaggeration, but not much of one. On the top500.org list of the world's fastest computers, the bottom-ranked machine has a performance of 851 gigaflops. A machine based on only four Cell chips would easily outrank this...
Cell's debut will be in Sony's next-generation games console, the PlayStation 3, which is expected to contain four of the beasts.
After talking with my friend and colleague David Smith, I'm convinced the Cell has the potential -- if it lives up to the promises made for it -- to be an industry-changing event. Ray Kurzweil and many others have long argued that at some point, Moore's Law will continue through the use of highly parallel architectures, as opposed to continually increasing the clock speed and word length of today's microprocessors. Much evidence exists for this. Most recently, Apple's chief financial officer called a hypothetical PowerBook equipped with a G5 processor "the mother of all thermal challenges". The Cell addresses such challenges by providing high levels of performance using large numbers of efficient RISC-based processor cores. Instead of one very fast processor, how about eight that are moderately fast? And that's just on one chip.
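The dispatch model the Economist describes -- a "nucleus" handing self-contained chunks of work to whichever of the eight synergistic processing elements is least busy -- can be sketched in a few lines. This is a toy illustration of the scheduling idea only; the names and the greedy least-loaded policy are my assumptions, not the Cell's actual architecture or API:

```python
# Toy model of the Cell's dispatch idea: a controlling "nucleus" assigns
# self-contained work chunks to whichever of eight "synergistic processing
# elements" (SPEs) currently has the least queued work.
import heapq

NUM_SPES = 8

def dispatch(chunks):
    """Assign each (chunk_id, cost) pair to the least-loaded SPE.

    Returns the chunk->SPE assignment and the sorted final per-SPE loads.
    """
    spes = [(0, i) for i in range(NUM_SPES)]  # (current_load, spe_index)
    heapq.heapify(spes)
    assignment = {}
    for chunk_id, cost in chunks:
        load, idx = heapq.heappop(spes)       # least-loaded SPE
        assignment[chunk_id] = idx
        heapq.heappush(spes, (load + cost, idx))
    return assignment, sorted(load for load, _ in spes)

# 32 chunks of varying (hypothetical) cost, scattered across eight SPEs.
chunks = [(f"chunk{i}", (i % 5) + 1) for i in range(32)]
assignment, loads = dispatch(chunks)
print(loads)  # per-SPE loads stay roughly balanced
```

The point of the sketch is the scaling property the article highlights: because the chunks are self-contained, adding more SPEs (or more Cells) just means more slots in the heap, and throughput grows without rewriting the program.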
I'm thinking through the larger implications of this. I'm sure they're not good for Microsoft. The open question is, for whom are they good?
So is the mini a maxi value? For me, clearly, no. When I consider that a good deal of my time is spent running applications like Disk Defragmenter, Scandisk, Norton AV, Windows Update and Ad-Aware -- none of which are available for the Mac platform -- it doesn't make sense for me to "switch" to a Mac at this time.
I'm just back from a trip to Northern VA. Tomorrow I'm out the door for a weekend in Charleston, SC. I return Sunday and then leave Tuesday for Boulder, CO. How am I going to blog over the next few days when already I haven't blogged in over a week?
According to Wired, the dogs aren't eating the dog food at Microsoft:
To the growing frustration and annoyance of Microsoft's management, Apple Computer's iPod is wildly popular among Microsoft's workers.

I've had my two iPods -- a 60GB iPod photo that was my firm's holiday present to its employees and a 1GB iPod shuffle that I bought for myself -- for a week now, and I can understand why Microsoft employees would want iPods for themselves. More on my iPod experiences later...
"About 80 percent of Microsoft employees who have a portable music player have an iPod," said one source, a high-level manager who asked to remain anonymous. "It's pretty staggering."
The source estimated 80 percent of Microsoft employees have a music player -- that translates to 16,000 iPod users among the 25,000 who work at or near Microsoft's corporate campus. "This irks the management team no end," said the source.
So popular is the iPod, executives are increasingly sending out memos frowning on its use...
"These guys are really quite scared," said the source of Microsoft's management. "It shows how their backs are against the wall.... Even though it's Microsoft, no one is interested in what we have to offer, even our own employees."
In August 1980, I arrived at the Defense Language Institute in Monterey, CA, 17 years old, still crew-cut from basic training, ready for a year of Russian language studies. DLI was and hopefully always will be a special place -- a beautiful setting above Monterey Bay with an environment more like a college than a military school. The Monterey Peninsula is an upscale area, not endless streets of fast food shops, payday loan outlets, and pawn shops like the typical military base town. 24 years later, I still look back on the year I spent there as one of the best of my life -- the time when I really grew up. (Well, as much as I'm likely to grow up, anyway.) And of my closest friends in life, one I met there in my class and another I met through a classmate some years afterwards.
So, though I didn't know him, it was with a twinge of sadness that I read in my DLI alumni newsletter that the last surviving founding instructor of DLI had just died. From the Monterey County Herald's story:
Shigeya Kihara, the last surviving original instructor of a language school for American soldiers that later became the Defense Language Institute, died at his son's home in Castro Valley on Sunday. He was 90.

(Incidentally, I didn't know until reading that story that DLI had temporarily been located in Minnesota. Whoever made the decision to move it to Monterey, wherever you are, thank you.)
Born in Suisun on Sept. 27, 1914, Kihara was raised in West Oakland and earned undergraduate and graduate degrees in political science at University of California at Berkeley.
Kihara was one of the first four instructors at a school that was the first of its kind and had no budget. Soldiers trained there proved to be an invaluable asset during World War II and the school itself led to the establishment of the well-known institution on the Monterey Peninsula.
Classes began in November 1941 in an airplane hangar at the Presidio of San Francisco's Crissy Field. Five weeks later Pearl Harbor was attacked and the language school was moved from the unstable West Coast to a safer location in a small town in Minnesota.
When the war ended, 6,000 linguists had graduated from the original school.
In 1946, Kihara moved with the school to its new location, the Presidio of Monterey.
While Japanese-Americans and Japanese immigrants around him, including his parents and two siblings, were rounded up and taken to internment camps during the war, Kihara taught American soldiers how to speak Japanese.
"The Nisei helped win the war," Kihara said in 2001.
So, Kihara-san, from this student who never knew you, thank you for making DLI happen.