Gas prices will stay below $3 (redux)
On Friday, oil prices fell below $30, the lowest in 12 years. The Wall Street Journal, of course, had an immediate explanation: turmoil in the Chinese market and an expected increase in Iranian crude production.
Decade after decade, the business press invariably has just the right explanation for market movements – as long as it gets to sleep on it.
Back in October 2008, I wrote a blog post entitled Gasoline prices will stay below $3. The recent sharp decline in prices prompted me to revisit the prediction. It’s a curious thing about predicting the future. It’s not so hard. The real trick is figuring out the timing (and then figuring out what to do about it). To recap, here’s what I said:
But, you say, gasoline will keep getting more expensive, won’t it? Nope, it won’t. In fact, I’ll “betcha” that it will hold (inflation-adjusted) steady below $3.50 for the foreseeable future, and I’m pretty sure it will stay below $3, where it’s headed now.
My core argument was that there are plenty of carbon-based energy reserves of various sorts that become economical at different price levels. Upwards of $100 per barrel (corresponding to about $4 per gallon of gasoline), the supply starts approaching “infinity”: coal-to-gas and tar sands each, on their own, correspond to about a century’s worth of energy consumption.
The chart shows crude oil prices and (national average) US gasoline prices for the period 1982-2014. Prices are indexed to 2008 dollars to track my prediction.
My bet back then was that it definitely wouldn’t go above $3.50, and not likely above $3. Turns out that I was partially right: it peaked at $3.43, so I won the bet by a whisker. But it stayed above $3 for several years. That made little sense to me – crude oil sustained above $75 (which corresponds to $3 gas) allows vast alternative energy sources to become economical.
To recap from 2008: Gasoline comes from oil. Though a barrel of oil is 42 gallons, only about 19 gallons of gasoline are produced during refining – the rest includes diesel (about 9 gallons), jet fuel (about 4 gallons), heating oil (about 2 gallons), and another 10 gallons of assorted products. (Yes, that totals more than 42 gallons – refining adds volume, known as processing gain.) These resulting products complicate the demand side of the pricing equation, since they cater to related, but different, markets. On the cost side, there’s also refining, distribution, and taxes.
I’ve improved my rule-of-thumb from 2008: if you want to know roughly what the gasoline price will be in the US based on the price of crude oil, just divide by 35 and add $1. That rule would have gotten you within 5% of the actual gasoline price for more than half of all years since 1973, and almost 90% of all years since 1991. (Remember to inflation-adjust the “$1” part of the rule.)
At the current $30 prices, that means gasoline at $1.85 or lower. Note that if you’re in California, various market inefficiencies lead to prices about $0.15 higher, so add that to the rule. Either way, at below $30 per barrel, we’re heading below $2 gasoline.
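For the curious, the rule of thumb is easy to play with in code. A minimal sketch (the function name is mine; the $0.15 California premium is the estimate above):

```python
# A sketch of the rule of thumb: US pump price (in 2008 dollars) is
# roughly the crude price divided by 35, plus $1.

def gasoline_estimate(crude_per_barrel, california=False):
    price = crude_per_barrel / 35 + 1.00
    if california:
        price += 0.15   # persistent California premium (estimate)
    return price

print(round(gasoline_estimate(30), 2))   # 1.86
print(round(gasoline_estimate(75), 2))   # 3.14
```

Note how $75 crude lands right at the $3-gas threshold discussed above.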
There are fundamental reasons for making long-term judgment calls on pricing. Way back in 2005, Wired (issue 13.12) took a look at which energy sources come online (economically speaking) as the price of oil rises. I haven’t seen a similarly good overview put together since. But I did briefly check a subset of their table: ultradeep offshore wells, tar sands, coal-to-gas, natural gas, ethanol, and biodiesel are all economical at prices of $3.00–$3.50/gallon or less. And of course, since 2005 we’ve added fracking to that list.
That’s a blend of renewables and massive reserves (trillions of barrels). So the prediction (inflation-adjusted) remains: gasoline has little reason to go into that price range for long, let alone above it.
Shakuntala Devi – the Human Computer
This was originally posted as an answer on Quora to the question “How did Shakuntala Devi mentally calculate the 23rd root of a 201-digit number correctly?“.
Devi was a very talented mental calculator, but by no means the extreme prodigy that she has been made out to be in the media, and she certainly couldn’t take the 23rd root of a number “faster than a computer”.
She was an accomplished showman – she grew up traveling with her father and giving performances, and when her father got too old, she travelled by herself and sent money to her family. The modern field of mental calculators generally shies away from words like “prodigy” or “genius”. Mental calculation is obviously related to mathematics, but many of the historically great mental calculators were not particularly knowledgeable in mathematics per se.
By calling her an “accomplished showman”, I don’t mean to denigrate her at all, quite to the contrary. It’s an intricate, and old, art form, dating at least to the 1700s, more akin to the work of an illusionist than that of a professor in mathematics. It takes hard work and dedication to be a good illusionist of any sort. Devi was born, and performed, in a simpler era, one before the internet and computers that have deconstructed much of the “magic” of these artists. And her story in particular – as a poor child traveling in India, earning a living for her family with her demonstrations – holds a deep appeal to us. Especially since, by all accounts I’ve read of those who met her, she was a very special person.
(If you want a demonstration of the “powerful magic” of mental calculation, just search the internet for the tricks involved in solving 3rd roots and 5th roots of large integers in your head. You will astonish your friends and family by just a little effort of rote memorization.)
There are a few things you first have to know about mental calculations of this sort. They are mostly related to memorization and pattern matching, and putting various significant constraints on the problems being asked. If you are not versed in the arts, some tricks that appear very difficult are in fact quite simple. That, after all, is the general art of illusion.
Today, this art form is called “mathemagic”, and notable “mathemagicians” include Martin Gardner, Arthur T Benjamin, Raymond Smullyan, and Persi Diaconis.
Let me take an example from Devi’s life, before I jump into the actual question. Famously, Devi travelled to the US in 1988, including a visit to San Francisco. A Psychology Professor at Berkeley attended one such event, and he published an article about it in 1990 (“Speed of Information Processing in a Calculating Prodigy”, Arthur R. Jensen, INTELLIGENCE 14, 259-274). Jensen describes how he gave two questions to Devi on note cards that he had prepared:
“When I handed Devi two problems, each on a separate card, thinking she would solve first one, then the other, my wife was taken by surprise, as there was hardly time to start the stopwatch, so quick was Devi’s response. Holding the two cards side-by-side, Devi looked at them briefly and said, “The answer to the first is 395 and to the second is 15. Right?” Right, of course! (Her answers were never wrong.) Handing the cards back to me, she requested that I read the problems aloud to the audience. They were: (a) the cube root of 61,629,875 (= 395), and (b) the 7th root of 170,859,375 (= 15). I was rather disappointed that these problems seemed obviously too easy for Devi, as I had hoped they would elicit some sign of mental strain on her part. After all, it had taken me much longer to work them with a calculator”
Let’s look closely at the first problem, finding the cube root of 61,629,875. The third root of an 8-digit number will have exactly three digits (because it will be between 216 and 464). The third digit is trivial, because a number cubed, modulo 10, only depends on the last digit. In fact, most of the digits are identical – if the last digit of a cube is 1, 4, 5, 6, 9, or 0, then it’s the same in the root; 2 and 8 swap places, as do 3 and 7. So we know the last digit is “5”. The first digit is trivial too: in an 8-digit cube it can only be 2, 3, or 4. In this case 61 is close to 64, which is 4 cubed. In fact, it’s very close, so we know the answer will be just below 400, because 400 cubed is 64,000,000. Now, what 3-digit number is just below 400 and ends in 5?
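The digit tricks above are easy to sketch in code. This is an illustrative toy (the function name is mine), and note that a performer estimates the middle digit by proximity, whereas the sketch simply tries all ten candidates:

```python
# An illustrative sketch of the cube-root trick, for perfect cubes
# of 3-digit numbers.

# d**3 mod 10 determines the root's last digit: 2<->8 and 3<->7
# swap places; every other digit maps to itself.
LAST_DIGIT_OF_ROOT = {d**3 % 10: d for d in range(10)}

def cube_root_trick(cube):
    last = LAST_DIGIT_OF_ROOT[cube % 10]
    head = cube // 1_000_000             # digits in front of the last six
    first = max(d for d in range(1, 10) if d**3 <= head)
    # A performer estimates the middle digit by proximity; here we
    # simply try the ten candidates.
    for middle in range(10):
        candidate = 100 * first + 10 * middle + last
        if candidate**3 == cube:
            return candidate

print(cube_root_trick(61_629_875))       # 395
```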
That’s why it only took Devi a few seconds to answer. Jensen, a psychologist, doesn’t know the art. Just as it takes a magician to explain what another magician does, it takes an accomplished mental calculator to explain the field to you. What seems amazing (the cube root of 61,629,875 in two seconds!) might be straightforward if you know the “trade”.
Though Jensen was not an appropriate judge of these mental feats, he did subject her to a full range of psychological testing, and those are the parts of his article that are valuable reading. Notably, she didn’t score exceptionally in any cognitive area, except memory for numbers.
In Devi’s case, we don’t know what techniques she used for this demonstration. Why? Because she never told people. And why would she? She had made a living since kindergarten as a performer. Professional performers don’t reveal their magic. If they do, it’s not magical any more.
But lots of other mental calculators have shared details, many at least on par with Devi’s abilities and several distinctly stronger – people like Wim Klein, Hans Eberstark, Gert Mittring, and Alexis Lemaire, the latter likely the greatest living mental calculator.
Now, let’s get a bit closer to the actual question.
The original story is that in 1977, at the Southern Methodist University in Dallas, Devi extracted the 23rd root of a 201-digit number. As reported at the time (Dallas Morning News, January 26th and later a bit more sensationalized in an editorial on February 6th), the problem was presented to her from a calculation on a Univac 1101 at the Bureau of Standards, and it took the computer over a minute to calculate it, but it took Devi just 50 seconds. Therefore, she had “beaten the computer”, which became the theme of the media coverage and greatly contributed to her fame.
The problem, as posed, was to take the 23rd root of this number:
916,748,676,920,039,158,098,660,927,585,380,162,483,106,680,144,308,622,407,126,516,427,934,657,040,867,096,593,279,205,767,480,806,790,022,783,016,354,924,852,380,335,745,316,935,111,903,596,577,547,340,075,681,688,305,620,821,016,129,132,845,564,805,780,158,806,771
The answer (in case it’s not immediately obvious to you) is 546,372,891.
The first thing to note is that the Univac and Devi were solving two completely different problems. Raising a nine-digit number to the 23rd power – i.e., multiplying it by itself 22 times – is a much harder problem than taking the 23rd root, if you know that the result will be an integer.
(Parenthetically, the “1101” is probably a typo or some other error. That model is a 1951 vacuum-tube computer that could do about three multiplications per second. Other reports said Univac 1106, or 1108, but that was from 1969 and also a bit old at the time. The “Bureau of Standards” was responsible for running US census data, and thus the principal government progenitor of much of the computer industry. Given the year of 1977, my guess would be that it was actually a Univac 1110, which launched in 1972. Though why anybody would choose a Univac in 1977 is another mystery, since by then IBM had long taken over the industry. But “Univac” was famous and known by readers, having predicted the Presidential election live on television in 1952. So it was probably equivalent in people’s minds to “big computer”.)
In general, the difficulty lies not in the size of the number (201 digits) or the power (23). It’s the number of digits in the answer. Now, the 23rd root of a 201-digit number will have nine digits. Remember the example above, of a 3rd root that resulted in a 3-digit answer? The beginning and the end are trivial, the middle requires some thought.
The same principle continues to apply to larger powers – except that the notion of “trivial” becomes “quite difficult”, and the notion of “some thought” becomes “that’s really hard”.
In this case – nine digits – the middle two are the hard ones; the first three and the last four are “easy”.
The last four digits of the answer (2891) are completely determined by the last four digits of the power (6771).
If you want to test that, try raising the number xxx,xx2,891 to the power of 23 – and insert any digits you want for the x’s. In fact, you can take any integer that ends in “2891”, raise it to the 23rd power, and the result will end in 6771.
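You can run exactly that test in a few lines of Python – the three-argument pow keeps only the trailing digits:

```python
# Testing the claim: the last four digits of n**23 depend only on
# the last four digits of n.
import random

for _ in range(1_000):
    n = random.randrange(10**8, 10**9)   # any 9-digit number...
    n = n - n % 10_000 + 2891            # ...forced to end in 2891
    assert pow(n, 23, 10_000) == 6771    # its 23rd power ends in 6771

print(pow(546_372_891, 23, 10_000))      # 6771
```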
To know which four digits translate to which four digits, there are a bunch of tables you need to memorize, but that’s not as hard as it sounds. For example, the choice of “23” is not random. Powers of integers where the exponent is of the form 4n+3 have (simple) patterns for the trailing digits. In particular, the last digit follows the same swap pattern as for cubes (2 and 8 trade places, as do 3 and 7; the rest map to themselves), so it costs almost nothing to remember. Choosing an appropriate exponent (23 in this case) translates into simpler tables to memorize for this step.
Now, look carefully at the first six digits – “916748”. The number “48” is right next to “50”, which means this six-digit number is halfway between “916700” and “916800”. We’ll soon see how this helps.
Let’s start by taking the first four (9167) and factoring it – that’s 89 * 103. Next we apply log tables – at least that’s one of the standard techniques. For this step, I’ll assume Devi has memorized the first 150 or so logs to five decimal digits.
The logs (base 10) of the factors are 1.94939 and 2.01284, respectively. Adding the mantissas yields 0.96223. (Remember that log(xy) = log(x) + log(y).)
Now let’s grab the “other” number, 9168. That factors into 48 and 191. Again taking the mantissas of the logs and adding them we get 0.68124 + 0.28103 = 0.96227.
Since “48” is nearly midway between 0 and 100, the interpolation between 0.96223 and 0.96227 is simple in this example: we get 0.96225.
Thus we know that the log base 10 of the whole 201-digit number is approximately 200.96225. We divide this by 23 and get 8.73749 (we can simplify by first noting that 200 divided by 23 is 8 with remainder 16, and then dividing 16.96225 by 23).
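The log arithmetic of the last few paragraphs can be checked numerically. A short sketch, with math.log10 standing in for the memorized table:

```python
# Checking the factoring-and-interpolation log arithmetic.
import math

def mant(x):
    """Mantissa (fractional part) of a base-10 logarithm."""
    return math.log10(x) % 1.0

lo = mant(89) + mant(103)    # 9167 = 89 * 103  -> about 0.96223
hi = mant(48) + mant(191)    # 9168 = 48 * 191  -> about 0.96227

# 916748 sits 48% of the way from 916700 to 916800:
mantissa = lo + 0.48 * (hi - lo)
log_of_power = 200 + mantissa    # log10 of the 201-digit number
log_of_root = log_of_power / 23
print(round(log_of_root, 5))     # 8.73749
```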
The trick now is to estimate the antilog of 0.73749. The more accurately we estimate it, the closer we get to the correct answer.
There may be more clever techniques at this stage than I know, but if we’ve memorized logs for all numbers up to 1000, then we know:
mantissa(log(546)) = 0.73719
mantissa(log(547)) = 0.73799
These logs are 0.00080 apart, and our target 0.73749 is 0.00030 above the lower one, so we linearly interpolate: 30/80 = 0.375. This is a curiously simple interpolation.
So our estimated antilog of 0.73749 is 5.46375.
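The same interpolation, sketched in code, with math.log10 again standing in for the memorized table. (With full-precision logs the fraction comes out near 0.374 rather than the hand-rounded 0.375, but the five digits we need are the same.)

```python
# Reproducing the antilog interpolation.
import math

target = 0.73749                 # mantissa of log10(root), from above
lo = math.log10(546) % 1.0       # 0.73719
hi = math.log10(547) % 1.0       # 0.73799
estimate = 546 + (target - lo) / (hi - lo)
print(round(estimate, 2))        # about 546.37

# Linear interpolation slightly overestimates (the log curve is
# concave), so truncate rather than round up, then append the
# already-known last four digits:
root = int(estimate * 100) * 10_000 + 2891
print(root)                      # 546372891
```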
Now it’s a bit tricky: do we round up or down – does the interpolated 5.46375 give us “54637” or “54638”? It’s easier than it sounds: logarithms grow slower than linearly, so linear interpolation slightly overestimates the antilog, and we round down.
So we finally have our answer: the first five digits are 54637, and from earlier we know the last four digits are 2891. Putting them together: 546,372,891.
Simple? Haha, no, not especially.
Did Devi have to memorize 1000 logarithms to 5 digits? That’s not as hard as it sounds for somebody with (a lot of) talent for remembering numbers. There are clear patterns.
The biggest demand on the size of the log table comes from the precision needed in the antilog. If she had instead memorized “only” 100 log entries, she would be interpolating between these two mantissas:
mantissa(log(54)) = 0.73239
mantissa(log(55)) = 0.74036
With linear interpolation she would get (0.73749 – 0.73239) = 0.00510, which divided by (0.74036 – 0.73239) = 0.00797 gives 0.640, for an estimate of 5.4640. That’s a little too far from the true 5.4637.
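That coarser interpolation is just as easy to check:

```python
# The coarser interpolation with only a 100-entry log table.
import math

target = 0.73749
lo = math.log10(54) % 1.0    # 0.73239
hi = math.log10(55) % 1.0    # 0.74036
estimate = 5.4 + 0.1 * (target - lo) / (hi - lo)
print(round(estimate, 4))    # about 5.4640 -- off in the fifth digit
```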
This is where the real talent kicks in. In the 23rd root of a 201-digit number, the first 3 digits, and the last 4, are trivial. The middle two are tricky. You either memorize a log table with 1000 entries, or you have some clever tricks for iterative antilog interpolation, or you’ll get one or two of those digits wrong. On that day in 1977, Devi got it right.
But wait: how was the number “201” chosen, as in “23rd root of a 201-digit number”? The 23rd powers of 9-digit numbers can be anywhere from 185 to 207 digits long, so fixing the length at 201 digits simplifies things a lot. If it was Devi who chose the number, then the log tables you need are much smaller: for a 201-digit power, the first three digits of the root fall in the range 496 to 548. That’s not 1000 different logs to remember – it’s only 53, about 95% less to memorize for that stage.
Is that what happened? Quite possibly, because we have another hint that I will wrap up with. At the time, in 1977, it was reported that “somebody” was worried she would simply memorize all possible roots, and that “the computer was asked” what the probability was that she would guess the right answer: it reported odds of 1 in 58 million. This number also became part of the legend (mis-characterized as “the odds of her doing this feat were 1 in 58 million”).
I don’t have a Univac 1101, but with slightly more recent tools I can calculate that for the 23rd power to have 201 digits, the root needs to be between 496,194,761 and 548,441,657, precisely.
In other words, there are 52,246,897 possible numbers if her request was “only 201-digit powers”. That’s awfully close to “58 million”, so somehow she limited her range. Either the “58 million” was miscalculated back then, or the range was defined in some other manner. What matters is the log table: whichever range it was that translated into 58 million numbers, she would only need to remember 58 or 59 entries – we’re talking base-10 logs here, and only the resolution of the leftmost two digits matters.
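With arbitrary-precision integers, anyone can reproduce that range calculation today. A sketch (the helper name is mine):

```python
# Reproducing the range of 9-digit roots whose 23rd powers have
# exactly 201 digits, using Python's big integers.

def root23_floor(x):
    """Integer floor of the 23rd root of x."""
    r = int(round(x ** (1 / 23)))    # floating-point first guess...
    while r**23 > x:                 # ...then correct it exactly
        r -= 1
    while (r + 1)**23 <= x:
        r += 1
    return r

lo = root23_floor(10**200 - 1) + 1   # smallest root of a 201-digit power
hi = root23_floor(10**201 - 1)       # largest root of a 201-digit power
print(lo, hi, hi - lo + 1)
```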
However she did it, it was quite a feat. She would have had to do all of the above, accurately, in 50 seconds. And note that how she actually went about it might have been even more complicated than what I described; in my reconstruction of her feat, I’ve tapped into the best of (known) modern mental computational techniques. She might not have been familiar with all of them, and in 1977 she most certainly didn’t have easy access to computers with which to try things out.
One comment about the 50-second part. We don’t know how long she thought about the problem. If you review the techniques above carefully, you will notice something interesting: the hard part is figuring out what to do with the first five or six digits. The following 191 digits don’t matter – at all! They can literally be anything. So, if you structure your show appropriately, you might for example ask that the number be written on a blackboard. And indeed, judging by pictures that have survived, in this case it was written on a blackboard, with the digits grouped five at a time. So while somebody is writing 191 irrelevant digits, Devi can crunch the hard part of the problem – going from the first six digits of the power to the first five digits of the answer. Once she sees the last four digits, she has the answer, since that part is simple. If the person at the blackboard writes one or two digits per second, Devi would have had two or three minutes to spend on the hard part.
For the record: if a computer had been programmed to use these techniques, the “compute problem” is trivial. So no, she didn’t “beat a computer” that day. What she did was confound her audience, in a manner that rang through the decades.
For a human being it was a formidable demonstration, and for a consummate mathemagician, quite a trick to be remembered for.
Please note that there were a large number of comments on Quora to the above answer that I responded to.
Joined Snapchat as VP Eng
Yes indeed, as WSJ reported today (“Snapchat Poaches Google Veteran in Engineering Push”), I have recently joined Snapchat.
This is a great place. I hope to get some bandwidth to share thoughts on the new generation of communication. For now, I want to link to two items that might interest anybody curious about why I joined:
Evan Spiegel, CEO and co-founder, gave a great keynote at AXS earlier this year. Transcript well worth reading.
Just a bit before that keynote, Nathan Jurgenson wrote a great piece entitled “The Frame Makes the Photograph”, which is likewise well worth the read.
Not much original thinking from yours truly to share today. Mostly standing on shoulders at this point. Stay tuned.
A youngster corrected me the other day when I said “Merry Christmas”. He looked at me firmly and replied, “Happy Holidays”. (Actually, this happened a year ago, but some blog postings take a while to finish.)
Most years here in the US there’s a public squirming about how to manage both the culturally inclusive nature of the US, and the peculiarly American notion of Christmas. Instead of the (to some) culturally offensive reference to Jesus Christ, variations like “Happy X-mas” and “Season’s Greetings” are introduced. Worst of all is when there’s pretense of inclusiveness by adding things like Hanukkah and Kwanzaa to the mix – but more on this later.
Certainly, mid-winter celebrations are an old notion. Celebrating the Winter Solstice predates all current organized religions. The Newgrange megalithic structure, which provides the clearest evidence, is about 5,200 years old: at dawn, from around December 18th to 23rd, a narrow beam of light penetrates the roof-box and reaches the floor of the chamber. Some 25,000–30,000 people enter the lottery each year for one of the 50 (!) tickets. Ġgantija in Malta is even older (among the world’s oldest free-standing structures), but is not in good enough shape to confirm alignment with the mid-winter sunrise. To put this in perspective, Newgrange is more than 500 years older than the oldest pyramid.
The reason is obvious: in a society intimately dependent on the whims of nature, the steadily shortening days of sunlight is frightening, and the turning point, with the shortest day and longest night, is a time to celebrate: to say goodbye to the old, and to welcome the new. (And in the case of frigid cultures, to mark a suitable time for winter slaughter.)
The Winter Solstice is easily predictable; when it occurs in the calendar depends on the calendar. The Julian calendar of 45 BC placed the Winter Solstice on December 25th, a date that the new Christian church under Pope Julius I decided, in the fourth century, should be the birth date of Jesus Christ – thus “coincidentally” replacing the Roman celebration of Saturnalia, which in turn dated back to ancient celebrations of the Sun God.
The popular notions that are widespread around Christmas today are largely popularized by the US in modern times. By now most people have heard that our image of Santa Claus was invented by Coca Cola – or specifically, by the (ethnically Swedish) artist Haddon Sundblom in 1931. It was part of Coke’s business challenge to recast the soft drink away from being seasonal, an effort that began around 1922 with the slogan “Thirst Knows No Season”. St. Nick was the icon of winter, so having him drink Coke was brilliant marketing, addressing the business challenge head on. Sundblom took inspiration from Moore’s 1822 poem “A Visit From St. Nicholas” (also known as “‘Twas the Night Before Christmas”) and the color red from earlier renditions (and supposedly only coincidentally the same color as Coke’s logo).
The Christmas tree as we know it today came through the British royal household. Introduced in the early 19th century by George III’s Hanoverian Queen Charlotte of Mecklenburg-Strelitz, Queen Victoria grew up with the specific tradition of presents around the tree. After her marriage to her cousin Prince Albert reinforced the tradition, the custom gained popularity in Great Britain, and notably was reproduced by Godey’s Magazine and Lady’s Book for its Christmas issue in 1850. Like Coke, this was no small commercial matter: Louis Godey was the first to copyright every issue of his magazine, starting in 1845, and it was the most popular journal publication of its day (as a percentage of GDP, I estimated that the magazine in 1850 was similarly sized to the New York Times today).
By 1870, the tradition had become ingrained to the point that Congress declared Christmas a federal holiday (together with Independence Day, New Year’s, and Thanksgiving), though at the time it applied only to federal employees in the District of Columbia – then about 10% of the federal workforce.
Not coincidentally, just a few years after Queen Victoria’s marriage, Charles Dickens published “A Christmas Carol” in 1843. A best seller of its time, it aimed (in Dickens’ words) to “raise the Ghost of an Idea”: to recast the old tradition of 12-day Yule celebrations – a luxury the common folk could hardly entertain, and a factor in the mid-seventeenth-century Cromwellian revolt that abolished the celebration of Christmas along with the monarchy – into one day of lavish family celebration.
These secular traditions were all controversial in their time. Christianity as an organized religion resisted them, but as so often, eventually adopted an “embrace and extend” approach (e.g. nativity scenes under the evergreen). You’ll still find religious controversy to this day.
Some brief comments about the misguided inclusiveness notions of Kwanzaa and Hanukkah: Kwanzaa was deliberately invented and promulgated in 1966 by Maulana Karenga, professor and chairman of Black Studies at California State University, as an amalgam of “African” traditions (real and imagined). It’s particularly bizarre since many words in the Kwanzaa tradition (including “Kwanzaa” itself) are Swahili. Now, Swahili is more of an African-Arab creation: the word “Swahili” derives from the Arabic word for “coast”, and was the (Arab) reference to the people “of the coast”, i.e., of Eastern Africa. Hardly a genuine old tradition.
Hanukkah (where the leading ‘H’ is the ‘het’ sound, which doesn’t exist in English, though English speakers will recognize the similar ‘ch’ sound in the Gaelic word ‘loch’) is not quite as artificial as Kwanzaa, but merits some comment. Tradition says that Hanukkah celebrates the miracle of the Menorah (a seven-branched candelabra) that the Maccabees found after their victory over Syrian forces in the 2nd century BCE. But the prominence of the celebration did not develop until the late 19th century in eastern Europe, as part of the Zionist movement – and the development of a heroic tradition that grew into the creation of the State of Israel. It was more a political and cultural expression of identity than a religious celebration. (*)
What strikes me about these religious aspects of the holidays is indeed the lack of inclusion. Various religions will independently assert “embrace and extend” around what appear to have been generic, communal celebrations of the season. And grouping a set of non-inclusive approaches into a sentence does not make it inclusive. All you get is a set of separate groups of people.
To be inclusive is to emphasize the underlying theme of this time of year: I’d like to suggest the term used by my Nordic ancestors, and which still lives on, somewhat obscurely, in English: Yule. Though the etymology is in some dispute, I believe (probably for sentimental reasons) the version that the word derives from the old Nordic word “Hjól” which means “wheel”. The reference is to the completion of the year – the turn of the seasons has made a full revolution. Yule itself was a Germanic pagan celebration that dates to well over 1000 years ago. I’ll gloss over the fact that Yule was probably the full moon closest to the Winter Solstice, and not the solstice per se, though this does make 2010 an auspicious year to reset traditions: there’s only been one previous time since Year One (1) when a total lunar eclipse coincided with winter solstice, and that was 1638 (the next one is 2094).
While I’m at it, I will also further suggest that we merge Christmas and New Years into a week-long celebration of life, and change our Gregorian Calendar so that all months have 30 days, the remaining 5 or 6 days (depending on the year) being “Yule” or “Yuletide”. The tree is the Yuletree. This is something all people of the world can celebrate, together (The Southern Hemisphere is reversed, of course, but 90% of humans live on the Northern Hemisphere). Regardless of religious outlook, we can all celebrate life and family.
So, to all of you: Have a very Happy Yule!
(December 2014: minor fixes and alterations, including addition of Ġgantija reference, and the footnote below.)
(Footnote (*) from 2014 with more details than you probably want to read: Something interesting about the choice of the Maccabean Revolt for Hanukkah is that it celebrates an event that predates Judaism; what constitutes “Judaism” today is rabbinic Judaism which evolved out of the Levant around the 1st century C.E. Christianity was a roughly contemporary sect of Judaism. Both have the Masoretic Texts as the central scriptures. Today’s strict division between Judaism and Christianity did not happen until well after the destruction of the Second Temple in 70 C.E. Historians refer to the early sect as Jewish Christians, who in the first few decades after Jesus’ crucifixion around 30 C.E. would have viewed themselves as Jewish. The details have been hotly debated over the centuries, but some versions of history would have the killing of James the Just, Jesus’ brother and leader of his church after his death, as the trigger to the Jewish revolt that eventually led to the destruction of the Second Temple. Other versions hold the reverse, that James was killed in the Roman sacking of Jerusalem. Regardless, the destruction led to a dramatic reduction in the number of surviving copies of the Masoretic Texts. Now, the events marked by Hanukkah, namely the rededication of the Second Temple, occurred in 160 B.C.E. – that’s 230 years before the destruction of Jerusalem. In contrast, the current version of the Tanakh is thought to date to around 450 C.E., with the oldest surviving version being the Aleppo Codex from 920 C.E. The Christian Bible is both older and newer (as a defined collection of both OT and NT) and with a remarkably complex history: the oldest fragments of the NT are from around 120 C.E., the canon was mostly defined by the 200s, though a full consensus wasn’t reached until the 5th century. The oldest complete version of the NT is the Codex Sinaiticus from the 4th century.)
Joined Google, working on the “cloud”
Just a quick status update. A few months ago I started talking to Google about their “cloud” strategies. Today I’m starting week five as a Googler, working on just that. Can’t say much just yet, but it’s pretty cool. Stay tuned.
The Internet Revolution – History and Significance
Today I needed access to some old files. I had an old tar file from around 1998 holding several gigabytes of files. It came from large Unix servers, and for many years I had no easy access to its contents – Windows would cringe at the sheer number of files (43,502 of them, some very big and some with very long file names). Solaris, of course, had handled all of it easily 10 years earlier.
But now my main workhorse is a new i7-based MacBook Pro – so once again (10 years later) I am working with 64-bit Unix on a proper file system with multiple processors. So today I pointed my laptop at this old set of files. Mac OS X had no problems at all, and Spotlight reindexed everything in a matter of minutes.
One of the fun things I found was a PowerPoint that I made in 1997. It’s from a presentation I was invited to hold at Svenska Dagbladet in February of that year. SvD was at the time one of Sweden’s two large national newspapers. There was a debate within SvD about how significant the Internet would be for print media. I was somewhat established in the space at the time, having written a number of articles in 1995 and 1996 and held a few invited talks on the overarching topic of the significance of the Internet. I had personal web pages from April 1993, and had one of the first (if not the first) columns in mainstream print media that was connected to a web page where readers could post comments (first such column was in October 1995).
Tomorrow I will be attending IJ-7 at Stanford, so I was curious as to how well my old observations on the impact on Media would pan out.
See for yourself! 🙂
Joined Conformiq as VP of Marketing
(Update: left in September.)
Wind River (Intel) acquires Virtutech
Well, it’s official now: Wind River acquires Virtutech, the company I founded in 1998 together with a brilliant team of four co-researchers from SICS – Bengt Werner, Andreas Moestedt, Magnus Christensson, and Fredrik Larsson.
Simics will live on, but that wraps up Virtutech, and thus the end of an almost 19 year long project. So, for the history books, some retrospective …
Why are there 5280 feet in a mile?
The original Roman mile was 1000 paces (milia passuum), or 5000 feet. The modern mile was defined as 5280 feet under Queen Elizabeth I at the end of the 16th century, in order to reconcile multiple discordant measurement systems already in wide use. In particular, it was convenient to make it an even multiple of the long side of an acre, which since early medieval times was a rectangle 660 feet long and 66 feet wide. 660 divides neatly into 5280, eight ways, allowing a square mile (640 acres, or a “section”) to be conveniently quartered three times over: into quarter sections (160 acres), sixteenth sections (40 acres), and finally the 10-acre squares (660 by 660 feet) that remain at the base of all US land surveys to this day.
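The arithmetic above is easy to verify in a few lines:

```python
# Verifying the acre/section/mile arithmetic.
mile = 5280
acre_sq_ft = 660 * 66                  # one acre: 660 ft by 66 ft

assert mile == 8 * 660                 # the 660-ft side divides the mile 8 ways
section = mile * mile // acre_sq_ft    # acres in one square mile
print(section)                         # 640

# Quartered three times: 640 -> 160 -> 40 -> 10 acres,
# and a 10-acre square is exactly 660 ft on a side:
print(section // 4, section // 16, section // 64)    # 160 40 10
assert 10 * acre_sq_ft == 660 * 660
```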
The Fourth Wave
Some 4000 Google I/O attendees gave a standing ovation at the end of this morning’s keynote pre-launch of “Google Wave”. Google I/O attracts a reasonably savvy crowd, and this was not a Reality Distortion effect. What Google announced this morning is significant. It is the first candidate killer application for the Fourth Wave of Computing.
Google Wave is a smooth hybrid of email, instant messaging, photo sharing, discussion forums, wiki, and document management. It is best described in the words of the brother of its lead developer, who also delivered the bulk of the keynote:
In Google Wave you create a wave and add people to it. Everyone on your wave can use richly formatted text, photos, gadgets, and even feeds from other sources on the web. They can insert a reply or edit the wave directly. It’s concurrent rich-text editing, where you see on your screen nearly instantly what your fellow collaborators are typing in your wave. That means Google Wave is just as well suited for quick messages as for persistent content – it allows for both collaboration and communication. You can also use “playback” to rewind the wave and see how it evolved.
Lars Rasmussen, The Official Google Blog, 5/28/2009
The features are impressive, and the demonstration was awe-inspiring. We were treated to a symphony of technologies. What Google is cooking up is a blend of technologies and trends, and is not entirely simple to dissect.