The Books Ngram Viewer from Google Labs provides a fascinating insight into language usage over the past 200 years. An Ngram is a contiguous sequence of one or more items, in this case a word or phrase from a published text. Google’s viewer plots the frequency of occurrence for Ngrams found in books published since 1800. It is possible to narrow the search to a specific collection of books, or corpus. Available corpora include American English, British English, English Fiction, and others. Researchers at Harvard University’s Cultural Observatory have put together some tips for using this data and have coined a new word:
Culturomics – The application of high-throughput data collection and analysis to the study of human culture.
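The Ngram idea itself is simple enough to sketch in a few lines of code. This is just an illustration of the definition above, not how Google’s viewer is implemented:

```python
def ngrams(tokens, n):
    """Return the list of n-grams: every run of n consecutive items in tokens."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Bigrams (2-grams) of a short phrase:
words = "the quick brown fox".split()
print(ngrams(words, 2))
# [('the', 'quick'), ('quick', 'brown'), ('brown', 'fox')]
```

Counting how often each Ngram appears in a corpus, year by year, gives you the frequencies the viewer plots.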
Zachary Sniderman published an interesting article on Mashable titled Just How Open Is Your Internet? I thought the featured map looked a little fishy: Mongolia has less internet censorship than Sweden? Really? What does that mean? It seemed to me that something might be missing from the account. To his credit, Mr Sniderman does note:
…it raises some inherent problems with defining “censorship.” For example, screening out child pornography and illegal file sharing technically registers as “censorship” even though most people wouldn’t consider that a human rights offense.
Even accounting for his concerns it still looked a bit odd to me.
The Antikythera Mechanism was discovered on May 17th 1902, by archaeologist Valerios Stais when he was diving on the Antikythera wreck off Point Glyphadia on the Greek island of Antikythera. The wreck is believed to have sunk in the 1st century BCE and has yielded many spectacular artifacts. The most mysterious of these is the Antikythera mechanism, a solid lump of corroded bronze gears. It has taken over a century, the latest imaging technology, and decades of research from a few dedicated scholars of mechanical engineering to piece together what the mechanism did.
I put together this playlist of short YouTube videos. Together they describe the latest advances in understanding the mechanism and how it worked. The four videos take about 20 minutes to watch; just click on the first and all four will play.
I have been experimenting with Python 2.3 and MySQL 4.0.13 recently, using a copy of my Movable Type 2.661 database as a sandbox. Before I started, I spent a few minutes working out the structure of the database. This Entity Relationship Diagram is what I came up with. I expect version 3.0 of Movable Type is different, but just in case anyone else is digging around in Movable Type and could use a map, here is a PDF version.
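Working out the structure mostly means walking the tables and their columns. Against the real Movable Type database you would run SHOW TABLES and DESCRIBE in MySQL; here is the same idea as a self-contained sketch using Python’s sqlite3 module, with a toy schema whose table and column names are only illustrative stand-ins for the real mt_* tables:

```python
import sqlite3

# Illustrative stand-in for a few Movable Type-style tables.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE mt_blog    (blog_id INTEGER PRIMARY KEY, blog_name TEXT);
    CREATE TABLE mt_entry   (entry_id INTEGER PRIMARY KEY,
                             entry_blog_id INTEGER, entry_title TEXT);
    CREATE TABLE mt_comment (comment_id INTEGER PRIMARY KEY,
                             comment_entry_id INTEGER, comment_text TEXT);
""")

# List every table, then its columns -- the raw material for an ER diagram.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
for table in tables:
    cols = [row[1] for row in conn.execute("PRAGMA table_info(%s)" % table)]
    print(table, cols)
```

The foreign-key relationships (entry_blog_id pointing at mt_blog, and so on) are what the arrows in the ER diagram capture.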
This is the home of the Computer Evolution File. This file attempts to provide a comprehensive graphical representation of the evolution of the modern computer for the period 1934 to 1950. The file is licensed under an attribution, share-alike Creative Commons license. Please feel free to download it and make improvements and derivative works. Please send a copy of any changes to me and I will share the updates on this page.
foobar @ bigfoot.com
Some time between 1934 and 1950 the first modern computer was created. Pinning down exactly when that event occurred is not easy. It depends on how you define the term computer and what you think is more important: the concept, the design, the first successful test, or the first time the machine solved a real problem. In those early days it usually took years for a team to progress from concept through design to working machine. There were many such teams, working mainly in the US and UK. These teams competed and cooperated; sometimes they shared ideas and designs, and they sent representatives to visit each other's laboratories. On one famous occasion in the summer of 1946, almost all the leaders in the field got together at the Moore School for an eight-week series of lectures. In short, the story of the emergence of the modern computer is a complex one that involves both direct and indirect contributions from many people.
There are many computer history timelines in existence, but all of them suffer from the same flaws: they are incomplete, and their linear nature fails to capture the complex web of influence that was the hallmark of computer development.
Downloadable files available here
In an effort to visualize this web of interaction, I have started to develop a graphical representation of the evolution of the modern computer. Fortunately, AT&T have kindly released a package called Graphviz which is capable of drawing complex directed graphs. The graph above is produced by Graphviz from a text file.
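Graphviz reads a plain-text description in its DOT language. A minimal sketch of the kind of influence graph involved might look like this (the machine names and edges here are illustrative, not taken from the actual file):

```dot
digraph computer_evolution {
    rankdir=LR;  // lay the timeline out left to right
    // Each edge records a known influence of one machine on another.
    "ENIAC (1945)"        -> "EDVAC design (1945)";
    "EDVAC design (1945)" -> "Manchester Mk I Prototype (1948)";
    "EDVAC design (1945)" -> "EDSAC (1949)";
    "Zuse Z3 (1941)"      -> "Zuse Z4 (1945)";
}
```

Running this through the dot tool produces a drawn graph; the real file works the same way, just with many more machines and edges.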
The text file contains a detailed description of my approach, the classification I have used, and lists all the machines and the references to the data sources I used. I have not duplicated that information here because the whole point of the exercise is to gather all the data in one place.
I have licensed this file under an attribution, share-alike Creative Commons license, so please feel free to download and improve what I have started. If you do make changes, please send me a copy and I will share the updates on this page.
For the record, I believe that the Manchester Mk I Prototype was the first computer in the modern sense. But the text file is not intended to prove that this or any other machine was first. It is only intended to record the known dates and influences for computing machines designed between 1934 and 1950. I believe the graph is complex enough to support many interpretations.
RMS (Risk Management Solutions) is a small US company that specializes in catastrophe models for the insurance industry. These models cover natural perils such as earthquakes, hurricanes, and other windstorms. In 2002 RMS produced a report entitled Assessing Workers Comp Risk from Earthquakes: What if the 1906 Great San Francisco Earthquake occurred today? The point of the report was to draw the attention of catastrophe risk managers in the insurance industry to the potentially high costs of workers compensation in large catastrophes. It also makes fairly sobering reading for people who work in San Francisco.
RMS assumed the replay of the Great Quake would occur at peak office occupancy: mid-afternoon, mid-week. The diagram below shows the relative ground shaking used to calculate potential losses.
The following table shows the potential losses in workers compensation from a repeat of the 1906 earthquake compared to equivalent losses from the World Trade Center Attack.
| | 1906 San Francisco Earthquake Repeat | World Trade Center Attack |
| --- | --- | --- |
| Workers Compensation Injuries | | |
| Workers Compensation Deaths | | |
| Workers Compensation Insured Loss | $2.5 – $5.0 billion | |
Amazing and totally inaccurate!
It’s difficult to know where to start in cataloging the faults with this diagram. So rather than waste my time trying I’ve started collating information to produce a better version. I’ll publish it here when I’m done.
I’m not sure where this diagram came from. It may be from “Computer Structures” by Gordon Bell, but the reference was unclear.
In 1999 Ryan McCormack and I wrote a marketing piece on globalization for Sapient Corporation. Aimed primarily at raising awareness of the issues involved in building global Internet systems, it also touched on national market analysis and selection. I was reminded of a diagram from that piece, showing income and connectivity for every country in the world, while reading various articles on US foreign policy recently. These articles included The Pentagon's New Map by Dr Thomas P.M. Barnett, a US military strategist on globalization and US foreign policy, and a more jaundiced view from the Washington Monthly.
Basically I think Dr Barnett is on to something when he claims that “disconnectedness defines danger”.
Show me where globalization is thick with network connectivity, financial transactions, liberal media flows, and collective security, and I will show you regions featuring stable governments, rising standards of living, and more deaths by suicide than murder. These parts of the world I call the Functioning Core, or Core. But show me where globalization is thinning or just plain absent, and I will show you regions plagued by politically repressive regimes, widespread poverty and disease, routine mass murder, and “most important” the chronic conflicts that incubate the next generation of global terrorists. These parts of the world I call the Non-Integrating Gap, or Gap.
Having said that I think Dr Barnett is on to something, I don't think his conclusions are correct. He makes three mistakes: