The Books Ngram Viewer from Google Labs provides a fascinating insight into language usage over the past 200 years. An Ngram is a series of one or more items from a sequence, in this case a word or phrase from a published text. Google’s viewer plots the frequency of occurrence of Ngrams found in books published since 1800. It is possible to narrow the search to a specific collection of books, or corpus. Available corpora include American English, British English, English Fiction, etc. Researchers at Harvard University’s Cultural Observatory have put together some tips for using this data and have invented a new word:
Culturomics – The application of high-throughput data collection and analysis to the study of human culture.
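Since an Ngram is just a sliding window over a sequence, extracting Ngrams from a text is straightforward. Here is a minimal sketch in Python (the function name and sample sentence are my own illustration, not anything from Google’s viewer):

```python
def ngrams(tokens, n):
    """Return all n-grams (as tuples) from a sequence of tokens."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

# Bigrams (2-grams) over a four-word sentence yield three consecutive pairs.
words = "the quick brown fox".split()
print(ngrams(words, 2))
# → [('the', 'quick'), ('quick', 'brown'), ('brown', 'fox')]
```

Google’s viewer then counts how often each such word sequence appears per year across its scanned corpus.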
Zachary Sniderman has published an interesting article on Mashable titled Just How Open Is Your Internet?. I thought the featured map looked a little fishy: Mongolia has less internet censorship than Sweden? Really? What does that mean? It seemed to me that something might be missing from the account. To his credit, Mr Sniderman does note:
…it raises some inherent problems with defining “censorship.” For example, screening out child pornography and illegal file sharing technically registers as “censorship” even though most people wouldn’t consider that a human rights offense.
Even accounting for his concerns, it still looked a bit odd to me.
The Antikythera Mechanism was discovered on May 17th, 1902 by archaeologist Valerios Stais while he was diving on the Antikythera wreck off Point Glyphadia on the Greek island of Antikythera. The wreck is believed to have sunk in the 1st century BCE and has yielded many spectacular artifacts. The most mysterious of these is the Antikythera mechanism, a solid lump of corroded bronze gears. It has taken over a century, the latest imaging technology, and decades of research by a few dedicated scholars of mechanical engineering to piece together what the mechanism did.
I put together this playlist of short YouTube videos. Together they describe the latest advances in understanding the mechanism and how it worked. There are four videos, which take about 20 minutes to watch. Just click on the first video and all four will play.
One day towards the end of 2000, my wife finally lost her financial patience with me. As she threw a pile of bills and receipts at me she screamed, “That’s it! I’m never doing the accounts or paying the bills again. You never collect receipts, you never write a memo in the checkbook, and your work expenses are impossible to understand!” She then stormed out of the room. I was in no doubt that she meant it, and she has remained free of the household accounting burden ever since. I, unfortunately, have not. It is true that until that day I had never balanced a checkbook in my life and habitually threw bank statements in the trash without even opening them – another contributing factor to my wife’s rant. In fact, the only time I ever knew my bank balance was when the ATM refused to dispense cash. So it was with great trepidation that I began my fiscally responsible life. I figured that, as I had designed and built large financial software systems, I ought to be able to use a small one. So after some research I selected Quicken, as they had a large share of the market and a version for Mac.
Today Vannevar Bush (rhymes with achiever) is often remembered for his July 1945 Atlantic Monthly article As We May Think, in which he describes a hypothetical machine called a Memex. This machine contained a large indexed store of information and allowed a user to navigate through the store using a system similar to hypertext links. At the time of writing his essay, Bush knew more about the state of technology development in the US than almost any other person. During the war he was Roosevelt’s chief adviser on military research, responsible for many wartime research projects including radar, the atomic bomb, and the development of early computers. If anyone should ever have been capable of predicting the future, it was Vannevar Bush in 1945. He is an almost unprecedented test case for the art of prediction. Unlike almost anyone else before or since, Bush was actually in possession of ALL the facts – as only the head of technology research in a country at war could be.
When the history of early software development is written it will be a travesty. Few historians will have the ability, and even fewer the inclination, to learn long-dead programming languages. History will be derived from the documentation, not the source code. Alan Turing’s perplexed, handwritten annotation “How did this happen?” on a cutting of Autocode taped into his notebook will remain a mystery.
What kind of bug would stump Alan Turing? Was it merely a typo that took a few hours to find? A simple mistake, maybe? Or did the discipline of the machine expose a fundamental misconception and thereby teach him a lesson? The only way to know would be to learn Autocode.
A few months ago I had to set up a home office and decided I would take the opportunity to upgrade my home network. My Linksys BEFSR41 EtherFast Cable/DSL Router had never given me any problems, so I decided to upgrade to the Linksys BEFW11S4 Wireless-B Broadband Router. I now have everything working reliably, but getting to this happy state and resolving the problems took a lot of luck, and in the end the solution was far from obvious. Judging by the bad reviews on Amazon and elsewhere, it appears that many people have been unable to fix similar problems with this device. Below is my description of the problem and a solution that worked for me. Hopefully this will help others, but as always, your mileage may vary!
Some time between 1934 and 1950 the first modern computer was created. Pinning down exactly when that event occurred is not easy. It depends on how you define the term computer and on what you think is more important: the concept, the design, the first successful test, or the first time the machine solved a real problem. In those early days it usually took years for a team to progress from concept through design to working machine. There were many such teams, working mainly in the US and the UK. These teams competed and cooperated; sometimes they shared ideas and designs, and they sent representatives to visit each other’s laboratories. On one famous occasion, in the summer of 1946, almost all the leaders in the field got together at the Moore School for an 8-week series of lectures. In short, the story of the emergence of the modern computer is a complex one that involves both direct and indirect contributions from many people.
There are many computer history timelines in existence, but all of them suffer from the same flaws: they are incomplete, and their linear nature fails to capture the complex web of influence that was the hallmark of computer development.
Downloadable files available here
In an effort to visualize this web of interaction, I have started to develop a graphical representation of the evolution of the modern computer. Fortunately AT&T has kindly released a package called Graphviz, which is capable of drawing complex directed graphs. The graph above is produced by Graphviz from a text file.
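For anyone who wants to experiment, a Graphviz input file is plain text in the DOT language. A minimal sketch of the kind of influence graph I mean might look like this (these few machines and edges are illustrative only; the real data file is much larger):

```dot
digraph computers {
    rankdir=LR;            // draw the timeline left to right
    node [shape=box];

    // edges point from an influencing design to a machine it influenced
    "ENIAC"  -> "EDVAC design";
    "EDVAC design" -> "Manchester Mk I Prototype";
    "EDVAC design" -> "EDSAC";
}
```

Running a file like this through the `dot` command produces the rendered graph.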
The text file contains a detailed description of my approach, the classification I have used, and lists all the machines and the references to the data sources I used. I have not duplicated that information here because the whole point of the exercise is to gather all the data in one place.
I have licensed this file with an attribution, share alike creative commons license. So please feel free to download and improve what I have started. If you do make changes please send me a copy and I will share the updates on this page.
For the record, I believe that the Manchester Mk I Prototype was the first computer in the modern sense. But the text file is not intended to prove that this or any other machine was first. It is only intended to record the known dates and influences for computing machines designed between 1934 and 1950. I believe that the graph is complex enough to support many interpretations.
In 1946, between 8th July and 31st August, the Moore School of Electrical Engineering at the University of Pennsylvania held a special course entitled Theory and Techniques for Design of Electronic Digital Computers. The course was organized in response to the interest generated by the school’s public announcement of the ENIAC and by the publication of John von Neumann’s 1945 First Draft of a Report on the EDVAC. Attendance was by invitation only, and the “students” were selected from the leading experts at the major institutions working on the development of computing devices in the US and UK. At the time of this event there were only three published designs for a stored program computer, and it was expected that all those present were familiar with these documents.
Within two years of these lectures the first stored program computer was operational; within three years there were five operational machines; and within five years stored program machines were commercially available. The Moore School Lectures, as they became known, were responsible for focusing all the leading developers of computing devices on a single problem: how to design and build a stored program computer. It is interesting that, despite being outnumbered and out-funded, the British took, and held, the lead in this development effort between 1946 and 1953. In some areas, such as business applications, the British held the lead for much longer. How they were able to do this is not directly explained in any of the historical material available online, which tends to focus on individual development efforts rather than on the larger picture.