“Life was mostly unpleasant, brutish, and short; the legal status of women in the UK or US was lower than it is in Iran today: politics was by any modern standard horribly corrupt and dominated by authoritarian psychopaths and inbred hereditary aristocrats: it was a priest-ridden era that had barely climbed out of the age of witch-burning, and bigotry and discrimination were ever popular sports: for most of the population starvation was an ever-present threat. I could continue at length. It’s the world that bequeathed us the adjective “Dickensian”, that gave us a fully worked example of the evils of a libertarian minarchist state, and that provoked Marx to write his great consolatory fantasy epic, The Communist Manifesto. It’s the world that gave birth to the horrors of the Modern, and to the mass movements that built pyramids of skulls to mark the triumph of the will. It was a vile, oppressive, poverty-stricken and debased world and we should shed no tears for its passing.”—Charlie Stross describing the Victorian Era
But then I think about the types of machines that were considered computers thirty years ago, like my beloved Commodore 64, or the Apple IIes at school and at the library, or the monochrome IBM PCs used in businesses back in the day, and I have to ask: can you do more with an iPad or iPhone than you could on those machines?
I think the answer is obvious. Just from the specs alone, phones and tablets have CPUs that are orders of magnitude faster, more RAM, and more storage space than their 8-bit counterparts from yesteryear.
One might argue that one thing you can do on an 8-bit computer that you can’t on a phone or tablet is actually write code, but this is not really true. For one thing, some machines (like the Atari 400) didn’t even have BASIC or a machine language monitor built in—you had to buy a separate cartridge. But even on other machines, BASIC and machine language were all you had.
But, if anything, the development model for phone and tablet software is similar to the 8-bit days: you often wrote and compiled your code on a machine far more powerful than the machine the code actually targeted.
I think the definition of a computer has always been fairly inclusive. Just because 8-bit microcomputers weren’t as powerful as massive mainframes didn’t mean they weren’t computers. And just because phones and tablets aren’t as powerful as notebooks and desktops doesn’t mean they aren’t computers either.
The reason I’m skeptical about the problem of overpopulation and Malthusian apocalypses is that nature doesn’t really seem to work that way. It’s like economists and their ilk look at the textbook growth curve of bacteria (http://en.wikipedia.org/wiki/Stationary_phase_(biology)), fixate on the exponential growth phase, and worry only about the death phase, completely ignoring the steady-state equilibrium of the stationary phase. But the stationary phase can last a long time; there’s evidence that the stationary phase is probably the state bacteria are usually in (http://www.ncbi.nlm.nih.gov/pubmed/16415927). Granted, humanity is still in the exponential growth phase, but I think there are signs that this is finally starting to level off: declining birth rates, decreasing economic prosperity. I think we’re coming to a threshold. But it doesn’t spell our immediate doom. It just means the old paradigms won’t work anymore and we have to figure out how to live sustainably. The death phase will come eventually, but as far as we can tell, that’s true for the entire universe, and we can’t escape it.
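The growth-curve shape I’m describing is captured by the standard logistic model, where growth looks exponential while the population is small and then levels off at a carrying capacity. Here’s a minimal sketch (my own illustration, with made-up parameter values, not anything from the sources linked above) using a simple Euler integration:

```python
def logistic_growth(n0, r, k, dt, steps):
    """Integrate dN/dt = r * N * (1 - N/K) with a basic Euler step.

    n0: initial population, r: growth rate, k: carrying capacity,
    dt: time step, steps: number of steps. Returns the population history.
    """
    n = n0
    history = [n]
    for _ in range(steps):
        n += r * n * (1 - n / k) * dt
        history.append(n)
    return history

# Illustrative values only: start with 1 cell, carrying capacity 1000.
pop = logistic_growth(n0=1.0, r=0.5, k=1000.0, dt=0.1, steps=400)
# Early on, N is tiny relative to K, so (1 - N/K) ~ 1 and growth is
# effectively exponential; later the curve flattens out near K.
```

The point of the exercise: the same equation that produces the scary exponential phase also produces the long flat stationary phase, so fixating on one half of the curve misses the other.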
“To summarize: it is a well-known fact that those people who most want to rule people are, ipso facto, those least suited to do it. To summarize the summary: anyone who is capable of getting themselves made President should on no account be allowed to do the job. To summarize the summary of the summary: people are a problem.”—Douglas Adams