III Publishing

Computation as Cancer
November 27, 2019
by William P. Meyers

In 1900, "computer" was a well-established occupation. A computer was a human being who performed repetitive math calculations all day long. Rooms filled with such people solved math problems for scientists and engineers, not so different from the rooms full of people adding up accounts for businesses. For the most part this required only the simplest of arithmetic skills: adding, subtracting, multiplying, and dividing. The workforce had grown quite large since its inception in the early 1600s. The slide rule was becoming common for computation where approximations would suffice, but it would not reach its peak use until the 1960s. Mechanical adding machines were just coming into general use by 1900.

The global population in 1900 was over 1.6 billion; it had been about 1 billion in 1800. There was considerable environmental damage, including species extinctions and the conversion of large natural areas to agriculture or wasteland, but nothing like today's scale. The industrial revolution had begun, fueled mainly by coal. Electricity was available in many cities in the advanced countries, but in homes it was mainly used for lighting. Telegraph wires connected the world, and telephone wires were becoming more common. But when it came to computing in 1900, there just was not very much to do. In the little stores of the era, when an item was running low, a clerk would add it to the order list. No computation necessary. Pictures were taken with film; no computation necessary.

As the world enters 2020, few humans do computation with pencil and paper. Slide rules are antiques. Even the handheld electronic calculators so beloved of engineers and students starting in the 1970s have been subsumed into smartphones. The world population is around 7.7 billion people. Most objective people will admit the world is dying, eaten up by industry and agriculture, heating up from greenhouse gas emissions. And by 2025 perhaps 25% of global energy use will be for computers and computation.

The invention first of vacuum tubes and then of transistors enabled the shifting of computing from human beings to electronic machines. At first only the most computation-intensive tasks were done on mainframe computers: calculations for atomic weapon design and the encrypting and decrypting of military messages. Then more ordinary engineering uses became common, and businesses used computers to track customer lists and to speed up bookkeeping and accounting. Computers shrank in size as they grew in computing power, from mainframes to minicomputers to desktops, laptops, tablets, and smartphones.

There are only so many bridges to be built by engineers. Despite the growth of human population and economies, there is only so much banking and bookkeeping to be done. But by the 1980s the cost of computing (excluding the externality of environmental and mental destruction) had been reduced to the point that frivolous uses for computing began to grow at exponential rates.

Graphics, first still images and then video, required far more computing power and memory than mere accounting or engineering numbers. The computer manufacturers loved it. Every time the public thought computers were powerful enough, they found out they were wrong. Games sucked up power. Photo editing sucked up power, but video editing really, really sucked up power.
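
A rough back-of-envelope sketch makes the gap concrete. Every figure below is an order-of-magnitude assumption for illustration, not a measurement: a page of plain text runs a few kilobytes, while a single uncompressed 1080p frame is millions of bytes, and raw video multiplies that by the frame rate.

    # Back-of-envelope comparison of text vs. graphics vs. video data sizes.
    # All figures are rough assumptions for illustration.
    text_page = 3_000                     # ~3 KB: one page of plain ASCII text
    frame_1080p = 1920 * 1080 * 3         # ~6.2 MB: one uncompressed 24-bit color frame
    video_per_second = frame_1080p * 30   # ~187 MB: one second of raw 30 fps video

    print(f"One 1080p frame ~ {frame_1080p / text_page:,.0f} pages of text")
    print(f"One second of raw video ~ {video_per_second / text_page:,.0f} pages of text")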

Server farms and cloud computing sprang up like mushrooms in odd places around the world. Computation had become malignant, eating at economies and culture like glioblastoma, a fast-growing brain cancer. A single child playing with a video app on a cheap smartphone does more computation than a mainframe computer was capable of in 1960, or than all the human computers of the world in 1900 combined.
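
That claim holds up as rough arithmetic. A minimal sketch, assuming a cheap 2019 smartphone sustains on the order of a billion simple operations per second, a 1960 IBM 7090-class mainframe managed on the order of a hundred thousand, and a human computer did about one operation every five seconds; the size of the 1900 workforce is likewise an assumption.

    # Order-of-magnitude sketch; every figure here is an assumption.
    smartphone_ops = 1e9           # cheap 2019 phone: ~1 billion simple ops/second
    mainframe_1960_ops = 1e5       # IBM 7090-class machine: ~100,000 ops/second
    human_ops = 0.2                # one human computer: ~1 operation per 5 seconds
    human_computers_1900 = 50_000  # assumed worldwide human-computer workforce

    print(f"Phone vs. 1960 mainframe: {smartphone_ops / mainframe_1960_ops:,.0f}x")
    print(f"Phone vs. all human computers of 1900: "
          f"{smartphone_ops / (human_ops * human_computers_1900):,.0f}x")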

Multiply that by 7 billion people, billions of whom own at least a smartphone. The cancer is killing the planet and killing the smartphone people, but they are in denial: they have not gone to a doctor, and they love the cancer itself.

If any organization is to be singled out for the destruction of planet Earth, perhaps we should consider the graphics organization known as SIGGRAPH (Special Interest Group on Computer Graphics), rather than the coal barons or petroleum patriarchs. The group held its first conference in 1974 in Boulder, Colorado, with about 600 in attendance. It too grew like a cancer. By the 1980 conference (a year before the IBM PC was introduced), 7,500 people attended. In 1984, the year the Apple Macintosh brought computer graphics to the desktop, 20,390 people attended. The peak year was 1997, when the Los Angeles conference drew 48,700 computer and entertainment geeks.

By the year 2000 the metastasis should have been obvious, but for most people the Internet was still too slow to transmit good-quality video in real time without buffering, and most personal computers could not compute fast enough for rapid photo editing, much less video editing. But as those capabilities grew, things that had been done with text, requiring little computation, started being adorned with graphics and then video.

A lot of things contribute to global warming and environmental destruction. Video games, Instagram, and TikTok are seen as harmless, but they are aspects of a monster that grows without limit, that consumes without putting food on the table or a roof over people's heads. If people really want to save the planet, in addition to driving electric vehicles or commuting by walking, they should turn off the e-sports and most of their smartphone apps.

But if I had a time machine, I would go back and destroy that first SIGGRAPH conference.

And stop having more than one child per couple. Just stop.
