Tactile Trauma

I can remember, back when I was a young adult, reading a science fiction novel/novella, probably an ACE or some similar, whose central character/hero/protagonist had been raised in a fundamentalist – I forget whether religionist or naturist – enclave and, now living in general society, was unable to use its “advanced” technology without breaking it.

I used to have a secondary report who was such a Komputer Klutz that we had to have three computers for him. (This was back in the DOS/early Winders days.) One on his desk, and two in, or in transit to/from, the shop. He was knowledgeable about computers, for someone who had to learn coding in the workplace, but never really made the transition from mainframe to PC.

This is all a lead-in to a rant: I Hate Touchscreens! I get bitten by them at least once a day. This morning it was my MP3 player at the gym. We used to be able to buy good MP3 players with real button switches that we could hack and get to do what we wanted. Nowadays you can’t get MP3 players that aren’t almost useless and exorbitantly priced. This one is of that genre.

But I also get bitten by my cellular telephone and my slablets. All running ANDROID, which is saved from being the planet’s worst OS only by the continued existence of MegaHard and Winders.

I am also not really happy with keyboards and mice. The keyboards are expensive – Blue Switches – and between my fumble fingers and the spontaneous generation of crud under the key-caps, I get mistypes. Which I can usually – almost always – fix without a hard reboot. Similarly, my mousing dexterity is never up to what seems to be an ever-decreasing pointer sweet spot on screen. And my mouse pad keeps shrinking.

But I reserve my hatred for touch screens.

Internet Survival

Two Day. Off to gym, and with this being holiday season for the rest of the English-speaking nations of the planet, I tried a new podcast, the Guardian’s “Chips with Everything.”[Link] Since this is British, I feel I need to remind the locals that these chips have nothing to do with Lay’s, Golden Flake, or Paul Bryant. It is supposed to be the Guardian’s take on society and technology. I downloaded a four-part series on the internet as a sample.

After listening to two of the four episodes, I have to admit to being underwhelmed. The disenchantment started when one of the journalists claimed to be an expert on the internet. Not that such experts don’t exist, but they aren’t journalists. Journalists, even British ones, are like teachers. They know a lot about journalism (teaching) but have very little content knowledge. This is particularly the situation with STEM stuff. A good article has maybe 0.5 of its information accurate and trustable. The rest is stercus. And that’s a good journalist. The bad ones satisfy Sturgeon’s Rule.

So I got into distrust mode early, since the talker claimed the highly improbable. That distrust was – as later demonstrated – fulfilled by some of the crap uttered.

I was particularly nauseated by the statement that the slablet in the pocket gives one access to all the information one would ever need.

Stercus! First of all there are lots of things that aren’t on the internet, or can’t be accessed well with a slablet. More importantly, what is missing here is knowledge. Just because you own an encyclopedia doesn’t mean you know and understand it.

The series was motivated by the recent UN declaration that internet access is a basic human right. This raised the question of whether the internet is really a survival thing. The obvious answer is a resounding NO! So the edict is a bit specious.

And if it weren’t, I could happily relate how the Yankee Republic is a totalitarian state for denying access to something like a third of the population.

But we already knew it was such, didn’t we?

Digital Direction

Six Day. Running a bit behind, mostly due to slow cognition. Anyway, ran across this article [Link] entitled “This Guy Made a Replica of The Computer That Helped the Robinson Family 50 Years Ago” yesterday, and my thinking was led more to the picture than to the effort of the individual.

Seems the fellow made a replica of a 1950s Burroughs box (mainframe?) that was used in numerous television programs as the computer character. This led me to consider what we should be doing with computers.

(The above is not, I believe, the replica, because the article has an embedded video and I don’t need the overhead that goes with it.) This is what we used to call a “Geblinkenen Kluge”.

By the time I got to university, these Rube Goldbergs were pretty much replaced by pastel-hued metal boxes on isolation flooring in frigid glass-walled rooms. Some of them had as much as 64Kb of RAM. Hard drives were the size of refrigerators and cost as much as a college education. And you coded in FORTRAN (nerds) or COBOL (bog-geeks). What we did was number crunching. The nerds were doing science and engineering stuff, and we spent more time studying numerical methods than languages. The bog-geeks were doing accounting or inventory stuff, and I have no idea what they studied, since it was taught in the Business schule, which was on a side of campus I only visited in summer term to watch old movies. The only things controlled by computers in those days were terminals and plotters.

In 1984, I bought my first box, an IBM PC with an 8088 CPU, 64Kb RAM (16Kb on the MB) and two floppy disk drives. In 1986, after two years of budget whining, I got funded to buy a (generic) PC for half the folks in our organization. I had to have a blessing from the IT Tsar of the post, which took over a year, and I can recall him asking me, “What can you do with one of these toys that a terminal off my mainframe can’t do better?” I answered, “Are you going to let me run a word-processing program on your mainframe?” At which point he turned to his deputy and asked, “What the H**l is a word-processing program?”

Once told the answer he cursed bluely for five minutes while trying frantically to approve my purchase.

Nothing, in my opinion, symbolizes the PC so much as word processing. Even more than spread-sheeting, word processing – not coding – is the epitome of the PC stage of computing. 

In those days, most STEM graduates learned to code in college. Their bosses either on-the-job or not-at-all. That was the peak of computer literacy. Since MegaHard took over the cardiovascular system of the corporate organization, it has been downhill to planned illiteracy.

So today, in the slablet age, computers have gone from tools to appliances, providing either entertainment or controlling our machines. Number crunching is arcana. Coding is a blue collar craft, if that.

So where did we go wrong?

Podcast Poo

One Day. Back to gym. A bit atypical. I listened to the last half of “Linux Luddites” episode #82 [Link] instead of an episode of “Best of Ideas”. I rationalize this based on the absurd Canadian imitation of British holiday durations.

Anyway, one of the speakers made the announcement that he sometimes did not use a box on a given day because he could do what was needed on his cellular telephone.

I mentally upchucked at that point.

The model of the cellular telephone is the Star Trek communicator, not the handputer from The Mote in God’s Eye.

Handputers do not work. I know. I have owned a couple, going back to the late ’70s. They are nice toys with limited functionality – more than a calculator, but not much more. But they aren’t boxes.

Nor is a slablet. In fact, it’s a mediocre communicator.

If I think about what I do on both box and slablet, it comes down to email. And none of the email apps available – that I have tried – come within 10 dB of Thunderbird. I can maybe read email on my slablet. In practice, what I can do is ‘hawg’ the inbox so that when I get back to the box I don’t have to get rid of as much trash.
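For those who don’t think in decibels: 10 dB is a factor of ten in whatever goodness metric you care to posit. A minimal sketch of the conversion – the metric itself is, of course, hypothetical:

```python
def db_to_ratio(db: float) -> float:
    """Convert a decibel difference to a linear ratio (power convention)."""
    return 10 ** (db / 10)

# A 10 dB shortfall means roughly one tenth the capability.
print(db_to_ratio(10))   # 10.0
print(db_to_ratio(-10))  # 0.1
```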

But that’s about it. I can’t do spreadsheets or write code or compose articles or any of the other things I do on a box. Because the screen is too small, the I/O is too poor, and keying is one atom up from impossible. And it’s slow. Horribly slow. And no floating-point math processor. And I can’t imagine trying to graph stuff on the slablet. Heck, it will barely work as a conventional phone.

So please quit telling me that I can do box work with a slablet. It’s a social lie. Quit, or I shall dose your coffee with phenolphthalein. And be done with you for a while. Not that I expect you are capable of learning. From the experience or otherwise.


Despite what my colleague Normal Angular Momentum claims, sometimes our mistakes surprise us. They fall into a category I label wait-long-enough-and-you’ll-do-it-again. One such is being a chair – general, program, … – of a conference. There was a period in my life when I got mousetrapped into doing such. I found the long-enough was about five years. That was how long it took me to forget the pain and suffering enough to get talked into being a chair again.

Anyway, the one that is the subject of this blot is Canonical and their tribe of ‘buntu. Canonical, with the help of competent volunteers whom they perpetually micturate upon, maintains a group of Linux distros/versions differing primarily in their GUI-Desktop. They all share at least some of the bedrock of a somewhat mutilated version of Debian.

The thing that bites is their over-the-internet updates. These occur every six months for most versions but every two years for the corporate long-term-support versions. Years ago they offered two ways to update: a downloadable ISO that one burned to a DVD/CD, and an over-the-web stream. Over a period of years I learned that the probability of failure in upgrading via the first method was very low, o(0.01), while the probability of failure using the over-the-web method was almost exactly 0.5.
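Those two failure rates compound rather differently over a few years of upgrades. A back-of-the-envelope sketch, assuming independent upgrades at the rates estimated above:

```python
def p_at_least_one_failure(p_fail: float, upgrades: int) -> float:
    """Probability of at least one failed upgrade, assuming independence."""
    return 1 - (1 - p_fail) ** upgrades

# Six upgrades, i.e. three years of six-month releases:
print(round(p_at_least_one_failure(0.01, 6), 3))  # ISO method: 0.059
print(round(p_at_least_one_failure(0.5, 6), 3))   # over-the-web: 0.984
```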

The level of my deep OS surgery skills is so low that when an upgrade fails I have to reinstall from scratch. That’s what comes of my priorities, and I accept the burden. But I still rebel at Canonical’s negligence in this matter. I have run lots of other distros and none of them have failure probabilities greater than o(0.03). So when it comes to upgrade failure, Canonical is the Linux equivalent of Alibam.

But I have this OLD Dell Latitude D420 lapbox that I like because it has a good volume and mass and keyboard – unlike all more recent lapboxes except maybe the Lenovo just mentioned. It had version 12.04 of a ‘buntu variant on it, and I was a bit concerned about replacing it.

So I did an update to 14.04 yesterday. Slow but successful.

Did an upgrade to 16.04 this morning. Slow and catastrophic.

Absolutely modal sampling, isn’t it?

Anyway, so I loaded the 32b version of LMDE – one of the few contemporary distros supporting 32b and non-PAE, that is, OLD boxes – onto a stick and had it up and going in thirty minutes. It’s a bit slow, but so was 14.04, and I can live with it. And enjoy another five years of repulsion and horror for Canonical.
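For the record, putting a distro ISO on a stick is the usual dd incantation. A sketch – the ISO filename and /dev/sdX are placeholders, not real paths; check your device with lsblk first, and note the dd command is echoed here rather than executed, since dd aimed at the wrong disk is fatal:

```shell
#!/bin/sh
# Sketch: write a downloaded ISO raw to a USB stick.
# ISO and DEV are placeholders -- substitute your own after checking lsblk.
ISO="lmde-32bit.iso"
DEV="/dev/sdX"

# Echoed rather than executed, since dd to the wrong device destroys it.
echo "sudo dd if=$ISO of=$DEV bs=4M status=progress conv=fsync"
```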

Stercus for OS

Winders has really become a crappy OS.

Last week I ordered a refurbed Lenovo lapbox off Woot. Came with WX. Specs far in excess of price, as expected with refurb.

Got into the box yesterday. Wasted two hours trying to get WX to set itself up and then convincing it to let me get to the “BIOS”.

Finally rebooted and noticed a brief message. Rebooted again with the ESC key downheld and was rewarded with access to the “BIOS”. A bit of recon revealed Lenovo had already enabled the box for dual boot.

I suddenly identified with how Nayland Smith must have felt admiring the devious Oriental mind of Fu Manchu.

Spent ten minutes yanking out the WX HD and replacing it with a shiny new SSHD I had bought on sale. Meanwhile putting the ISO on a stick. Then fifteen minutes from inserting the stick, downpushing the power button, and doing a one-time boot-order change to a finished install of SolydK on the box.

And then an hour doing updates for the period since the ISO was made. But that’s not a downer. 

Timed the second reboot. Seventeen seconds. 

Winders has really become a crappy OS.

Self-destruction technology?

Two Day. Gym was VERY sparse. Podcasts were sciencey. None particularly memorable. The Guardian Science podcast dealt with some London conference on tactile simulants. That is, how can the sense of touch be implemented in virtual reality? They have a name for this that doesn’t stick, largely, I think, because the obvious name – tactics – is already taken by the military and is so ingrained in most humans that double-purposing will be useless.

The bestest (?) way to do this would be direct brain stimulation. Then you wouldn’t need any mechanical devices except the stimulator and maybe some sort of insulation from “real” reality. 

But all of this raises the question of why we are so bent on running away from reality. Is it because, whatever we think reality is, we can’t cope with its negative sides? So we invent a new reality.

For most of the two-million-year history of genus Homo, we existed in the reality of Nature. Technology was limited to what we could build quickly using locally accessible materials. And then either discard or carry with us. The extent of possessions was determined by what we were willing to carry over and above necessities. And we existed largely at the whim of Nature. If the weather turned bad, we either found shelter quickly or endured it. If someone got injured, they died quickly or healed well enough quickly. If someone got sick, either they got well or everyone died. If food was scarce, everyone either went without or died.

Notice the common thread? So somewhere along a few thousand years ago we tried sedentary living so we could have more stuff. And not have to hustle for shelter. But when the population overwhelmed us – more children were living because they weren’t underfed by tired mothers and dragged about so much – we invented agriculture. And then civilization. And as near as we can tell that was a lot more miserable than being hunter-gatherers, so we invented different forms of entertainment. Escapism.

This wasn’t new. There was storytelling and discussion – once we invented language – around the fire as hunter-gatherers, but that was mostly for the purpose of keeping the fire lit and the predators at bay. This new form was to divert us from the nastiness of our new social reality.

So the inference is that we keep inventing new forms of ways of denying the nasty bits of our new reality. And then we invent new nasty bits.

Hence, virtual reality?