On Learning to Code. Or Not.

Alert! Jeff Atwood wrote an excellent post about the “learn to code” movement.

He starts with a tirade full of incredulity about Mayor Bloomberg’s New Year’s resolution to learn to code with Codecademy.

“Fortunately, the odds of this technological flight of fancy happening – even in jest – are zero, and for good reason: the mayor of New York City will hopefully spend his time doing the job taxpayers paid him to do instead.”

Let’s put aside the princely sum of $1 that His Honor collects from the job. Let’s even put aside that Mayor Bloomberg is doing exactly what he’s supposed to be doing – promoting New York’s bustling tech industry. More to put aside: our Mayor happens to be a technology pioneer with a ridiculous IQ.

This all comes down to a very difficult question: should people learn nerdy things when they have little use for them, just for the sake of learning?

I remember a Livejournal discussion that was hashed over and over in the Russian-speaking community. A math teacher was stumped by a question from his student: why was she supposed to learn trigonometry when she wanted to become a beautician? The teacher did not come up with a good answer, but the livejournalers did dig up some awesome reasons. One well-meaning education-for-the-sake-of-education zealot said something to this effect: well, if you work with nail polish, tangents and cotangents figure prominently in formulas that deal with the reflectiveness of thin films. That will lead to a greater understanding of how and why nail polish looks the way it does.

On the surface it may seem that Mayor Bloomberg has about as much need to know how to code as a beautician needs to know about sines and cosines.

There’s more: executives who learned a little bit about writing code at some point tend to say the following phrase: “oh, I don’t know much about writing code, just enough to be dangerous”. They say it with this look on their faces:

Jeff takes this further with the plumbing analogy: since almost everyone has a toilet, should everyone take a course at toiletacademy.com and spend several weeks learning plumbing?

Normally I’m against education for the sake of education. I once argued for a whole hour with a co-worker who felt that _any_ education is worth _any_ amount of money. I did not know at the time that he held degrees in Psychology of Human Sexuality, Biology, Sociology and Communications. He must have been on to something: he made an amazing career while mine took a nosedive soon after that discussion.

Here’s where Jeff is wrong (I know, this is shocking, Jeff being all wrong and such): it is better to push people to learn incongruous things than to tell them that this is a bad idea. Steve Jobs learned calligraphy in college and it turned out to be super useful. He might not have become a master calligrapher, but man, did that piece of esoteric knowledge change the world.

When I was in college I badly wanted to take a scientific glass blowing class, but did not. I deeply regret that.

Are there people who learned plumbing from This Old House and now annoy contractors? Yes. Are self-installed refrigerator ice maker lines causing millions in water damage? Yes. Is the world better off because Richard Trethewey taught it some plumbing? Absolutely.

If anything, attempting to learn to code will make people more compassionate towards coders. I do believe that people who are not already drawn to programming are not likely to become programmers; more than that, they are not likely to sit through a whole RoR bootcamp or worse. The learn-to-code movement is not likely to lure in bad programmers, but it might give people some understanding of what coders go through and make them more hesitant to have loud yelling-on-the-phone sessions near their cubes. Mayor Bloomberg, who enforces open workspace policies everywhere he works, might understand why programmers need offices. Jeff, let His Honor code a bit.

Burying the Lead

Every time I reread my blog posts, the same thought comes to my mind – “man, I buried the lead again”.

I learned about leads from “Made to Stick: Why Some Ideas Survive and Others Die” by Chip and Dan Heath. It is a short book, but one that influenced me deeply. Every blogger out there should read it.

“Burying a lead”, in the jargon of journalists, means boring the reader before getting to the juicy part. A “lead” or “lede” is the first sentence of the story.

In the book, there’s an anecdote about a journalism teacher giving his students an assignment:

“… They would write the lead of a newspaper story. The teacher reeled off the facts: ‘Kenneth L. Peters, the principal of Beverly Hills High School, announced today that the entire school faculty will travel to Sacramento next Thursday for a colloquium in new teaching methods. Among the speakers will be anthropologist Margaret Mead, college president Dr. Robert Maynard Hutchins, and California governor Edmund “Pat” Brown.’”

Apparently, most students produced a lead that lumped all these facts into a single sentence. The teacher read all the submissions and then announced:

“The lead to the story is ‘There will be no school next Thursday’ ”

I have a huge problem with writing in the “inverted pyramid” style. The juicy parts of my posts are usually at the bottom.

Think about it: most blog readers, especially the ones that matter, suffer from ADD and often do not get to the bottom of the article. This means they won’t link to it and won’t Digg it.

I am trying to improve, but writing is a difficult art to master. I just wish I had taken more writing classes.

iWoz: Computer Geek to Cult Icon: How I Invented the Personal Computer, Co-Founded Apple, and Had Fun Doing It

The mastermind behind Apple sheds his low profile and steps forward to tell his story for the first time.

Before cell phones that fit in the palm of your hand and slim laptops that fit snugly into briefcases, computers were like strange, alien vending machines. They had cryptic switches, punch cards and pages of encoded output. But in 1975, a young engineering wizard named Steve Wozniak had an idea: What if you combined computer circuitry with a regular typewriter keyboard and a video screen? The result was the first true personal computer, the Apple I, a widely affordable machine that anyone could understand and figure out how to use.

Wozniak’s life—before and after Apple—is a “home-brew” mix of brilliant discovery and adventure, as an engineer, a concert promoter, a fifth-grade teacher, a philanthropist, and an irrepressible prankster. From the invention of the first personal computer to the rise of Apple as an industry giant, iWoz presents a no-holds-barred, rollicking, firsthand account of the humanist inventor who ignited the computer revolution. 16 pages of illustrations.

Of Wangs And Core Dumps

I started learning programming on a Soviet computer called Iskra 226, a few of which were given to our after-school program by some kind Navy bureaucrats. I vividly remember finding a BASIC program already stored on the hard disk that cheerfully asked a few questions about the weather and the megatonnage of a warhead and then quickly calculated the size of the epicenter, the severity of fallout and whatnot. The teacher was not amused and asked me to delete the program before anyone else had a chance to see it.

Although the Iskras turned out to be less popular with other kids, who preferred Soviet knockoffs of the Sinclair Spectrum with their good graphics and buttloads of nice games that could be loaded from audio cassettes, I preferred the loud monochrome-screened monster. You see, Iskras had peripherals: a dot matrix printer that sounded like a machine gun and a humongous hard drive that sounded even louder.

Later I learned that the Iskra was a clone of the Wang 2200 computer. And even later I learned a bit more about Dr. Wang’s company. So, continuing my Computer History Through Coffee Mugs series, I present to you a prized mug from my collection:

As it turns out, Dr. An Wang also happens to be the inventor of magnetic core memory, a technology that always fascinated me. Here is a core memory plane from my collection:

Core memory stores bits by sending current through donut-shaped rings of ferrite. The Wikipedia article explains how this works. Early core memory arrays used a small number of larger ferrite cores. Later ones, like the one in the picture above, used buttloads of tiny little cores. From what I heard, these amazing devices were assembled by third-world garment workers. By hand. Under microscopes. If you have any doubt that this is true, take a look at these close-up shots that clearly show that this is done by hand:
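For the curious, the addressing trick that makes those grids of donuts practical can be sketched in a few lines of Python. This is a toy model of my own, not anything from the Wikipedia article or real drive electronics: the idea is that a core flips only when both its X and Y wires carry a half-select current, so a whole plane of cores needs only rows-plus-columns worth of drive wires.

```python
# Toy model of a coincident-current core memory plane (a conceptual
# sketch, not a simulation of real drive electronics).
# A core at (x, y) flips only when BOTH its X and Y drive lines carry
# a half-select current, so a 64x64 plane needs only 64 + 64 drive
# lines to address 4096 individual cores.

class CorePlane:
    def __init__(self, rows, cols):
        self.bits = [[0] * cols for _ in range(rows)]

    def write(self, x, y, value):
        # Energizing X line x and Y line y selects exactly one core:
        # only the core at their intersection sees a full-select current.
        self.bits[x][y] = value

    def read(self, x, y):
        # Reading is destructive: the core is driven toward 0, and a
        # pulse on the sense wire reveals whether it held a 1.
        value = self.bits[x][y]
        self.bits[x][y] = 0
        # Real hardware immediately rewrites the bit it just erased.
        self.bits[x][y] = value
        return value

plane = CorePlane(64, 64)
plane.write(3, 5, 1)
print(plane.read(3, 5))  # -> 1 (the rewrite cycle restores the bit)
```

The destructive read is why every read cycle in real core memory is followed by a rewrite cycle.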

Jay Dubya Zee shed some light on how horrible the job of the people who assemble camouflage nets is. Think about how much worse doing something like this must be:

How much RAM is this, you might ask? The back of the card holds a label. It says:
Lockheed Electronics Company, Inc.
Data Products Division
Core Memory 8k x 18
2001002326-1A1 HK022
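Working out the label’s arithmetic (assuming, as the notation suggests, 8,192 words of 18 bits each; what the extra bits beyond 16 were for is not something the label says):

```python
# Capacity of the Lockheed card, assuming "8k x 18" means
# 8,192 words of 18 bits each.
words = 8 * 1024              # 8k = 8,192 words
bits_per_word = 18
total_bits = words * bits_per_word
print(total_bits)             # 147456 bits
print(total_bits // 8)        # 18432 bytes, about 18 KB of hand-threaded cores
```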

These days core memory is still used in aircraft and spacecraft because it keeps its contents when the power is off and is supposedly less prone to radiation damage.

These days the word “wang” mostly means “penis”, serves as a common Chinese restaurant name, and is used on t-shirts, as a sentence enhancer, or just at random. Also, inexplicably, “wing-wang” is another name for a dollar.

Memory dump files are called “core dumps” to this day because of core memory. Also, it is common to refer to core dumps of dilithium and chockolatium.

The World Is Your Spittoon

BoingBoing writers don’t seem to be able to shut up about betel lately. This reminded me of a 4th or 5th grade report on India (great friend of the Soviet Union, emerging economy, blah, blah) that I had to do in school back in the Soviet times.

I remember the teacher getting very interested in betel chewing and prematurely praising it as a great habit. Then I reminded her that importing such a thing would mean having to deal with bright red spit all over. It’s not like the Soviet Union did not have its own share of hygienically questionable customs.

Back then Soviet sci-fi writers promised us Communism with goods being teleported right into our crystal palaces for free from anywhere on the globe. I guess that (and my flying car) did not work out, but today Capitalism brings us the ability to order almost anything from almost anywhere through electronic computers.

So, if I want to try some betel all I need to do is pick between The Basement Shaman and Shaman Palace or many other fine merchants.

Who knew that there are so many stores catering to shamans. At least now I know where the Suburban Shaman shops.

Hmm, I guess I could get one of these and try to rid my cubicle of the sick building syndrome.

Dream Blog: Destroyer Of Worlds Or Darn Usability

Here’s a dream I had recently:

A girl who was partially my wife and partially somebody else got hold of an incantation that could destroy the world. She pronounced it and world destruction began. My former English teacher uploaded an “antidote” function into my PowerShot G3. I tried to execute the function. I kept pressing buttons and scrolling through menus, but could not find it. A popup window (which appeared in the air, not on the camera’s screen) gave me an ominous warning: “World destruction in progress. Now only elementary math functions and the contents of this room remain”. The lens and the electronics of my camera were gone, leaving only a shell with buttons. Even though a bit of time remained, I could no longer access the menus. And that’s when I woke up.

Interestingly, this is at least the second dream with a camera that refuses to work.

The Legend Of Darius McCollum

I remember reading in the papers about a 15- or 16-year-old train-obsessed kid who faked his way into signing out an MTA train and driving it for a long stretch, only to be caught after an automatic switch disabled the train for speeding. For some reason I thought that the story happened in the early nineties, but it looks like it actually happened much later. I also remember that the kid was not punished too strongly and had a chance to work for the MTA.

I always wondered what happened to him. As it turned out, instead of getting a job at the MTA, Darius McCollum had an amazing career impersonating MTA workers and recently ended up getting a five-year prison sentence.

There was a big long article in Harper’s Magazine about all this:

“Before leaving his girlfriend’s apartment in Crown Heights, on the morning of his nineteenth arrest for impersonating and performing the functions of New York City Transit Authority employees, Darius McCollum put on an NYCTA subway conductor’s uniform and reflector vest. Over his feet he pulled transit-issue boots with lace guards and soles designed to withstand third-rail jolts.”

Ooooh, I want those boots.

“Darius spent hundreds of hours watching trains at 179th Street. He estimated the angle of every track intersection in the yard. By the time he was eight, he could visualize the entire New York City subway system. (Later he memorized the architecture of the stations.)”

That’s heavy duty Asperger’s for you.

“By this time Darius had cultivated a constellation of admirers at the 179th Street yard. Darius has always been deeply disarming. His charm resides in his peculiar intelligence, his perpetual receptivity to transporting delight, and his strange, self-endangering indifference to the consequences of his enthusiasm. Darius never curses. He has no regionally or culturally recognizable accent. He has a quick-to-appear, caricaturishly resonant laugh, like the laugh ascribed to Santa Claus, and he can appreciate certain comedic aspects of what he does, but he often laughs too long or when things aren’t funny, as when he mentions that he briefly worked on the LIRR route that Colin Ferguson took to slaughter commuters. Darius litters his speech with specialized vocabulary (“BIE incident,” “transverse-cab R-110”) and unusually formal phrases (“what this particular procedure entails,” “the teacher didn’t directly have any set curriculum studies”). He frequently and ingenuously uses the words “gee,” “heck,” “dog-gone,” “gosh,” and “dang.””

I actually know what “transverse-cab R-110” is. It’s one of those newer prototype trains with a full width cab.

“It is unlikely that Darius will omit the year he spent wearing an NYCTA superintendent’s shield. While he was doing a stint as a conductor, he discovered that he could have a shield made in a jewelry store. He began wearing it on a vest he pulled over his TA-specified shirt and tie. He had a hard hat and pirated I.D. Darius considered himself a track-department superintendent, so he signed out track-department vehicles and radios and drove around the city, supervising track maintenance and construction projects and responding to emergencies.”

Amazing. In fact, it looks like he did a pretty good job. But still got some hard time for it.

“‘In any event,’ Berkman said, ‘I don’t understand what the point is. … So far as I can tell there’s no treatment for Asperger’s. That is number one…. Number two, Asperger’s would not disable him from knowing that he’s not supposed to forge credentials identifying him as an employee of the Transit Authority and go in and take trains or buses or vans or cars or other modes of transportation, which I gather has been his specialty….’”

And I completely agree with the judge.

Beeeep beeeep beeeep beeeep SLAP

Most people I know don’t like their sleep to be interrupted. I, on the other hand, as long as I don’t have to get up right this minute, don’t mind being woken up multiple times.

First of all, the actual process of falling back asleep after quieting the harsh beep of the alarm clock is a very pleasant experience. Second, I find that a short series of naps is more refreshing than a long “wow, how long was I out” sleep. I also find that a series of alarms has a much greater chance of waking me up from REM sleep. This is the best way to wake up: the brain is already active and the dreams can be easily recalled.

At some point I wanted to make an alarm clock that would detect either eye movements or the brain waves associated with them and wake me up during REM sleep. Understandably, for lack of time, skills and gumption I never got further than playing with a BASIC Stamp microcontroller and reading EEG newsgroups. I suck.

Anyhoo, this morning, between the infamous 9-minute alarm clock buzzes, I had two dreams.

In the first one, Tema came for a visit to America. We went to explore the power station at Brooklyn College. Tema had a really old-looking key that opened the gate. As a side note (not a part of the dream): Brooklyn College has some very interesting infrastructure. There are tunnels connecting all the buildings, a power plant, a heat plant and a buncha other interesting things. I’ve heard that there is a linear accelerator somewhere. Right. So we explored the area around the power plant a bit, and I pointed out Monk parrots to him.

You’d think I went clubbing with him in the second dream, but I didn’t. Instead I was still in Brooklyn College. My high school English teacher was giving a lecture standing behind a podium in the middle of the quad. He said: “the time now is [don’t remember] and the temperature is 28 Therms”. I asked him if there was a thermometer on his podium that measured temperature in “Therms” (I think a Therm is actually 100,000 BTUs). He said that that was the case. For some reason I called him Alex, even though his name is Alan.

WML: Dude, I Am Getting a Dell

Guess what? This post is going to be about microcomputers. PCs.

I never owned a computer in Soviet times. Not even a programmable calculator. I did have access to some old Wang clones called Iskra (Spark) in an after-school program, played with a neighbor’s programmable calculator, played games on a friend’s PC, played games at my father’s friend’s work computer (also a PC), paid to play games on Sinclair computers that some enterprising people set up as a pay-per-play arcade, etc. Oh, I still remember the horror in the eyes of my teacher when I found a set of programs that calculated the level of contamination from a nuclear blast given the input of wind speed, bomb yield and some other variables. Those Iskras were donated by the Red Navy.

In the US, my father purchased a 386 for the humongous sum of $1300. It was put together in some computer shop on Avenue U. That was in 1992 or 1993, I think. Since then, I’ve been upgrading my computer on average once every three years. In all, I think I went through 3 cases, 6 motherboards and 2 monitors (not counting my wife’s computer). I never owned a brand name computer. After the second computer I learned that I could put one together myself.

It seemed like a good idea at the time, putting together my own stuff. What could be simpler? Pop in a motherboard, a video card, a modem, some RAM, some hard drives — and you’ve got a box!

I’ve become thoroughly familiar with what cuts from a ragged computer case feel like. I’ve learned how hard it is to be without the Internet when your computer is in pieces on the ground (and the driver needed to make the new hardware run is on the Internet, of course). There are very few types of flashable hardware that I did not have to flash. I accumulated a huge collection of computer screws, cables, cards and thermal processor grease.

The questions that went through my mind were:

Why are jumpers so tiny? (these days they have jumpers with little tails that can be pulled out with just your fingers)

Why are IDE cables so hard to deal with? (there are rounded cables available now)

Why is it so hard to find pin 0 on the hard drive connector? (newer IDE cables come with a little peg that doesn’t allow them to be plugged in the wrong way)

Which idiot came up with PS/2 plugs? (one word: USB; well, ok, three words)

This is all slowly changing, of course, but the much bigger problem of minor factory defects and incompatibilities between chipsets still plagues individually bought components.

My last self-assembled box, a dual-processor PIII 1000, sucks ass. I could not get a single AGP video card to work with it. An IDE RAID controller that worked OK on my previous motherboard would cause every OS to crash. And finally, two of the little pegs that held the cooler on the processor broke, and I couldn’t keep the PIIIs from overheating.

I’d like to say that after I removed the RAID card and put in a PCI video card, the system ran extremely steadily for a year. Now it’s time to think about the future of my computers.

So my resolution is this:

1) Throw out the crappy dual-processor motherboard and the crappy coolers. Buy a nice, cheap and super-steady single-processor PIII motherboard plus a stock Intel cooler and turn that computer into a file server. Four 120 Gig 5400 RPM drives (I don’t need the speed, and those drives run much cooler) should do the trick. The case of that computer is very nice and cool looking (it’s a square). It looks like this:

Maybe I’ll even make the drives removable, but so far all removable racks that I’ve tried sucked ass.

2) Buy a nice Dell workstation. That will be used for image manipulation and coding.

3) Buy a big ass LCD monitor (or maybe one of those Sony 27″ CRT monitors) for use with the workstation.

4) Buy a tablet pc for myself and a laptop for my wife.

5) Donate or sell on eBay all the crappy hardware still sitting in my drawers.

I think all the money I saved this year on rent should easily buy me this hardware.

Good, Better, the Best / Never have a rest / ‘Till Good becomes Better and / Better becomes the Best

That was a little rhyme that my English teacher used to help us remember the irregular comparison forms of the adjective “good”.

In New York, cops are called “New York’s Finest” and firefighters are called “New York’s Bravest”.

But let’s not forget the lesser-known, adjectivally described services:

NYC Department of Sanitation Workers: “New York’s Strongest”
Corrections Officers: “New York’s Boldest”
TLC Enforcement Officers: “New York’s Proudest”
EMS drivers and paramedics: simply “New York’s Best”.

And they are all exactly that, and more.