Stupid Graph Tricks

My pal Esther, who I hope hires me again someday soon, tweeted about Indeed.com, a site that tracks job listings over multiple sites and allows you to graph trends in relative and absolute terms. (She was lamenting the rise of squooshy terms like “social media” over terms like “editor” and “writer”–though anyone who’s been involved with “social media” knows there is a desperate paucity of writers and editors out there.)

So, I did my own research–and after determining that nobody anywhere ever needs the skills I have–I went further afield, as encapsulated in the graph below:

As you can see, job opportunities for “sex”, “drugs” and “rock and roll” have been on the rise over the past four years. What’s fascinating–and by fascinating, I mean utterly meaningless–is that even though drug demand rises and falls, sex demand spikes then plateaus, and rock and roll rises steadily, they all pretty much go up at the same speed.

So, I guess this means putting “sex, drugs and rock and roll” on your resumé could only improve your job prospects.

Or maybe I’m not the sort of person who should be using these tools.

Don’t Be Eeeeeeeevil!

From the fevered dreams of a madman department (via Instawhatsis): Michael S. Malone posits that Google’s new browser “Chrome” is a stealth bomb (stealth bomb? Let it go, I’m on a roll) in their silent war to CONTROL THE WORLD’s data.

There’s actually a rebuttal from a guy AT Google to the “stealthy” point. As for Chrome itself, I heard about it third hand, from someone who was annoyed by all the other people telling him about it. I downloaded it and–it’s interesting. I think it’s probably a look at the next evolution in browser design. It’s seriously uncluttered.

But of course I realized, in doing so, that this was going to be an entrée into gathering more data on us. Duh. That’s where Google makes its money. They are looking to control a lot and they make no secret of it. They’re counting on the organization/mining abilities they give you to compensate for the lack of privacy.

There were similar issues with Gmail. “Oh, no! Google is going to give you all the space in the world but they’re going to pay for it by reading your e-mail!” Well, yeah. But they’re not judging you when they do it.

Of course, no one is actually reading your private e-mail. Don’t flatter yourself. Nobody cares. There’s just an algorithm, like the one that checks for spam, only this one checks for advertising keywords. Besides, don’t you know the rule of not sending anything over e-mail you don’t want the whole world to read?
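If you want a mental picture of that algorithm, a toy sketch like the one below is roughly the idea–my illustration, not Google’s actual system–nothing fancier than mechanical keyword matching, with no human anywhere in the loop.

    #include <iostream>
    #include <map>
    #include <sstream>
    #include <string>
    #include <vector>

    // Toy sketch only: tally hypothetical advertiser keywords found in one message.
    int main() {
        std::vector<std::string> keywords = {"pizza", "guitar", "vacation"};  // made-up ad keywords

        std::string email = "Anyone want pizza after band practice? I finally restrung my guitar.";

        std::map<std::string, int> hits;
        std::istringstream words(email);
        std::string word;
        while (words >> word)                        // split the message on whitespace
            for (const auto& kw : keywords)
                if (word.find(kw) != std::string::npos)
                    ++hits[kw];                      // count each keyword occurrence

        for (const auto& h : hits)                   // an "ad server" would just pick the top scorers
            std::cout << h.first << ": " << h.second << "\n";
    }

Scale that up with smarter matching and an auction on the keywords and you have a business model; the point is that the “reader” is a loop, not a person.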

Should we watch out for Google? Sure, it’s smart to be aware of any company that has such a huge influence on the world, ‘net or otherwise. And–most people don’t realize this–their company rule is not “Don’t Be Evil” it’s “Don’t Be Eeeeeeeeevil.”

So, there’s some slack there.

Pizza & Programmers

In an article over at CIO, Esther Schindler wonders about the magic of pizza (and a few other food items) in getting programmers to work overtime. There are some noteworthy things about this.

  1. I should probably be reading CIO more.
  2. Esther, who’s an old pal of mine, is both seriously cute, and seriously technical.
  3. Actually related to the article: Programmers and other IT geeks love what they do.

Right now, what do I do? I design and code software for various purposes, and build special-purpose Set-Top Boxes (STBs) for people (like TiVo on steroids).

If I had all the money in the world, what would I do? Design and code software for various purposes, and build even cooler STBs for people. I’d probably work the content angle harder, too–besides super-powered media devices, people need ways to get unfettered content–and that takes a lot of money, or at least more than I have. (For example, I could build or buy a cable network. That kind of money.)

But the point is, I’d be doing almost exactly the same thing that I’m doing now. It’s like Office Space: “What would you do if you had a million bucks? Apart from two chicks at the same time.” I’d need about two million, I think, what with the family to support, but maybe with the investment stuff I’ve learned I could get by on less. (Heh. Get by on less than a million? Can’t be done!)

I know programmers better than other classes of Information Technology types, and I’ve never known one worth a damn who didn’t spend a considerable amount of their free time working on other (non-work) projects.

To give you a personal example, I was heavily into music in my teens and early 20s (as many of us are), and played guitar, keyboards, etc. I also wrote a program that allowed me to create musical scores. This was before there were many of these, sure, but even now, I might pick up one of the modern tools and dislike something about it, and my reaction would be to write my own. That’s what programming geeks do.

I waxed a bit on D&D in an earlier post this week: I can’t tell you how many D&D related programs I wrote in my youth, other than “a lot”.

In my martial arts years, I wrote a program to manage tournaments. It was awesome. Anyone who’s ever been to a karate tournament in particular, and probably most sporting events, knows how poorly organized they are. My sensei told me to slow things up because I was moving people through so fast that there wasn’t enough time to sell them concessions (which is a big revenue generator at tournaments).

This really isn’t all that unusual among craftsmen. I have a friend who’s a master woodworker. He might go to Ikea to pick up a lawn chair, if he really doesn’t care about it. But most likely, he’s gonna build what he wants, exactly as he wants it, to exactly suit his needs. That’s what he does.

So, back to the pizza point: Basically, we’re going to be coding because that’s what we love to do. But we don’t like to be taken advantage of any more than anyone else, though we’re probably less aware of it. Little, consistent gestures, such as pizza, sodas, snacks, oddball breaks–other stuff I outline in my CIO article–all tend to reflect an appreciation.

There’s an ego issue, too. We all have war stories. Overtime is part of the culture, and if you’re being productive, is its own kind of bliss. Think not of teamsters for whom overtime is an excuse to bump the paycheck, think of Michelangelo slaving over the Sistine Chapel. I once stayed at a job, ooh, eight months longer than I knew I should have because I wanted to finish the project.

There was huge, huge stress in my life as a result. I ended up out of work right about the time the tech bubble burst. Hell, I ended up in court as a result. But I’d been nursing some ideas about–well, something really, really technical–and had designed all kinds of theories around this thing, and I was just so, so close to testing them out.

This led to some of the worst times in my life and yet, if I had to do it all over again, I might just.
I ended up proving to my satisfaction the feasibility and limitations of the ideas, and developed a system that was gratifyingly high-performance and low-cost. (Smarter now, I hope, I’d probably just ditch the whole situation and apply the theories elsewhere.)

Note Google’s extremely clever “free day”. A job where I’m encouraged to pursue some wild hare? Oh, yeah!

So, it’s not a really big mystery. It’s just that the threshold for getting geeks to work overtime is lower. You can get a lot of mileage handing out cash, and raises and bonuses (bonii!) sure don’t hurt, but pizza can make the office a nicer place to be, without necessitating budgetary oversight, etc.

Two things execs don’t understand about geeks: 1) What they do or can do, so they often just completely misuse them; 2) That they love doing it, and really need oversight the other way. (At Melissa & Doug’s toy company, for example, they pretty much make you go home at six, I’m told.)

I’ve made the comparison before to musicians–you can almost always get musicians to work for free food–and it’s apt. Although, of course, musicians are more likely to actually be starving.

Cogs In The Machine

Here’s an interesting bit on hiring called “Hunting the 5-pound Butterfly”.

This is interesting to me because my resumé is ridiculous. I’ve been pruning it for years because listing all the tools I’ve worked with is absurd. It does get me some interesting job interviews but seldom any actual jobs. And I’ve lost out on jobs because I had a specific subset of skills, but wouldn’t commit to having experience on a finer point that I was sure my interviewer didn’t actually understand.

Yet I consider my experience a fraction of my worth. And perhaps this goes back to being easily bored. Even if employment goes on for years, any given job should, in my view, be temporary. In other words, if I’m hired to do X, I want to do X in such a way that it doesn’t have to be done again, or in such a way that someone else can do it most of the time. Then I go on to do Y.

Often Y is something my employer never imagined me doing. Sometimes it’s something they didn’t imagine could be done. (I’m not suggesting any wizardry on my part, more a lack of imagination about how computers can be applied to problems.)

But 99% of employers–no matter how obsessed they may be with never ever bearing the cost of improving the skills of their employees–would not hire me (or anyone) on that basis. Nor on the basis that my free time tends to find me developing skills that will prove handy in my job. (My current boss is rather canny in that regard: He’ll hire someone with a kick-ass work ethic to do something they’re expert in, even though the project is relatively short-term. He knows that person will transition easily to new tasks.)

It’s the cog-in-the-machine mentality. Employers want to hire the perfect fit even though they don’t really know what that is. They think they do, for sure. Because they also figure that when job A is over, and job B pops up, they’ll fire the guy doing job A and hire a new perfect fit for job B. And of course, at no premium.

One might think that the dynamism of the past 30 years would have educated employers: Adaptability is the key element for any employee. Adaptability in the form of a willingness to learn and master change.

But I guess I’m biased here.

Another Non-Story: Low Unemployment = Workers Demand More


Via Slashdot.

NetworkWorld is running an article on the challenges of retaining young workers. Seems Johnny Y doesn’t want to work at Initech and expects to be well compensated for his skills and time.

So, the job market has equalized since the Tech Bubble burst, unemployment has stayed at historic lows (sub 5%), and, huh, tech people are demanding more out of work. Money, perks, decent working environment.

I’m gestating a theory that we are all Dory. Our capacity, as a group, to remember things farther back than just a few months seems to be severely limited.

For example, I bought a house in 1997. By the peak of the real estate frenzy, it had tripled in value. In the past year-and-a-half it has fallen 25% from its peak price. It’s still overpriced. If it settles at double what I paid in the next year or two–well, I’ll probably still think it’s overpriced.

The whole time this was going on, any rational person could see that it couldn’t possibly last. At least around here, we go through this cycle every, oh, 15-20 years? (I’m hoping to make a jump when the market crashes again: It’s not the price of the house but the taxes that are the killer.)

Of course, bets are off if the government gets involved. It can screw things up in some remarkably persistent ways. Combine bloated government with an ossified industry (used to using the government as a crutch) and, well, you end up with this.

The tech industry is just a remarkable series of Leonard moments (Leonard from Memento, the guy who can’t form new memories). Tech industry takes off in the mid-‘90s and I get lured out of my starving writer lifestyle to make a ton of cash. Businesses start complaining about not being able to hire workers–often using impossible job criteria–and push heavily for H-1B visas. People seem to be irritated that tech skills pay so well. A bunch of unqualified gold-rushers jump into the field and really screw things up.

Then the tech market crashes. Tech jobs start paying a lot less. (Businesses still moan about not being able to hire qualified workers.) Workers complain about the devaluation of their skills. A lot of gold-rushers (and good people, too) bail out. Those who remain hang on to their jobs while they can, hoping not to be forced out by a market flooded with competition willing to work on the cheap.

But IT work is like plumbing, electrical work, road building–it has to be done. Businesses can’t run competitively without it. (The remarkable inefficiencies I see every day–and the unwillingness of businesses to change to eliminate those inefficiencies–still surprise me.) As the economy recovers, tech workers get choosier–and businesses complain about not being able to hire (or, in this case, keep) qualified workers.

All this–all of it–is just detail-level supply and demand. Unlike my grandfather’s generation, I’ve never worked for anyone with the expectation that they’d keep employing me. (There was one exception, where I was repeatedly told otherwise, which should have set off my radar.) When I was younger, I would happily jump from contract to contract for more money (though I’ve always had a rule about not leaving work undone).

This is a good thing: I learned a lot about how companies work, about the value of a good, competent boss, and the dangers of incompetent ones, about the difference between liking someone and trusting them, about friendly work environments versus well-organized ones, where money fits into all this, and so on.

Those going out for the first time into the workforce today should realize that all things must pass, economies are elastic, and that gathering your rosebuds while ye may is not necessarily a bad plan. Try not to bitch about it, though, when your value drops. It’s not personal.

And realize, no matter what, employers (as a group) are going to always bitch that you’re not available enough, that you’re not grateful enough, that you want too much money or too much influence.

It’s as natural as forgetting what happened yesterday.

On Programming Language Popularity

My old pal and CIO editor Esther Schindler has written a blog entry with the deliberately inflammatory title of “Is Computer Language Popularity Important?”

Well, you gotta drive traffic somehow.

And it is a good, if hoary, question. While Esther follows up the question of which language is “best” by adding “best for what?”, it should be fairly obvious that the continuation of “Is popularity important” is “important to whom?”

Before going into that, though, let’s pick some points (hey, it’s almost a fisking):

At the developer level, language choice is completely personal. Like picking a brand of hammer or carving knife, the tool should fit well in your hand. Some people think better in Python than in PHP, and God help anyone who thinks in LISP.

The tool analogy is probably the only one available, but it’s just not a very good one. If I use CodeGear’s JBuilder to program a web-based front-end in Java, the language isn’t what parallels the brand–the product (CodeGear JBuilder) is.

So, is the language the tool, then? No, that’s not a really good analogy either. The environment (JBuilder’s IDE, debugger, etc.) is really the tool. That’s what you use to put your materials together. And therein lies a clue: The language is sort of like the materials–it’s the medium in which you build. (The platform comes into play as well, like a foundation.)

I doubt anyone would consider the materials used in building to be “completely personal”. But since human-written code is all mulched down into the same zeroes and ones, there is an interchangeability you get as far as the final product goes.

But even there, the analogy falls apart because code seldom dies. Code is a factory you use to produce software. For any real project, you’ll spend the vast majority of your time going back to the factory to retool the product.

There might be a political correctness to saying the decision is completely personal, but no coder believes that. There are quantifiable differences between toolsets and languages. Otherwise, “best for what?” would make no sense as a question.

Programming language adoption follows the genetic rules behind duck imprinting: the first creature a duckling sees is perceived as its parent. My first programming languages were PL/1 and FORTRAN, and despite writing production code in several others, from standards like C and REXX to obscure languages like Mary2, I can still write FORTRAN code in any language.

A lot of people believe this. And I suppose it must be true for some people. They tell me they think using language, so I believe them. When I tell them I don’t think using language–I think using thoughts that I then translate into language–they often don’t believe me.

Esther and I used to hang out on CompuServe’s Computer Language Magazine Forum with such luminaries as Jeff Duntemann and David Gerrold. (Just to name two: That forum was a freaking Algonquin Round Table of programming genius. It was there I learned what little humility I have.) If memory serves, David was on the language side–but he’s also a professional writer of quality sci-fi, so you know, he’s got that goin’ for him.

When I program, I think of what I want to accomplish–the end result–and then I organize it using whatever conceptual tools I have handy (often graph paper, if I need to write stuff down), and then (and only then) do I code it.

I learned, in order: Basic, Assembler, PL/I, Pascal, C, Smalltalk, Rexx, C++, Eiffel, Java…well, and it gets blurry after this point, but I’ve used most of the common scripting languages (PHP, Perl, Python, Javascript, Ruby) as well as C#. I occasionally still reluctantly do macros in Basic, but I spent most of my seven years programming Basic hoping for something better. (PL/I was great but I sure didn’t have it on my 8086.) I’ve probably done more Pascal programming in the past 20 years than any other language but I surely don’t think in it. (At least, not until the last possible moment.) But I’ve gone through periods where I’ve done nothing but REXX or C# or whatever. (Right now I’m primarily doing Delphi-style Pascal, Squeak-flavored Smalltalk, SQL, Flash, and Javascript.)

In fact, to my mind, a good language is one you don’t think about much: Smalltalk, say, versus C++, with its subtleties and ambiguities.
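A couple of toy lines, mine and not Esther’s, to show the kind of thing I mean–every line below is legal C++, and every one makes you stop and think about the language instead of the problem:

    #include <iostream>
    #include <string>

    int main() {
        double half = 1 / 2;               // integer division happens first: half is 0.0, not 0.5
        std::cout << half << "\n";

        std::string s = "foo";
        std::cout << s + "bar" << "\n";    // std::string concatenation: prints "foobar"
        std::cout << "foo" + 1 << "\n";    // pointer arithmetic on the char array: prints "oo"
    }

In Smalltalk, 1 / 2 is just the fraction 1/2, and 'foo' + 1 doesn’t quietly mean something else–it fails loudly. There’s simply less language to keep in your head.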

Smalltalk has probably been the biggest influence on me, and I had been programming for 10+ years before I learned it. (But it’s also not just a language, so it can change the way you see the game.)

How do you decide which languages are “acceptable” in your shop? Is it because your favored development environment (such as Eclipse or .NET) builds in support for one language suite? (Thus shops choosing Visual Studio would bless C#, VB.NET, and occasionally C++, with a smattering of IronPython accepted.) Or do you look for what programming languages are “in” and thus it’s easy to hire developers affordably?

So, now we’re getting to the “whom” in the popularity question. She’s primarily talking about employers. Most programmers almost couldn’t care less how popular their languages are. As for me, I want any language I use to be popular enough that if I have a need for a particular piece of code, I don’t have to write it if I don’t want to–I can find it on the web.

(Obviously this is tongue-in-cheek, but wouldn’t that be great: Only write code when you want to or when it would make you rich.)

As for when I’m hiring, or have a say in hiring, I never look for programming languages. I look for aptitude and flexibility. In most cases, I’d rather take a great programmer and teach them a new language than take a mediocre programmer who’s familiar with a particular paradigm.

I could say a lot more than this but that’s enough for now.