With CoreOS, developers can now easily spin up Docker containers on DigitalOcean



Awesome! Will be a great way to easily create FuseESB services on the cheap.

Originally posted on Gigaom:

DigitalOcean will announce Friday that it is now supporting Linux OS specialist CoreOS, giving developers an easier way to deploy Docker containers on its platform.

Users will now have the option to choose CoreOS as the base image that acts as the foundation for spinning up new servers; all a user has to do is create a droplet — a combination of compute, storage and networking resources — choose the region and location of the server, and then click on a CoreOS icon that will trigger the server’s launch, explained Mitch Wainer, DigitalOcean’s co-founder and chief marketing officer.

[company]CoreOS[/company] has made a name for itself in recent months as a supplier of a custom-built Linux OS that can power Docker containers. With CoreOS running on [company]DigitalOcean[/company], users should be able to run Docker on DigitalOcean while also taking advantage of CoreOS’s cluster management service and server-patching capabilities, said CoreOS…


Freakonomics, Regulate This!

Some people create and innovate, some people follow the innovators, and some do everything they can to keep innovation down. This is a great episode of Freakonomics Radio.

Freakonomics Radio: Regulate This! 

Computer Programming, High School

To argue that computer programming should be a required high school course is absurd. But I’ve learned that many high schools still don’t offer any kind of computer programming or computer science classes. This is surprising to me, because even my high school, a mostly rural school with children of blue-collar families and farmers, offered Computer Programming I and II (and that was before the days of Visual Studio).

I haven’t been able to find much data on the subject, but this Washington Post article states that fewer than 1 in 10 high school students in the Washington region took a computer science course this academic year:

Across the Washington region’s school systems, fewer than one in 10 high school students took computer science this academic year, according to district data.

I imagine the statistics are similar in other states. Also, one must wonder what is considered “computer science.” A high school near my house teaches Microsoft PowerPoint, Word, and Publisher classes. This I find as surprising as the fact that most schools don’t offer computer science classes at all. Computer science, in terms of code design, data structures, methodologies, and algorithms, is hardly a field of rapid change. These fundamentals are the basics that should be learned before any student delves into the specifics of a language. The Microsoft Office Suite, on the other hand, is little more than a set of common, user-friendly tools, guaranteed to be a version or two out of date in the span of four years! Why waste time teaching such things?

I have read that finding qualified teachers to take on computer science courses is a challenge. This makes sense, as any skilled engineer would probably rather be earning three times the income of a high school teacher.

Maybe this is an area where the local engineering community could step up. Why not let a good engineer cut away for an hour a day to do some community service by teaching a class? It would be a great way to help high schools–a win-win. The high schools would get a qualified teacher for a specialized class, and the business community would nurture future engineers. This idea seems so obvious that I can’t imagine it isn’t already being tried somewhere!

On Publishing and Writing and Documenting


I was telling my daughter the other evening that it is important to know how to spell, and just as important to know how to write (and write well). She’s going into 5th grade, so such a lecture may be a bit premature. No worries. This lecture will be a repetitive one.

As I recall, the conversation came up when I made note of how impressed I was that she was reading a book just for fun on summer break. Reading–reading real books by real authors with real editors–is the best way to learn the craft. And for whatever reason, it seems that fiction is the best place to learn to write. (This is an opinion, but I have no problem asserting it as undeniable truth.) The technical stuff–most of it–is dry and full of poor writing. I’ve stumbled across countless examples of verbiage in techie books and articles that would make any English professor curse.

I love fiction. Shouldn’t we all? I read much more fiction than non. It would be easy to argue that the skills used to convey well-written fiction have little, if any, crossover into a technical career. Don’t say it! (If you already said it, take it back!)

On the contrary, I’ve found that reading quality fiction is applicable to my professional career, and its application comes in a form of learning that requires little deliberate effort. So long as a story holds my attention, reading is easy and fun, and improved grammar is a secondary, seemingly osmotic benefit. While lost in a great story, we build upon vocabulary and command of language. We see styles and methods that we like. These we recall. We see styles that seem awkward or boring. These we recall as well (and avoid).

I cannot name a single professional career in which the ability to write is not of importance. I’d be interested in hearing another opinion on this matter. It may sound absurd, as though I am declaring that simple arithmetic is important. But if this is such an obvious assertion, why does it seem that so many communications are written with bold indifference toward basic English?

Lest I get too far ahead of myself, I should come clean with something. The other day I sent my boss a very short email, typed on my iPhone, as I hurried to get things situated at home. The email, I thought, was short and to the point:

Micky, running a little late this morning. In by 9:30.

(Micky is not my boss’s actual name, for the record.) The text above is what I meant to write. It is not what was sent. When I arrived to work, “a little late,” as I had told my boss, he had already had a good laugh at my email. Here’s what I actually wrote:

Micky, rubbing a little late this morning. In by 9:30.

Rubbing. There it was… In my boss’s inbox forever. Foiled by auto-correct again!

Auto-correct is great for very short things, but only when used with caution. In this case it wasn’t a big deal (or was it?). Aside from the hilarity, my boss knew what I meant, and he guessed that auto-correct created the error. As silly as this example may be, it does leave a certain aftertaste, does it not? I’m a technical person, paid for focus and attention to detail. I write software… Software that needs to work the right way every time. While the example above is chuckle-worthy, it is also cringe-worthy. I know better!


Publishing articles here and there has been fun. There’s something extremely satisfying about seeing your name in print. (See how I used two adjectives in that last sentence? Don’t do that in a technical document. Ever.) There is a certain thrill in having your words read by an audience. But with this comes a great amount of work and time commitment, which I haven’t consistently had the bandwidth for. To be sure, writing an article for a trade publication must be a labor of love, as the payout for such things is right around $0.00.

It isn’t all that difficult to get an article published, and it may be something worth seeking, if not regularly, at least once. The key is simply to have a specific goal for the article and a certain amount of knowledge of the subject (along with a willingness to put in some research where necessary). When it comes to publishing in a trade publication or website, it is also important to spend time self-editing. Most of these outlets don’t edit much (or very well). It is embarrassing to see a typo make its way through to the end product. Even worse, reading a sentence or entire paragraph that you wish you had revised can be infuriating.

Infuriating may be a strong word… Or not. Personally, when reading anything, be it a book, article, or blog post, I lose focus and interest when I become annoyed by an author’s lack of writing skill. And while I am fairly confident that even this post will have a sentence or two that should be reworked, I stand by this assertion!

Keep the grammar simple. If you aren’t sure whether a particular sentence structure works, don’t use it. Lofty language is for poets, not for those of us attempting to convey a point. If I may set aside humility for a moment, the most consistently positive feedback I’ve received over the years has been with regard to my ability to write. What’s the secret here? Nothing, really, at least not that I can pinpoint. (See what I did there? I used a fragment. I love fragments. They are concise. And clever.)

A few things when it comes to software development and writing:

  • The developer who actually takes the time to document, from commenting code to precise checkin comments to keeping the wiki clean, already has a leg up.
  • Don’t write to impress. Chances are you will fail. Write to communicate.
  • Keep it simple, grammatically clean, and to the point… “Pithy,” as Bill O’Reilly likes to say. As I mentioned before, if you believe something doesn’t make sense, rewrite it in a way that you would like to read it.
  • There is little need to editorialize when documenting software. It may be necessary for certain things, but only in small doses.
  • Documentation is often treated as something that is done once and forgotten. This is a problem.
  • We all like to insert humor into our work lives. This isn’t a bad thing. But if there is any chance your words will be read by a customer or outsider of any kind, don’t do it. The words you use may find their way to people considering your company as a partner, client, or provider. Given the fact that the things engineers write may escape the scrutiny of corporate leadership, we should inject professionalism early on.
  • Have you ever sent a document out to the team for feedback and peer review? When you don’t get any responses (and you probably won’t), don’t assume that this means your document is perfect. It isn’t.
  • The Oxford Comma is the subject of some amount of debate among literary nerds. I think its use in technical documentation is appropriate.
  • Semicolons suck. Too strong? Okay, fine. Semicolons are way overused! This doesn’t have much to do with technical documentation, per se. I just couldn’t help but editorialize a bit.

I am hitting the Publish button now. I have not re-read or edited this post. Shame on me!

Fiber to Wake Forest Residents (Hopefully)

Wake Forest: Public Meeting Sheds Light on ‘Fiber to the Forest’


Hopefully many people show interest and this can actually happen soon.

Some Random Links

Swift – Not really new news by now, but I’m looking forward to having some time to tinker with it.

Chordify – Give it an mp3 or URL, see chords, play along. Really cool.

Oracle vs. Google vs. Java – This is still going on?

Why Choose Jenkins (over Hudson)

Browser Usage Statistics — Chrome is clearly in the lead. So why do we all continue to bend over backward to support Internet Explorer quirks? Probably because so many corporations insist on it.


Moore’s Law Plateauing? — While I’m skeptical, do people really pay that much attention to processor speed any more? Increasing persistent storage speed along with the leaps forward from SSD seems to be our greatest performance gain these days… If the cost ever comes down.

Patent Troll Targeting Podcasts — Why isn’t this getting more attention?

What’s The Value of College?

CNN: Surging College Tuition
CNN: Surging college costs price out middle class

Not long ago I found myself working alongside a brilliant college dropout, a young junior programmer who was just plain gifted when it came to software development. I was very surprised that he hadn’t completed a degree of any kind. It made me wonder why I had, without much consideration, put such high value on a four-year degree.

A recent InfoWorld article, 15 hot programming trends — and 15 going cold, touches on the issue of rising tuition costs and the questionable value that they bring.

I attended Ball State University — a place hardly known for being an engineering college. It was a nearby school with a Computer Science program that did not cost as much as IU or Purdue. For someone like me, it was attainable. While I enjoyed my time at Ball State, and I learned much, very little of my Computer Science education turned out to be directly applicable to my career. Sure, I learned formal concepts, design practices and perhaps a little about requirements gathering and QA (very little). Some of what I thought I knew had to be unlearned, as I came to realize that things operate differently in the “real world.”

Ultimately, as someone with an inherent interest in writing software, I suspect that everything I really needed to know could have been learned in a year of dedicated study. The rest comes from workplace experience.

The problem, of course, is that if I hadn’t gone the college route, spending 4+ years working on a Bachelor’s degree, I would never have been able to land my first job interview. And it was that first job where I really learned how all this stuff that I knew really came together in a real business environment on a project of significant size.

Through the years, I’ve met great, good and awful software engineers with varying backgrounds and educations. Many of the best software developers attended college, but graduated with a degree in something unrelated (History, Art, New Testament Studies, English, to name a few). These people gravitated to Software Engineering and Development through various means, some of them going on to pursue certifications and other training.

My experience hardly reflects any kind of comprehensive analysis, but I don’t hesitate to say that most of the software engineers with undergraduate degrees in non-CS fields are among those that I consider excellent.

There was a time, not all that long ago, when droves of students gravitated to Computer Science because they heard that it was a great career to pursue. While I happen to agree that it is a great career, I don’t think it is a career for just anyone. It requires a certain type of interest and motivation. Perhaps it is because some folks enter Computer Science undergraduate programs for the wrong reasons, but I have observed all ranges of skill level from those with CS backgrounds. I’ve found myself shocked (more than a few times) by the poor quality of code created by developers with formal CS educations. I was once asked to help debug a colleague’s code that wouldn’t compile. It didn’t take long to find the problem: a 2,000+ lines-of-code function that caused the compiler to choke.

Doctors, teachers, lawyers, accountants: these are all people who require specific degrees and certifications. I know that I don’t want to have my eyes checked by a self-trained optometrist. In software fields it is different. After a software engineer has some experience, it seems that his or her degree becomes an afterthought. Unless the subject of college comes up during a lunch conversation, rarely do I actually know the formal education or degree of a colleague. What I do know is that person’s quality and volume of work. Don’t get me wrong: there are things that may be taught in a Computer Science department that are absolutely necessary. Knowledge of algorithms and design patterns is important. It should be noted, however, that knowledge and application are different beasts.

I wonder–If college costs keep rising at such a staggering rate, at what point does the return on investment lose its worth? With companies hard pressed to find good software engineers, and with a greater percentage of the population unable to afford a 4 year degree at even a semi-respected university, when will the traditional model change? There are so many options–from certifications to local technical schools that are available at a fraction of the cost. At some point it seems that a college degree becomes more of a social status symbol than a true reflection of one’s talent or ability.

We’ll have to begin to ask ourselves: Which candidate is right for the job? Is it the one fresh out of college with a CS degree and a 3.8 GPA who lacks experience working with others on a project of scale, or is it the non-college-route self-taught programmer who has proven talent that can be seen by way of open-source contributions?

Occasionally I have seen job postings for software engineers which claim to require a Master’s Degree in Computer Science. I have to wonder: What does the hiring manager believe he or she might get from the engineer with a Master’s Degree that differs from the engineer with a lowly Bachelor’s Degree? In my experience, most Master’s programs are little more than the same programs that undergraduates complete… The only difference is that the students in the program have already completed a four-year degree (and that degree could be anything).

This isn’t to demean formal education. If I had it to do over, I wouldn’t change my time at Ball State University. No way! I was fortunate, however. When I went to Ball State, college was merely ridiculously expensive. Today it is insanely expensive. In 10 years, it will be unattainably expensive. When that happens, where will the software engineers come from?

High Output = High Output of Defects Too (So go easy on ‘em!)

We all know that software defects are pretty much inevitable. (Right?)

A bad developer may produce only a fraction as many defects as a great developer. I’m guessing Linus Torvalds has written more bugs than I could write in my entire career… And yet, to say that this means I am a better software developer would be absurd.

Inherent in this fact is an easy truth to understand but a difficult truth to respond to: The developer with lower output of overall contribution likely has a lower volume of defects tied to his or her name (or commits), while the superstar developer may be put in that less-than-fun situation of accepting responsibility for a defect a little more frequently than anyone would like.

The obvious reason for this is the simple fact that with great output of success comes a greater output of defects (even though, percentage-wise, the great developer may create significantly fewer). A less obvious reason for this is that the great developer–the super-talented one whose volume of output is consistently astonishing–is much more isolated from peer-review of other teammates. Those who don’t understand or work at a pace behind that of the great developer are less apt to offer insightful peer review. Also, the most talented developer on the team may be the most demanded upon (and when we believe in Software Ninjas, we’re setting ourselves up for problems).

There may be good reason to go easy on the guy or gal who must humbly own up to that defect that made its way into production. If fingers must be pointed, it’s at the entire team, which should be reviewing the work of its peers.

SSH User Annoyance & Solution

I’m in an environment where whenever I ssh to a machine I have a different username than that of my main machine. For example, the username on my desktop is “Some.Desktop.User,” whereas all the Linux environments I ssh to use the username “Some.Linux.User.” I’ve typed “ssh <host>” countless times, only to be annoyed when I realize that the password I am being prompted for is for “Some.Desktop.User,” which does not exist on the host. Of course I should have typed “ssh <host> -l Some.Linux.User.”

To make life a little easier, do this:

In ~/.ssh create a file named config. In that file add the following:

Host *
User Some.Linux.User

Likewise, if you have a number of different accounts on different servers, you can do something like this:

Host servername.domain
User Some.Linux.User.1
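The way ssh resolves these blocks can be sketched in a few lines of Python. This is a toy model, not real ssh code, and the hostnames are the placeholder ones above: Host patterns are shell-style globs, and for each option ssh uses the first value it obtains.

```python
# Toy model of how ssh picks the User option from ~/.ssh/config.
# Host patterns are shell-style globs; the FIRST matching value wins.
from fnmatch import fnmatch

CONFIG = [
    ("servername.domain", {"User": "Some.Linux.User.1"}),  # specific host
    ("*", {"User": "Some.Linux.User"}),                    # catch-all
]

def resolve_user(hostname):
    for pattern, options in CONFIG:
        if fnmatch(hostname, pattern) and "User" in options:
            return options["User"]
    return None  # no match: ssh falls back to the local username

print(resolve_user("servername.domain"))  # -> Some.Linux.User.1
print(resolve_user("other.host"))         # -> Some.Linux.User
```

One caveat worth knowing: because the first obtained value wins, the more specific Host blocks should appear before the `Host *` catch-all in the actual config file; otherwise the catch-all’s User shadows them.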

Not exactly a super secret tip, but a useful time saver.

Obligatory Social Media Advice to the ‘Younguns’

I’ve seen a few good posts and articles that attempt to explain to young ladies—teens and preteens—the consequences of their posts online. Many of these are very good, with many great points. This is my own attempt. Don’t get me wrong: I love social media. I’m a fan.

Thirteen to eighteen can be strange years of life. This is nothing new. What is new is the technology available, and what a single poor choice can do to a reputation. During these years, kids have the ability and know-how to make choices with lasting consequences. They do not always have the wisdom that should be used with such powerful, lasting tools at hand.

I remember being a teenager. (There I go… I sound old with one short sentence.) It wasn’t THAT long ago! I also remember saying things like, “I don’t care what people think about me!” and “As long as I’m true to myself, that should be all that matters.” Neither such statement is, was, or has ever been true. Every teenager who has ever said such things (most of them) thinks he or she is the first one to come up with such an idea. Also, every teenager who ever said this actually does care what others think.

It doesn’t change in adulthood. I care what others think of me. You bet I do! I care what my boss, my family, my neighbors, and my colleagues think about me. Why wouldn’t I? I want them to think that I am loyal, hard working, and kind. Heck, I want them to think that I am smart and handsome! I may be neither, but the desire is certainly there.

While it is important that we don’t allow unfair criticism to impact our personal values and sense of self worth, there is always good reason to take heed of what others may think: Reputation.

I have daughters, not quite to the age where this is yet a problem, but it’s coming, and it’s coming very soon. Unlike many parents, I’m fortunate enough to be tech savvy. On the computers in my house, and on my wi-fi, there will be little if anything that goes on without my knowledge. I know how to block certain sites and how to track access. I know how to obtain passwords. Will I snoop on my daughters? You bet I will!

I’m also up on the current trends, so I guess I’m lucky enough to know all about Twitter, Instagram, Snapchat, Vine, and Ask.fm. Not long ago I set up an Ask.fm account to see what it was all about. I was bothered (repulsed, actually) by the creepiness of it. If ever there was a tool to allow strange folks with malicious motives to stalk young people, this is it.

In our house my wife and I have established a rule, and to be fair (at least in the eyes of naïve children), this rule will work both ways: no passwords are to be hidden from one another. Other than surprises (gifts, for example), there is never a good reason to keep secrets from immediate family. Sure, they’re young, but it is important to establish such rules early on. Dropping a new rule on a kid when he or she turns a certain age is not likely to go over very well.

To some this may seem like an extreme move. Even the most involved parents that I know sometimes say that children need their privacy. I certainly agree that I don’t need to be breathing down their necks at every turn (nor could I). When it comes to one’s online presence, however, this is necessary. Too many parents have no idea what their kids are doing on the phones, iPods, or tablets that they received for Christmas. Some think that simply disallowing Facebook and Twitter accounts is sufficient, without knowing that the kids have already moved away from these social networks in favor of others that the parents aren’t hip to.

I’m not foolish (well, not always).  Monitoring devices and network access is far from foolproof. At some point there will be nothing I can do to ensure that my children aren’t doing something stupid by way of social networks or text messages. The best I can do is explain the possible consequences and teach them to be smart, safe, and be aware of their reputation.

I’m not the first, nor will I be the last, to say this: A single picture, a single bad text message, or a single bad post can do much to solidify one’s reputation. It can do a great deal to lock in the type of person you are perceived to be—And perception is everything! The unfortunate thing about a reputation is that the negative is much more easily locked in by a few bad choices, while it takes much diligence to maintain a good reputation. We’ve all made bad decisions, but until recently, such bad decisions weren’t so easily disseminated to the masses.

Others have probably heard this advice: Before posting or sending something, ask yourself, “Is this something I would want my grandmother to see?” (Or how about “Would I want my teacher to see this?” There’s a good chance—a VERY good chance—that one of your teachers will see it.)

The above is good advice. Great advice, actually. Unfortunately, most teenagers aren’t thinking of this question as a prompt of appropriateness. They are thinking of the near-term benefits, along with the perception that the “old people” are clueless (old, in their minds, being anyone over the age of 25). (I’ll say it again: If your parents don’t, your teachers can and will see your posts.)

As I peruse Instagram, I am sometimes stunned by the amount of junk that kids post, and the lack of any self-filtering used. In the case of young ladies, it seems that many of them are eager to get likes on their pictures, as they try their best to look attractive and appeal to the boys at their schools.

The duck-lipped smoochie faces are everywhere. The bikini selfie, posted with some notion that, while a young lady just happens to be wearing her swimsuit, she really just wanted to share a picture of where she was, or her new hairstyle, or a new gadget, is not the exception. It is the norm. I suspect that many such pictures are posted with a plausible deniability along the lines of, “Oh, I didn’t realize people would see it that way.” (Yeah, right.) Maybe it’s true. Maybe the ever-trusting mother will say, “Oh honey, you didn’t know better, but some people may see that the wrong way.”

Here comes the standard disclaimer (I tried not to, but alas, I must): I’m not a prude, but…

What? You don’t believe me? I’m not a prude! Really!

Many, many people—countless more than a kid can imagine—see these pictures. It doesn’t matter what privacy settings may happen to be. Any image can be copied and passed along. I’m reminded of how infuriated Tommy Lee and Pamela Anderson were when their now-famous home video became public. The best way to avoid having it ever seen would have been to not create it in the first place.

With social media, the situation is a little different. The people posting these pictures want them to be seen, at least by some audience. As we all know, however, when it comes to anything on the Internet (including text messages), no such posts will ever be as private as one may hope. I’m sure that telling teenagers to refrain from instant messaging isn’t very easy these days. And even the best, most involved parent cannot look over a kid’s shoulder all day long.

Anyway, here are a few things I would like to point out to any young ladies who may happen across this post. I could say all the things that any of us heard growing up: Trust me, I’ve been there…, I know more than you think I know…, Adults aren’t as stupid as you think…, This is for your own good…, What would (insert relative’s name here) think of that picture?

I hope, for the sake of my own children and for the sake of others, that some amount of this advice becomes commonplace enough that tweens and teens begin to take heed.

Nothing is Private

I know that you think you and your boyfriend will be together forever. You won’t be. When you break up, do you really want him to have that picture you sent? If he does have any such pictures, you can be fairly certain that all of his friends do as well. If he hasn’t passed the picture along to his buddies, he will when the two of you break up.

That picture you sent to the love of your life a year ago will go viral. Maybe not to the degree of Charlie Bit My Finger, but to a much greater degree than you ever imagined or desired.

The Grandma Test

Here’s a better question to ask yourself, even better than asking if you’d want your grandmother to see it: Would you want your future husband and children to see it? Long after middle school and high school, after all the silly mistakes that we make along the way, and after those blunders (you hope) are forgotten, the Internet doesn’t forget. You can try to delete your accounts and remove old posts, but it’s all backed up and cached somewhere. More importantly, you cannot delete the accounts and posts of others. Once that picture or video you sent is out of your hands, it’s out of your hands for good. Someday you’ll be on a job interview. Do you really want to wonder if a potential employer happened across that thing from years ago? (Employers Google candidates. It’s true. They do.)

You know what you’re doing–Admit it!

You posted a picture of yourself with the caption “I love my new hairstyle!” It just “so happened” that you were standing braless, or in a bikini, when you posted that picture. Maybe 2 guys from your school pressed the like button. Maybe 10. Maybe 100. You’re secretly thrilled, because you know (admit it) that they aren’t liking your hair. Teenage boys can be vile, filthy creatures. On this I say without reservation: TRUST ME!

Now that you have posted the picture, for all to see, and you got a like from Billy Smith, that cute guy in your math class, you may have some sense of pride. This is what you were seeking in the first place, right?

What if Billy asks you to the semiformal dance? You are going to meet his parents, his mom and dad, maybe even grandparents, who may or may not have their own Instagram accounts (or maybe they check his phone messages, and maybe he doesn’t even know it). Do you really want his mother’s first and only impression of you to be as that girl who posts and sends provocative pictures of herself? I’m sure the duck lips you made while taking the picture will be the least of her concerns.

As for Billy, what are his thoughts about you? Is he someone who likes you because you are funny, pretty, and charming? Or has Billy gotten the idea that you’re exactly the kind of girl who is going to satisfy his teenage-boy urges?

Billy will never admit it, but he’s thinking of pretty much one thing and one thing only. I’ll let you in on it, in case I’m being too vague: Billy thinks that any girl who would post a picture of herself wearing next to nothing, and get away with it, probably has parents who will let her stay out late and do whatever. And Billy is fairly certain that his semiformal date has pretty much already said yes to his planned advances.

Maybe you’re okay with Billy having such ideas. Maybe that’s the kind of young lady you are. If you aren’t my daughter, it isn’t my place to scold you. But what if you aren’t okay with this? What if you’re just a young lady who, like most teenagers, wants to know that she’s pretty? And maybe that bikini pose that resulted in 100 likes on Instagram provides that kind of boost to your self-esteem.

I’m not ridiculing you. We all want to be liked. We all want to be attractive! I love it when I post a family picture on Facebook and someone comments, “What a great looking family!” I’m no different. There’s a certain satisfaction in being complimented in such a way.

Back to Haunt

Here’s another question: How is your self-esteem going to be when Billy dumps you a few days later, and shares with all of his friends that private picture you sent him while getting ready for the dance? (Hint: It’s going to be in bad shape.)

Or what if Billy turns out to be a downright nasty dude, far worse than you could have imagined, and he follows up that Instagram picture you posted with a comment giving a detailed account of what happened after your dance? It could be all lies, but the people seeing such words along with a promiscuous picture are going to assume that it’s all true. Even before you have a chance to delete his awful comment, it has already been seen by your friends, family, and others. This is far from embarrassing—it’s devastating.

“But Billy would never do such a thing!”

Are you sure?

The Creeps

Let’s not forget about the creepy old men out there. It’s tragic, but true. There are some very bad people in the world, and your lack of experience may lead you into a strange conversation. You’re young. You assume the best. Maybe those kind messages you received led you to go meet this guy. He told you he’s in college, and that he just turned 19. He even sent some pictures to prove it! Yikes. This goes beyond ruining your reputation. It could ruin your life. It could wreck a family. I’m not trying to play up the possibilities for the sake of creating some worst-case, only-in-the-movies scenario. The reality here is frightening enough. Don’t ever even consider meeting someone in person who solicited you on the Internet. Please.


Finally, bullying. I’ve seen it myself, and sometimes from surprising sources. It might seem funny to post a picture of that zit-faced nerd from your biology class. Maybe the number of likes you get from such a post, along with the “hilarious” comments, offers some sort of thrill. It puts you on a pedestal, somehow bumping you up a bit on the cruel social ladder of high school. You’re way-cool. You’re in the cool crowd! Your social status has put you above the nerds in the high-school pecking order! I could go on and on about bullying. There’s something about the detachment of a post that makes those who wouldn’t typically bully in person behave much differently. It makes bullying easy, but no less cruel.

That nerd, he’s a real person with real emotions. He’s more than a hashtag. He’s seen the post, and now he must suffer through all the hurtful comments. His offense? Nothing that he ever had any control over. Maybe he already felt bad enough about the crummy clothes he has to wear. Maybe his family cannot afford anything better than thrift store purchases. Maybe his mother passed away, and his father is doing the best he can, and the nerd, let’s call him Jimmy, has enough difficulty in his life without being publicly ridiculed for little more than not fitting in.

Is the emotional dismantling of Jimmy something that you want to be a part of? You may not realize it yet, and you won’t know it for years to come, but that boy you made fun of may turn out to be the best marriage material out there. Jimmy-the-nerd may turn out to be the nicest young man you never knew. Be nice to Jimmy. Sit with him at lunch.


I know all of this sounds a bit surly. It probably even sounds like the foolish voice of some “old guy” who doesn’t get it. It’s worth saying, in any case. Apologies for targeting the young ladies out there. All of the above advice is equally appropriate for young men, but I’m a father with daughters.

Where’s the Alternative OS?

As much as I love Linux (Fedora is my distro of choice), I remain frustrated that no company–at least none that I know of–is doing much to really compete with Windows or Apple. Linux as a desktop OS, in my humble opinion, remains something usable only by the few who enjoy getting their hands dirty with an OS–those with technical expertise beyond that of the average user.

Windows has had a death grip on corporations and schools (from elementary through university) for many years now–basically as long as computers have been showing up in schools. That grip isn’t something that will likely go away any time soon. By supporting a vast array of hardware, obtaining a vast majority of users, and continuing to improve (with a few backsteps like Windows ME and Vista), Microsoft has done what others have failed to do. Apple is great, sure. I am a fan–but the company has chosen to not even attempt the kind of hardware support and third-party distribution that Microsoft has.

Sure, there were the early days when the Apple //e had some foothold in the classroom, but that is ancient history. Some colleges have labs with Macs, and many college students use Macbooks. Any college with a Computer Science program of any value uses some sort of Unix, but these are the techie computers, often restricted to a lab where only the software folks go.

I’ve tinkered with Windows 8 a few times now, most recently attempting to help my mother with her new Windows phone. I have to confess, I was impressed with that Windows phone. It had all the clean usage of an iPhone without the clutter of Android. The interface is intuitive. It’s good. Very good!

Windows 8 on the desktop–now that’s a different story. I’m not alone in saying that the Windows 8 desktop experience is awful. It isn’t simply because it’s a paradigm shift from what I am used to. Personally, I think Microsoft’s big mistake here was in assuming that the same paradigm used for tablets and phones could translate to the desktop.

Several months ago I attempted to help my sister with her new Windows 8 computer. She wanted to transfer some files and set up Outlook Express to pull email from a Gmail account. Basic stuff… It should have been really easy. I wrestled with the interface for a couple of hours, and in the end, I never succeeded in getting her email set up right.

It is almost embarrassing to admit, because something like this is the most basic of tasks. It should have been straightforward, easy. I understand that Windows 8.1 addresses some of the user interface complaints, but this was prior to its release. It was difficult to navigate without a touchscreen. It was difficult to find file locations. Heck, it took me a long time to simply figure out how to make a shortcut! And finding where a file is saved–yuck.

I’m not one of those anti-Microsoft-no-matter-what people, but Windows 8 on the desktop has really left a bad taste in my mouth. It is a far cry from being a developer’s OS, and the few people I know who have bought new computers with Windows 8 installed have been extremely frustrated by it.

Chromebooks seem to be picking up in sales, but these small, stripped-down machines offer only basic features. They aren’t for gamers or power users. They are designed to be inexpensive laptops that meet the most rudimentary needs of the average computer user.

Tux, the Linux penguin

Over 20 years have passed since the introduction of Linux. There have been countless distros, and more continue to be created. Red Hat proved that an open-source company can be a success. Most servers run Linux. Ubuntu has made great strides in creating a distribution that comes close to being usable by the average non-techie. But Linux for the average user remains something that is often discussed but never achieved.

As for Apple–I love my Macbook. I think it is the finest computer I’ve ever touched. OS X is an amazing desktop OS, but it comes at a cost. When a consumer can buy a similarly powered Windows laptop for a fraction of the cost of a Mac, there is little chance that Apple will take over in the desktop/laptop market. Sure, that $2,000 Macbook costs a lot more, but rest assured that the device is extremely well built and will last for well over 5 years. I cannot say for sure what the average lifespan of a Dell, Toshiba, HP or Sony Windows laptop is, but I do know that when people pay $400 for a laptop, it is an antiquated, slow device from the get-go. When it comes to buying a computer, the massive difference in price is pretty much the decision maker.

Even with the public frustration with Windows 8, Chromebooks, Macbooks, iMacs, and Linux desktops seem to have made only a small dent in the sales of Windows computers. (The one place where Windows 8 is great–the phone and tablet market–is, ironically, the area where Microsoft continues to fail to pick up steam.)

I’m not Microsoft-bashing. I really liked Windows 7. It presented a clean, understandable UI. It worked–very well. It was a better OS than Windows Vista in every way. So why Microsoft would abandon the desktop interface that they spent years making the masses comfortable with stumps me.

It seems, given the current state of affairs, with most people flummoxed by Windows 8 on the desktop, that Red Hat, Ubuntu, or even Google is poised to release a great operating system that really competes for the attention of average computer users. So where is it?

Google seems to be hung up on getting users tied in to their entire Google+ network of stuff. Chromebooks pretty much require a Google account, which ties in to just about everything under the sun. Chromebooks seem to me to be little more than a physical connection to Google+. They aren’t designed as devices that can stand alone without a network connection. Gaming? Fuggetaboutit!

Sure, Ubuntu has made strides in making its Linux distro usable by the average user, but it isn’t all the way there. Most users don’t know what sudo apt-get does, nor do they wish to. As for Fedora–I love it–as do many fellow geeks–but one must be very comfortable with Linux to even think about using it. It doesn’t pass the mom test, not by a long shot.

As far as Apple goes–they’ll never open their OS to other vendors. It just isn’t how they do things. One of the reasons Mac computers are so stable is that Apple has the advantage of controlling everything inside, from hardware to software. Microsoft has, and has always had, the challenge of supporting everything. I wouldn’t expect to build a computer–picking out a motherboard, processor, memory, and hard drive–hook it all up, and install OS X with any success. Windows, on the other hand, has to support this expectation.

Linux offers a great amount of power and control to the user–perhaps a bit too much, when it comes to creating a desktop OS for the masses. There is no consistent user interface. There are countless desktops and configurations, but nothing unified. While the Linux community sees this as a positive, the average user requires an interface that is consistent from computer to computer. Linux is free and open–always has been, always will be. In some strange way, the most positive aspects of Linux are detrimental to its usefulness by a wide range of people on the desktop.

Love it or hate it, I don’t think Windows 8 is going to be the downfall of Microsoft or its overwhelming hold on the desktop market. There is no real competition. Corporations aren’t going to switch to Linux or Apple overnight. They may not upgrade to Windows 8 any time soon. Why would they? There is comfort in knowing that the behemoth company behind the computers that a company invests in will survive to continue support and enhancement. As successful as Ubuntu has been, does anyone really think the average C-level executive has even heard of it?

I was thinking about creating my own list of things that need to happen before a Linux-based OS can compete for real market share, but others have already done it. Here are a few links I found. (For the record, I don’t presume that it would have to be a Linux OS to offer Microsoft real competition.)

I’d love to see Red Hat get serious in the desktop market. Now is the time to do so.

ZDNet – 5 Things Desktop Linux has to do to beat Windows 8
udemy: Ubuntu vs. Windows
Could this be the year of the Linux desktop?
12 ways Windows 8 dominates the OS competition

The Art of Editing, or Should Writers Use the Singular “They”?


I found this post when I noticed a link to my blog. Good post on the common use of the singular ‘they’.

The Art of Editing: The Art of Editing, or Should Writers Use the Singular “They”?
Singular Nouns with Plural Pronouns

Originally posted on change it up editing:

I recently completed line editing a dystopian novel. After going through my edits, the author wrote to me with several questions, prefacing them with this statement:

“I made the mistake of not pestering my last editor on details like these. I’m not making that mistake again.”

He was absolutely correct to question something he didn’t understand, and I assured him that I would answer any queries he had. After all, how can writers improve their writing if they write in a vacuum?

One of his questions concerned pronouns and antecedents:

I’ve read about the use and acceptance of gender-neutral pronouns. I prefer gender-neutral pronouns when I talk. You seem to be correcting against the use of gender-neutral pronouns in my writing. May I ask why? Is the world about to go to war over this? I really wish it wasn’t an issue, but apparently it still is. Does using gender-neutral…

View original 1,170 more words

Met a Real-Live Author Today

What’s a Real-Live Author? I suppose many of us, even those of us who fancy ourselves wannabe writers, tend to think of authors as the people with books that are published by Real-Big Publishers. In more generous terms, an author is probably anyone who writes stuff. Such a definition, of course, is a little feel-good. I’ve written a number of articles for magazines that have ISSN numbers and copyrights and contracts with a bunch of words. So I don’t hesitate to call myself a writer. I write. I’ve been paid for it (albeit in negligible amounts).

I’ve often wondered if I have the chops to have any fiction work published. Today at TEDxRaleigh I was pleased to see a Real-Live Author: Daniel Wallace. I had to cut out of the conference a bit early, and I didn’t want to bother the guy, but when I saw him standing there I knew I had to say hi and ask a couple of questions. He couldn’t have been more friendly, and my questions, likely similar to questions he gets very often, were answered with genuine interest and sincere advice. There’s something satisfying about meeting a person who has had extreme success (Big Fish–a movie most people know–is based on his novel of the same title) and who is still just a real person.

My questions, naturally, had to do with how to get fiction published. Silly question, perhaps. I’m sure all published novelists get such questions frequently. Perhaps constantly. In any case, his advice seemed excellent.

Code Puzzles

This is kind of fun… In a very software nerd way.

StackExchange: Programming Puzzles and Code Golf

“New Math”

I’ve heard about new math for a long time now, but only recently have I been impacted by it. This evening I was attempting to help my daughter with division homework. The best way to explain the frustration we both endured is with an example. New doesn’t always mean improved. Old ways of doing things tend to be old because they have worked very well for a long time.

New (Confusing) Math

On Writing Tickets (Part 1)

(Right-Sizing Tickets)

Spoiler alert: I’m going to tell you right up front what my conclusion is: Tickets should be large in scope. Also, tickets should be medium in scope. Finally, tickets should be very small in scope. Tickets, tickets, tickets! Tickets for everything!

Imagine a ticket with the instructions “create user login.” Cringing? Me too. But most of us are familiar with tickets of huge scope that lack any kind of breakdown.

Back in the day, way back long, long ago (maybe not that long ago) we sent emails. “Hey Matt,” the email might read, “don’t forget to add the flim-flam to the doohickey. But don’t do this until the thingamabob layer is complete.”

We had defect tracking systems, but generally these were utilized for one of two purposes: Customer reporting of software issues, or quality assurance reporting of pre-release defects. Requirements were implemented by way of tracing through a document and checking items off of a list.  One of the earlier products for requirements tracking and traceability was Rational DOORS. There were a number of other tools we used, none of which integrated very well. Later I was introduced to Rational ClearCase, Rational ClearQuest, Microsoft SourceSafe, Bugzilla, Trac, CVS,  BugTracker, Subversion, Redmine, Jira, Confluence, Team Foundation Server… And on and on.

DOORS, an early entry in the Rational suite of products, was rather expensive in the way of licensing. If my recollection is correct, licenses were based on user seats. When such a model exists, and each seat is expensive, a company tends to limit the number of product users on a team. I recall project managers working on Gantt charts, hosting meetings in which all the developers sat around a table gazing at a projected screen, being asked for status updates on each line item. Talk about a yawn fest! Whether it was DOORS, Microsoft Project, some other tool, or a long list of line items with names attached to a number of tasks, the status meetings were always the same.

Some reading the list of products I just threw out will recognize that it isn’t very sensible. Some of the items in the list are used for version control. Some for ticket management. Some for a bit of both. Some of the products are no longer in use. Some are very old, but still in use. (And some, such as Redmine, Jira, and Confluence, aren’t all that old.) This was a problem then (and for many, it remains a problem now): The solutions to our development needs were viewed as separate entities. In addition, we just didn’t seem able to settle on how to use these separate items (which in the past, didn’t integrate well, if at all) in a way that actually helped rather than hindered.

In the early days of my career I almost always had an email inbox full of hundreds of items, some flagged, some marked as unread (so I wouldn’t forget), and some in the trash folder—either placed there deliberately or by accident. My monitor was littered with yellow sticky notes. A pad of paper next to my keyboard was packed with doodles, stars, asterisks, double and triple-underlines. If something was REALLY important, I may have drawn a box around it. For the REALLY REALLY important items, I drew two boxes. And for the REALLY REALLY REALLY important ones… You get the picture.

What a nightmare!

And let’s not forget the weekly status reports! I’m getting flustered just thinking about them! Well-meaning managers asked for status reports every Friday afternoon. When Friday rolled around, I often found myself digging through emails and flipping the pages of my notebook in an attempt to recall what I had worked on. Who has time to fill in a status report in increments each day when distracted by getting actual work done?  I loathed these reports. And I say this without hesitation, as I know I am not alone.

Inevitably, some line item would appear on the list with my name next to it. “Matt,” the project manager asked, “how are you coming with the doodad that implements the whatchamacallit?”

“Uh…” It seemed to happen every time! I was caught off guard by some entirely surprising task. My face felt a little warm as I struggled to recall just what the hell the project manager was talking about. “Are you talking about the gizmo that ties in with the canootinator?”

“No! Not that… You remember, I sent you an email about the whatchamacallit and the doodad last week. It’s in my sent items. I’ll pull it up right now.”

Sure, I felt stupid. Maybe I should have. The truth of the matter, however, is that reliance on ineffective communication mechanisms is what led to this (not just for me, but for others as well). Sidenote: Being a software developer requires that one feel stupid every so often. It’s part of it.

The other day someone asked me this question: “What are your thoughts on the difference between task-oriented and linear-oriented?” It was an open-ended question, so I wasn’t sure whether it was about software functionality or about design and development in general. I’m still not sure what the question means, but I assume it has something to do with the difference between viewing processes as a long line of things to be done in a sequence (with gates between each) and doing things as standalone tasks. Of course, even if something is ‘task-oriented’ there are still dependencies—things that require linear completion. This is why we write tickets that allow us to create a hierarchy of dependencies.

As I write this, these recollections seem like the distant past—but the reality is that it wasn’t all that long ago.

(DOORS lives on as an IBM product. I have not used it in years, so I cannot speak to its current state of being.)

We’ve come a long way since then. Mostly. Maybe. Not all of us.

The subject I wish to write about today is effective utilization of tickets. It doesn’t matter what your development process is: Agile, Scrum, Kanban, or some other iterative approach. Whatever it happens to be, it’s high time to scrap the emails, notebooks, and sticky pads. Need I even mention that it is time to abandon the weekly status reports? (I’m not talking about sprint planning or standups when I say this.) Whatever the approach, granularity of tasks, detail, and communication remain absolutely necessary (and email does not qualify as effective communication).

Should someone—from your boss to your boss’s boss to a well-meaning coworker—come to the door of your office, cubicle, desk, or couch with the expectation that a conversation is locked in as a done deal upon leaving—shame on all involved!

The same goes for IM and text messages. None of the above forms of communication should ever be considered a final lock-in of a task. There is one and only one way that work items—any work item—should be recorded: The ticket management system.

There are many to choose from, some better than others. The poorly-named Bugzilla is a fine choice, as are Team Foundation Server, Jira, Redmine, and Trac. Redmine and Trac are entirely free, and they are great tools with plugins for everything under the sun. My personal preference is Redmine, but this could be a bias simply because I have used it so much.

Before I continue, let me make one thing clear: Let’s never again refer to a ticket management system as a ‘bug tracker.’ NAY! To call it a defect tracker leads us back to square one. A defect is a category of a ticket. It may be something of high priority—but all tickets should have an associated priority.

There are a few integrations that I consider absolutely necessary. To make a ticket management system effective, it must integrate with:

  • The CI (continuous integration) build

CI should integrate with version control. It should pester the team when a build breaks. It should make it clear which changeset broke the build, and the changeset should point us to the ticket that prompted it.

  • Email

Wait—didn’t I just say we need to scrap email? I did. Email is a good prompt, however, for team members to see new and changed tickets. For a small team, I like to see all of those emails, even if the task is not related to me. It’s good to know what others are doing.

  • Version Control

Changesets must be concise. When we check in (or commit) stuff to our version control (and by stuff I mean anything, not just code: configuration files, documents, spreadsheets, etc.), each changeset should reference the ticket that prompted it.

  • Existing workflow (and if we have to shoehorn the concept of tickets into the workflow, perhaps it is time to rethink things.)

I’ll keep this on-point as much as I can, as I see a great deal of overlap among the necessities. For this blog I’ll be writing separate posts, lest my faithful readers become distracted by too many words.
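To make the version-control integration concrete, here's a minimal sketch of the convention using Git. The ticket number (#1234), repo, and file names are all made up for illustration; the point is simply that each small changeset names the ticket that prompted it (trackers such as Redmine and Trac can turn a "refs #1234" in the commit message into a link back to the ticket).

```shell
# Hypothetical example: one small, focused changeset whose commit message
# references the (made-up) ticket that prompted it.
git init -q ticket-demo && cd ticket-demo
printf 'flim-flam added to the doohickey\n' > doohickey.txt
git add doohickey.txt
git -c user.name="Demo" -c user.email="demo@example.com" \
    commit -q -m "refs #1234: add flim-flam to the doohickey"
git log -1 --pretty=%s
```

With that convention, the CI server that reports a broken build can walk from the failing changeset straight back to the ticket being worked on.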

WordPress From Scratch

I just submitted my next article to SDJ. This one is tentatively titled WordPress From Scratch, and it is quite large. Look for it in the next issue!

Software Developer’s Journal

A Nicer Tab Autocomplete

If you, like me, rely on tab completion at the command line in Linux or OS X to help find things for you, it can be annoying when it doesn’t cooperate. If you’ve entered a lowercase letter where the filename has an uppercase one, or if there are multiple possible matches (in which case a double-tab prompts to show all possibilities), autocomplete doesn’t always behave the way one might hope. This is all configurable.

Here’s what I use (these lines go in ~/.inputrc):

set show-all-if-ambiguous on
set show-all-if-unmodified on
set completion-ignore-case on

From the Bash reference manual, here’s what each of those settings does:

show-all-if-ambiguous
This alters the default behavior of the completion functions. If set to ‘on’, words which have more than one possible completion cause the matches to be listed immediately instead of ringing the bell. The default value is ‘off’.

show-all-if-unmodified
This alters the default behavior of the completion functions in a fashion similar to show-all-if-ambiguous. If set to ‘on’, words which have more than one possible completion without any possible partial completion (the possible completions don’t share a common prefix) cause the matches to be listed immediately instead of ringing the bell. The default value is ‘off’.

completion-ignore-case
If set to ‘on’, Readline performs filename matching and completion in a case-insensitive fashion. The default value is ‘off’.
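If you set up dotfiles from a script, a tiny idempotent helper keeps these settings from being appended twice. This is just a sketch: it writes to a local demo file; for real use, point the inputrc variable at "$HOME/.inputrc" instead.

```shell
# Sketch: append each Readline option only if the file doesn't already set it.
# Writing to a local demo file here; use "$HOME/.inputrc" for real.
inputrc="demo.inputrc"
touch "$inputrc"
for opt in show-all-if-ambiguous show-all-if-unmodified completion-ignore-case; do
  grep -q "^set $opt" "$inputrc" || printf 'set %s on\n' "$opt" >> "$inputrc"
done
cat "$inputrc"
```

Run it twice and the file still ends up with exactly three "set" lines.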



William Durand: From STUPID to SOLID Code

Working Oneself Out of a Career

This is going to be a tough question, and I suspect many won’t like the answer.

Are you working yourself out of a career?

If this question confuses you, chances are you are doing just this. Perhaps you’re the main guy or gal on your project, and your company values your work (for now). What is your work? Are you doing the same thing day after day? Maintaining Oracle Forms? Updating a legacy web site? Parsing through line after line of Visual Basic to keep some legacy system running smoothly?

If so, what else are you working on? Anything?

I remember all too well the dark days of 2001 through 2003, when, just after the dot-com bubble, many skilled software developers were out of work with no good options. Any companies that happened to be hiring took full advantage of the fact that the market for software people was flush with resumes of people who were desperate for jobs. The pay dropped dramatically. I lost my job at a startup that failed, and took the first job offer that came along: A technical writer and quality assurance contractor on a federally-funded project. The job, initially, was horribly boring. I literally hated what I was doing–but it was a job.

My problem was, being somewhat young, I didn’t have a powerful resume or much experience under my belt. I didn’t have the proof, so to speak, to convey to a potential employer that I would be a great fit for a position. And the sad fact was that, no matter how great I might have been for a job (so I thought), or how hard I was willing to work, there were literally thousands of other candidates out there. The hiring companies had their pick–and believe me–they were picky!

As good as things seem to be for software developers right now, I’ve learned that the market can change quickly. So it is on you to make sure that your skills are diverse and up-to-date. To that end, it is necessary to continually pursue new learning. Let’s face it: Your employer does not have a vested interest in seeing you grow your career. An employer wants an employee to meet their needs as efficiently as possible. That’s fine. Expected, really.

So what are you doing to ensure that you are valuable when the time comes that your employer no longer needs you, or when your employer downsizes or goes out of business? It’s a question that many of us don’t wish to ponder too much. It sounds so negative, doesn’t it? But it’s an important question.

It isn’t bad to be an expert at something. It’s great, for example, to be an expert C++ programmer. Kudos! But what does ‘expert’ mean exactly? Does it mean that you do one thing very well, but nothing else? It may mean that you are valuable as a C++ programmer, but of no value when it comes to other needs: UI design, database design, jQuery, HTML5, Agile, system administration. A hiring manager would rather hire a candidate with many diverse skills (yet expert in none of them) than an expert in a single skill.

In a single week at work I find myself working on OS scripts, front-end UI design, back-end data access, business services, and REST APIs. My position requires that I have knowledge of the “full stack,” as some like to call it (a term I despise!). This is good. I feel secure in knowing that I’m hire-able (not to sound self-satisfied) based on a wealth of skills. Should I find myself looking (I’m currently quite content), I’ve built my resume up to show that, while I’m an expert in nothing, I’m skilled in many things. This is a better way to go.

This is a topic that I plan to write more about, perhaps turning it into an article. I want to leave you with this, and it is going to sound a little negative if taken the wrong way: If you have been doing the same thing day in and day out for X years, you have not gained X years of experience. You’ve gained one year of experience X times in a row. It may be time to consider looking for a new job right now–while employers are eager to find talented software developers. And not just any new job. You need to find a job that will challenge you–stretch you to learn and grow new skills.

(I hope this post doesn’t strike anyone as self-righteous, cynical, or rude. My intent is only to point out a common problem in the career of software developers.)

No Rock Stars Either!

Speaking of the “Ninja Programmer” silliness, I stumbled upon this ad today. Ugh. If your company is looking for a “Rock Star Programmer,” you may be misunderstanding a great deal.

First Column is Out!



I stumbled upon these videos from “Numberphile” on YouTube. I’ve never been much of a math geek, but these videos are really interesting.



Since SDJ tweeted it today, I suppose it’s okay to announce… I’m pretty excited to have a regular column appearing in Software Developer’s Journal. My first column will contain an introduction–who I am, what I do, and what I hope to write about. Along with it, there will be a column on interview advice, including things that this formerly shy guy has learned over the years, both as a candidate and as the guy on the other side of the table–the one asking the questions. It is advice that I think is good regardless of one’s career. That said, most of my articles will focus on the life side of software development. While some of my articles will be more technical, I hope to include many that have to do with other aspects of being a software developer. A few ideas for future articles include topics such as how to deal with recruiters, office politics, fitness, and the software glass ceiling.

While I’ve published a number of articles now, and countless blog posts, it will be a new adventure to work on a monthly deadline. I think it will be fun, as writing is something I’ve always loved doing. Having a published outlet for my writing is thrilling. I’m very thankful to SDJ for this opportunity!

Software Developer’s Journal

Coding Horror/The Software Career

I don’t like to just post links to another blog or article. Anyone can do that, and there are far too many blogs out there that create no original content. So I try to write original thoughts and articles. That said, sometimes this is a rule worth breaking. Jeff Atwood has a great post over at Coding Horror titled So You Don’t Want to be a Programmer After All.

Atwood asks the question, “What career options are available to programmers who no longer want to program?” This is the converse of a subject I’d like to write about soon (I’m still gathering my thoughts): What career options are there for programmers who wish to move up in their careers, perhaps into management, while never losing the ability to actually write code?

Unfortunately, it seems to me that in this field the general career path goes something like this:

Junior Programmer->Senior Programmer->Super Senior Programmer->Awesome Amazing Programmer->Manager (stop writing software)

I know of at least one person who got into management, didn’t like it, and gave it up to move back into a full-time developer role. What about the programmer who wishes to do both? And why do we assume that software management means an end to coding in the role? Sure, this isn’t always the case, but in general I think it is. It strikes me that many of the best developers move into management, thereby eventually losing their hands-on skills. That seems unfortunate.

The IBM Tank Keyboard


Look at this beauty! This is a clear case of “They don’t make ‘em like they used to.”

I found it lying around the office one day–free for the pickin’! It’s over 20 years old, and not unlike similar keyboards that are even older. It’s heavy. It makes a satisfying click noise when I type on it. The keys rise over half an inch. The date of manufacture on this one is 1992, but the copyright date is 1984. On the back, in large bold print, it reads “Made in the U S A.” The cable is long–at least 6 feet! It works perfectly, unlike the standard-issue keyboards that come with the Dell desktops that so many of us are issued at work.

You know the ones. Certain keys stick after a while. Some come loose. You’re never sure if you pressed a key or not because they feel so spongy.

Sometimes people look at this keyboard and ask, “Why in the world do you have that thing?” Others, those in the know, look at it with a certain degree of envy. Unfortunately I cannot attach it to my Mac. I don’t have a PS/2-to-USB adapter on hand, and I haven’t bothered to get one yet, as I’m not sure if OS X has the necessary drivers.

I wonder what has happened to keyboards. They often seem to be an afterthought. Sony and Mac laptops are decent, but have you tried typing on a Lenovo, Dell, Toshiba, or HP keyboard? It’s downright difficult, even for a man with average-sized hands. I suppose the poor keyboards that we are forced to use are a result of cost. Back in the day, I’m sure this IBM keyboard cost an arm and a leg. But here it is–still being used–and still performing with the lasting quality it was designed with.

Wikipedia: IBM Model M Keyboard

No Ninjas!

Many of us have seen them: The job posts claiming to be seeking a “Ninja Programmer.”

I presume that these are companies that are:

  1. Looking for a well-versed candidate with diverse skills and the ability to tackle any project.
  2. Hoping for a candidate who will find more value in the way he/she is perceived than in salary. (Reading between the lines: “We can’t pay you much, but we will appreciate you a lot!”) This may not always be the case, but there often seems to be a hint of it in “Ninja” job descriptions.

The second point is based on other verbiage I have seen alongside such job posts. Things such as “Do you find more value in what you get to do each day than anything else?” Sure, I find value in the more exciting aspects of a role–the opportunity to learn new things, set direction, and get things done. Of course! I also find value in money. Let’s be honest here.

Sometimes the word Ninja is replaced by other crafty (or not-so-crafty) buzzwords: Rock Star, Guru, Genius, Superstar. It doesn’t take much insight to recognize the aim of such verbiage: Flattery.

I’m sure that any company using such lingo in a job description is sincere in the desire to find a candidate who is very good–one who will be able to complete sizable, complex tasks. Naturally! I also think that a single superb programmer can often achieve the work of three, perhaps four or even five, average programmers. I’m fascinated by some of the legendary programmers out there: People like Linus Torvalds and James Gosling. But even the most famous programmers rely on a tremendous and ever-growing amount of community insight and preexisting work. (By the way, there is a video from a Google I/O Conference, The Myth of the Genius Programmer, that addresses this subject very well.)

I’ve worked with a few “Ninja Programmers” over the years. The term is highly relative. I’ve had positions where I may have been considered the Ninja. I’ve had other positions where any Ninja-like self-satisfaction was as elusive as the stealth and cunning of a Ninja portrayed in a 1980s movie.

How did this lingo come about? Those of us in the business of writing software often have a few other desires. I know I do. Anyone who grew up in the 80s dreams of being a Rock Star, Ninja, or at least Frank Dux. The buzzword job titles are a way of making a job that might be very difficult, taxing, and demanding of time and talent sound appealing. I may have to work 100 hours a week, but at least I’ll finally be a Ninja!

It’s no different than job descriptions that contain the infamous words, “We work hard and play hard!” What does play hard even mean? It sounds like something that might involve torn ligaments.

The point of this post isn’t to seem cynical (although it might). The point is this: Software Developers, Architects, Engineers, whatever you call them, aren’t some strange group of people who have to be wooed or tricked into accepting a position. We’re grown adults. There are certainly great Software Engineers out there. But they aren’t stealthy, and they don’t hide in trees or karate chop bad guys.

I’ve worked with some brilliant software folks over the years. I’ve worked with some very poor ones as well. Those times when I’ve found myself the lone “Ninja” of the team have been among the most floundering of my career. It is difficult to teach oneself new things in a vacuum. I’ve found that it is best to be on a team with lots of other “smart folks”–people from whom you can learn, and people who will add checks and balances. That so-called Ninja–the lone genius that a company relies on for all software needs–is going to cause a few problems.

A few that I can think of right away:

  1. A lone programmer–the company “genius”–will soon face burnout. No matter how much the individual loves writing software, one can only be stretched so far. This highly talented individual has all sorts of opportunities coming his or her way. It won’t be long before such a talented person is offered a job making more money and working fewer hours. What happens when the single guru leaves the company?
  2. The lone programmer may not play nice as the company grows. It can be difficult to let others touch your baby. When you’ve written thousands of lines of code and a new team member comes along and starts mucking with it, there can be problems. I’ve been the new guy, pestering the old guy, and messing around with legacy code, much of it poorly documented. I’ve also been the guy on the other side, a bit perturbed when someone dared say that my code might be better if… Be gone, you and your new design pattern!
  3. Along with number 2, any programmer with enough of an ego to allow himself or herself to be labelled the company’s Ninja is likely to have an ego that does not lend itself well to “playing nice with others.” I have to confess once again to having been on both sides of this. It feels great to be in a position where you are thought of as being “the smart guy.” Although burdensome, it feels good to be trusted with the complexities of software that nobody else understands. It also leads to a certain feeling of ownership of code, and heavy reliance on a single individual.
  4. When trusting that lone smart guy/gal with all of the code, a determination has been made: There will be no collaboration–no merging of ideas–no team to challenge each other, from within, to do better. It’s the sharing of backgrounds and experience that leads to the best software design, and I believe this is true no matter how talented one programmer happens to be.

I’m sure there is more that could be added to this list. These are just a few quick thoughts on the matter. While being a Rock Star might not be all that bad, I don’t want to be a Ninja. Sometimes Ninjas get blow-darts stuck in their necks. Sometimes they get beaten up by Bruce Lee.

Mac Startup Chime/Badfinger’s Day After Day

The other day I had an epiphany. Okay, epiphany may be a bit of a strong word for this little discovery, but I did suddenly realize why I often get the song Day After Day by Badfinger stuck in my head many mornings at work. It all has to do with the first note of that song and the startup chime on my Macbook. So I created this video to help solve the mystery. And since I created it, I might as well share it. Sure, it’s of little consequence, and probably a big waste of time, but a little curiosity never killed anyone (except for some cat, apparently). I also spent some time reading about the difference between might as well and may as well. It’s pretty much commonly accepted that they mean the same, although some people nitpick and claim a difference. Happy Friday!

Mac Chime/Badfinger Day After Day comparison

Where Are the Females?

I have an idea for an article, but I’m not entirely sure how to approach it. It’s a subject that I believe some have written about, but as a male, it isn’t a subject that I had given much thought to until recently: Where are all the female software engineers?

I suppose the only reason I’ve thought about it at all is because I have two daughters, neither of whom seem all that interested in video games or computers. Sure, there’s some passing interest. They like simple games on Friv. But really, for the most part, any interest in that which may be considered “technical,” ends once they get Pandora open and playing their songs.

I’m definitely not one to declare this an open-and-shut case of sexism. It could simply be a difference between genders. As someone who loved Lego as a boy, I tried pushing Legos on my daughters. It didn’t stick. I’ve presented video games. No dice. They’ve watched me tinker with Arduino… with only passing interest.

In college I had two female computer science professors. One taught Cobol, the other taught Data Structures, Object-Oriented Programming, C, and C++. This second professor, not a PhD, was one of the best professors I had. She knew her stuff–and she knew how to teach it. Sure, most professors know what they are talking about, but the skill of teaching is something, at least in my experience, that most lack.

I’m trying to think of how many female software engineers I have worked with over the years. I’ve worked with female managers, product and project managers, quality assurance engineers, and technical writers. But when it comes to counting the number of female software engineers I’ve encountered, I think the number is two or three (and only two that I can recall). I do know another female who majored in Computer Science, entered the workforce, worked for IBM, and left the field because she hated it.

While sexism, I think, is an oversimplified answer, I think that simple gender preference is equally oversimplified. After all, there are many female scientists, math teachers, and engineers of other disciplines. Might it have something to do with social nature? One can only guess. One would expect just about any professional field to be weighted one way or the other. We aren’t surprised that there are more female than male nurses, or more male than female auto-mechanics. But in these fields, the reason for a gender preference seems somehow a little more clear.

I might not have given this a second thought if I had encountered just a few more female software developers. But just two or three seems low enough to suggest that there is something more at play. I don’t suggest for one minute that it has anything to do with bias on the part of men, and I can say this because I personally have not encountered any such bias. I have never once heard men discuss female engineers of any kind in any derogatory manner, nor would I take part in such a conversation (I have a mom, two sisters, a wife, and two daughters–all of them extremely intelligent). Nor do I suggest that the measure of gender equality is equal numbers of men and women in a given field. I think that assumption would be outrageous (correlation doesn’t imply causality, nor does correlation necessarily imply inequality–it may or may not).

About a year ago I was helping my oldest daughter with her math homework. I was shocked when she said, “Dad, I’m just a girl, I’m not good at math.” WHAT! Where in the world would she have heard such a thing? Certainly not from me. Certainly not from any of her teachers (all of them female). When pressed, she could not explain to me why she thought such a thing or where she might have heard it. The only answer I got was, “Math is for boys.” Similarly, I wonder, if for some reason, young girls develop a sense that computers are for boys. And if so, where would this troubling idea come from?

I’m interested in hearing some thoughts on this subject, and if any female software folks happen across this post, I would be especially eager to hear from you.

Ack! Singular Nouns with Plural Pronouns

When Thomas Jefferson wrote “…all men are created equal,” was he deliberately excluding women? Of course he wasn’t! In his time, ‘men’ was a gender neutral and acceptable reference to mankind–male and female. Okay, I grant you that given the state of women’s rights, including voting rights, in Jefferson’s time, some may argue that he wasn’t thinking of women when he wrote those words at the age of 33. But I think he was. And as far as equality in the United States goes, we accept that the United States Declaration of Independence and Constitution apply equally to men and women. But that was then. This is now.

Here’s something that aggravates me. I’ve mentioned it to others, and it seems that this type of grammar butchery is becoming (sadly) accepted. In an effort to avoid “sexist” language (that is, changing written and spoken English to be gender-neutral), an awkward pattern of using improper pronouns has come about. Why aren’t others as aggravated by this as I am? Maybe I’m being nit-picky… I don’t think I am.

Here’s an example:

“When a developer works hard, he will learn much.”

The above, these days, is often considered sexist language, whereas in times past the singular pronoun he was, per the English language, assumed to be gender-neutral. Beginning somewhat before my time, that assumption was deemed politically incorrect and sexist. Fair enough. A more politically and socially correct standard grew.

“When a developer works hard, he or she will learn much.”

This example passes the gender-neutrality test, and it is grammatically correct, but it seems clumsy. In an effort to avoid such clumsiness, some began swapping pronouns, sometimes using she, sometimes using he (within the same article, book, journal, etc.). Grammar Girl has called this practice “Whiplash Grammar” in an article addressing this subject. Personally, I think it is much worse than the awkward “he or she” substitution. As a reader, I find myself wondering who the heck we are talking about.

Some people have substituted the non-word he/she:

“When a developer works hard, he/she will learn much.”

Hmm… This may be okay for an article, but it isn’t something I’d like to see in a book, especially fiction. I cannot pinpoint exactly why I don’t care for this. It just seems messy. In general, a work of fiction can avoid this problem, as the gender of the subject is usually known. All too often I see examples such as this being used:

“When a developer works hard, they learn much.”

Wait just a minute! Developer is a singular noun. They is a plural pronoun. Every English teacher I ever had would be quick to point this out, wouldn’t they? Why then have I seen this noun/pronoun confusion used everywhere? I see it in advertisements. I hear it on the radio and TV. I see it in print that has been edited and re-edited! It is a bloody linguistic massacre, and yet it is acceptable enough to pass through an editor’s cautious eye!

I understand the desire to avoid the double-pronoun solution. There is a much better way to handle this. For my final example, a sentence that is proper, gender-neutral, and not the least bit clumsy:

“When developers work hard, they learn much.”

Voila! (Was that so hard?)