The Open Source War, Woke, and the Bozo Bit
A perspective on how the "woke" mindset evolved in the technology industry
"I don't want that shit running on my servers."
My colleague was a smart guy, and I admired his zeal, but I knew it was pointless. That "shit" would be running everywhere soon. Inferior or not, it represented something a lot of people thought was right. I nodded and tried to look serious about it all.
If you've been working in tech long enough, no doubt you remember that time. If you're a software programmer, you may even have taken up arms in a global conflict few industry outsiders know much about:
The Open Source War.
My first real job in computer tech—meaning, employed by a company that produces computer technology—was at one of "those" places; vast sums of money, global presence, product in most homes and businesses. It was around the time early adopters were ordering Apple Newtons.
That job involved being an expert in tools used to build software on the Windows Operating System. If you ran Windows and wanted to create custom software solutions for your business, I could show you how. Predictably, I benefited from successfully advocating Windows as the software platform of informed choice. My clients knew me as a "Windows guy" who worked hard and had surprisingly few programming debacles. I enjoyed some prosperity and never really worried about flipping my Bozo Bit.
Computers don't do computations with letters and numbers. In a nutshell, they use bytes, which are sequences of bits, each of which has a binary value of zero or one. By "flipping a bit," you change what that sequence represents. The dreaded "Bozo Bit" is a binary indicator of a technology professional's perceived value to the organization; once flipped, that value becomes zero. It's generally considered immutable. The Bozo Bit has canceled many a tech career.
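To make the metaphor concrete, here's a minimal sketch of what flipping a bit actually does; the class and variable names are illustrative only.

```java
// Minimal illustration of "flipping a bit": each position in a byte
// holds a zero or a one, and toggling one position changes what the
// whole sequence represents. Names here are illustrative, not from
// any real system.
public class FlipBit {
    public static void main(String[] args) {
        int value = 0b0101;          // a 4-bit sequence: 5 in decimal
        int mask  = 0b0010;          // selects the second bit
        int flipped = value ^ mask;  // XOR toggles just that one bit
        System.out.println(value);   // 5
        System.out.println(flipped); // 7 (binary 0111)
    }
}
```

One flipped bit, and the sequence means something entirely different; the Bozo Bit works the same way, except nobody ever flips it back.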
That aside, the business of Windows rolled on. There were competitors in the consumer operating system market, such as OS/2 and Macintosh. Nobody saw them as real threats. A casual rebuff was usually enough to derail the occasional marketing thrust; "Think Different...do nothing." Overall, the world had embraced Windows with arms wide open.
Why? Because—surprising to many, I'm sure—Windows represented freedom.
Unlike competitors, Windows could (and can) run on any "white box," meaning a computer made by anybody (there's more to it, but that's the basic idea). Other companies manufactured their own hardware, and you had to buy it if you wanted to run their operating systems. You had no choice.
Most people have no idea how disruptive that difference was; it changed everything. Break the chains! A PC in every house! YOU choose the hardware and can compete in the cash-filthy white-box industry! YOU can develop software for that massive user base! YOU can be the next Dell, or dare we say even Gates (who cares about Jobs, he's broke)! Run Windows, innovate, compete, and prosper!
That messaging won over an army of software developers driven by the promise of a new and more social technology world. No longer oppressed by Sun or IBM, no longer to serve the behemoth mainframe, no longer to chafe at the top shirt button. Behold, on this floppy disk, is software built on my inexpensive machine and on my time, with skills I learned on my own, and when I release it into the Windows galaxy, all will prosper.
Sound ridiculous? Maybe, but it worked. On the ground amongst the software production proletariat—meaning the programmers, or "coders"—we advocated Windows less hyperbolically but otherwise in precisely that manner. The prosperity spoke for itself; Windows was a better way. Students of the new software paradigm eagerly flipped old-guard bozo bits; the conservative computer professional's discourse—think IBM employee—was rejected by the progressive computer code slinger. Two stoner guys could get a couple of cheap PCs, pound through a Windows programming book, and earn a fortune from a kitchen table. Software conference agendas included time for Dungeons & Dragons. The egghead advent had come.
But there was an ideological issue in that progressive utopia. Freedom wasn't free; you licensed it. The more of it you enjoyed, the more you paid, and that arrangement made one company exponentially wealthier than any other.
That beholden state was deemed unfair, and therefore unacceptable. The prosperity of Windows now repulsed the software proletariat, and a movement of software freedom fighters who specialized in "cracking" software protection arose. Those renegades were lambasted and—when caught—prosecuted by the software business oligarchy, which drove the common software citizenry to laud them as partisans. Some of the original software pirates, many of them not legal adults at the time, still enjoy a quasi-mythical status.
So, there we were; Windows, once a symbol of equalization and prosperity, was now the inequitable oppressor. Its loyalists were collaborators. Subsequently, the software proletariat splintered off a radical faction that believed the dependency must end. Somehow, someway, Windows must die.
Meanwhile, in 1991, a lone Finnish developer posted the following to a Usenet newsgroup (essentially a group message board):
"I'm doing a (free) operating system (just a hobby, won't be big and professional like gnu) for 386(486) AT clones."
The Open Source War had begun.
"Open Source" is a term used for a particular creed of software development often thought of as Socialist. Developers start software projects, publish them for free, and the software proletariat collectively fixes and improves the code on a volunteer basis. In theory, the software developer community's diverse experience would produce usable software everybody genuinely needed because nobody working on it got paid. In turn, organizations would compete to hire these free-software experts, so it was in everybody's interest to participate.
Setting aside GNU public licensing details and such, suffice to say that by 1996, that Finnish programmer's hobby had evolved into what is now known as the Linux Operating System, and it was in the hands of the masses. "Distros" of Linux—altered and improved versions of the original—proved increasingly useful. The faith was on fire; parity with Windows was inevitable.
It's essential to understand what this meant to the peculiar group that is software developers. Sacrifices of time, resources, and opportunity were ideological badges of honor. The effort galvanized Linux developers; they believed. Ask any experienced technology recruiter. "Religious loyalty" to a given tech was and is a real thing amongst the software proletariat. In those days, a Linux developer may have figuratively—if not actually—spit on a lucrative Windows-based software contract because they perceived Windows as a tool of the master.
Interestingly, at around this time, Sun Microsystems—a now-defunct corporate behemoth that created the SunOS operating system and the expensive computer hardware it mainly ran on—released a software technology known as "Java" to the public at no cost. That meant Java was more or less free (it technically wasn't open source in the way Linux was, but that detail isn't pertinent). Java had two high-level components: the Java Virtual Machine, a bit of software that you could install on a variety of operating systems—including Windows, much to the Windows loyalists' dismay—and the complementary Java programming language and compiler, which coders used to build software that any Java Virtual Machine could run regardless of the underlying operating system. "Write once, run anywhere!" was the war cry. It proved vastly overstated at the time, but Java's promise was a powerful new weapon in The Open Source War.
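The "write once, run anywhere" pitch boils down to this: the Java compiler targets the Java machine's bytecode format rather than any particular operating system. A minimal sketch of the idea (the file and message are my own illustration):

```java
// Hello.java — compiled once with `javac Hello.java`, the resulting
// Hello.class bytecode file runs unchanged on any system with a Java
// machine installed: Windows, Linux, SunOS, and so on. The operating
// system never sees the source code, only the Java machine does.
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello from any operating system");
    }
}
```

Contrast that with the era's Windows tooling, which compiled straight to machine code that only Windows could run; that single difference is what made Java feel like a weapon.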
One might think that Linux and Java coalesced into a unifying banner for the faithful. With these progressive-minded computing tools, we can defeat Windows the Oppressor once and for all!
Not entirely. Java further fractured the programming world. Centrist types extolled the apparent benefit of writing one software version that could run on Windows, Linux, and any other operating system with a Java machine available. Even an organization with an inexorable investment in Windows could obtain and use software and services that were not part of the Windows licensing paywall. Windows sales and technical messaging reacted by slamming Java's immaturity and its stability issues at the time; the people who best understand Windows had nothing to do with this upstart trash; there's no reasonable guarantee of security or stability. It's dangerous. Open Source purists insisted that because Java could run on Windows, it was too non-committal. Coexistence was capitulation. The Windows discourse must not propagate because it is wrong.
What is it about coding that fosters such high-mindedness and a tendency to polarize its practitioners ideologically? Hard to say, but maybe easy to see. Aside from making money, creating software is about implementing ideas to enhance productivity and workflows. Writing code requires robust cognitive function; coders spend a great deal of time considering the best ways to create and control complex systems that can behave in unforeseen ways. Programmers must evaluate bug fixes in terms of overall systemic effect. Programming paradigms can be incredibly nuanced and require a measure of obsessiveness to grasp fully; is it better to model systems as interacting actors or as interacting instructions? Like writing music—which involves competencies that overlap with programming—practical measures of cost, time, and effort aren't first-order. Code can indeed be sublime. Throw into the mix a penchant for science fiction and fantasy—every coder group probably has enough Magic: The Gathering players to start a tournament—and a measure of uniform social awkwardness, and there you have it; fertile ground for the disruptive mind.
Factions and ideologies aside, the Open Source faithful remained diligent and soon resolved the essential technical issues; Java and Linux became part of the enterprise-scale business landscape. The radical progressive's goal of eradicating Windows remained unachieved, but Open Source did win the battle to survive and even thrive as a valid technology paradigm. Marketing and consulting skirmishes continued; Windows offered a consistent and predictably supported technology, but licensing was a stranglehold. Linux and Java offered freedom from licensing, but at-scale solutions could be piecemeal and messy. It all became page-seven news to the majority of practical technologists.
This relative equilibrium rolled on, and I capitalized by curating a horizontal skill set that cut across operating systems and technologies. The approach reflected my practical and centrist leanings; it's your business. Hopefully, you've selected technologies that support your organization's capabilities and goals. Internal ideological concerns are not my affair; in the end, it's all just a matter of getting zeros and ones to do what you want. As for prosperity, the overall pay vs. effort ratio had diminished, but I'd been in the game for years; few independents had my experience. Work in both camps was easy to find.
Still, there seemed to be a fringe that maintained and even expanded the narrative of injustice. But in the practical sense—to me anyway—the big fight seemed to be over, and what remained I waved away as unprofessional foot stamping.
Then, some years ago, I was in a Facebook discussion with a colleague from those first-real-job halcyon days. Our professional paths had diverged, but we enjoyed relatively similar measures of success. I remained a code slinger, entrenched in the craft of professional web application engineering. My colleague had pursued greater organizational involvement at a well-known technology corporation.
In the conversation's course, he blindsided me with the following:
"You know it was easier for you to get the job than me, right?"
I blanched. The story of how I acquired the skills to qualify for a highly sought-after interview in the technology industry—and then landed the job—is pretty gritty. After college, I worked as a news journalist for a couple of years, but it wasn't paying the bills. I'd always been interested in programming and had bought a cheap Windows computer, but I had no money for learning materials or software, not to mention a pile of loan debt. I "stole" discarded technology books and magazines from dumpsters—didn't know that was a crime—got kicked out of many a computer store, and worked at the Village Voice by night so I could sit in temp agency computer labs during the mornings. I most definitely lacked polish. At one of my first temp placements as a "digital publisher," the executive I was assigned to called her internal HR department and loudly complained—I was standing in her doorway at the time—that they had sent her "some white trash kid."
I shook it off, racked up results, and eventually got a break at a large financial company, which led to a shot at a "real" technology company, and so on. All to be sitting in my ergonomic chair, chatting online with a former colleague who was accusing me of something called "systemic racism."
I responded predictably; what the hell? You call me a friend and then say I'm part of what? You know me. You know something about my background. You know the odds-on bet for me was to fail. Where are you getting this?
His online friends, evidently aware of the thread, replied that I needed to "examine" why I perceived this as insulting and unfair. If I did so successfully, I would understand my complicity and gratefully accept the responsibility of working to undo it. My individual experience was irrelevant.
It seemed the industry progressive's target had shifted from the particular technology people produced and used to the particular people that produced and used technology. And somehow, through no individual action of my own, I had wound up on the downside.
Fortunately, a focus on independent contracting and startups isolated me from the effect of all this on the workplace. "Do good work and don't be an asshole" seemed a good enough modus operandi at the time.
Fast forward to 2014. I'd exited a startup, took some time off to design and build a project related to recruiting, recorded an album of instrumental guitar music, did some traveling with my live-in partner, and overall was pretty happy with the way things were going. However, exciting things were happening in tech, and I didn't want to get caught standing still, so I polished my chops and let a couple of agents know I was available.
A few months later, I landed work at a "company of engineers" that happened to be one of the wealthiest companies ever. The entitlement and prestige astounded me; I'd seen some of it in startup environments, but never at such a large scale. Every food to satisfy every diet was available in practically unlimited quantities at numerous onsite restaurants and cafeterias, all free. Employees made and canceled massage appointments, enjoyed showers and steams, lounged in the nap rooms and massage chairs, did their business in fully enclosed and sonically isolated bathroom "studios," attended empowerment sessions, and enjoyed video arcades. The office facilities spanned multiple city blocks; new employees were often late to meetings due to underestimating the walking time.
I was awed, but not completely surprised. I'd heard of the new office opulence. Tech was money-saturated, and this company had the lion's share of it. Youth culture was strong; most had gone to college for things other than business-related majors. I found it refreshing; I'd hated being "buttoned-up" and was pleased to see the once-important Brooks Brothers facade going the way of the dodo.
What did give me a stranger-in-a-strange-land feeling was the reification of political correctness. It's one thing to hear about "PC" as a fringe ideology; it's another thing to see the ideas physically implemented, but again, that's what engineers do; implement ideas.
Gender-neutral bathrooms, rainbow celebrations in the hallways and gathering areas, radical liberal political t-shirts, posted statements of diversity, inclusion, and the importance of feelings and sensitivity were everywhere. The college-primed, left-aligned mindset—so commonly found in young engineers—had capitalized and taken the helm, which enabled a much broader and more present ideological net than I'd seen in the past. My fundamental doctrine of "do good work and don't be an asshole" suddenly seemed naive.
I appreciated what all this could mean to people; being socially ostracized, bullied, teased, and overall just being thought of as "uncool" has been problematic for many. Here, it seemed a lot easier for "you to do you."
Still, there was an undercurrent; it registered on me as a company-wide shoulder chip. There was a shallow threshold for "toxic," and it was actively citizen-policed. Sports metaphors raised objections. Hipster, goth, earthy, androgynous, trans, even full-on dirtbag; the signaling could be overwhelming. Preferred presidential candidates and the political party of choice were outwardly uniform; dissent was not healthy for one's interoffice relationships.
I tried to think of it all as predictable. The traditional corporate culture had shut out a lot of people. Some territorial defensiveness, an occasional sense of restrained hostility, hard-left ideological leanings; it didn't seem entirely unreasonable.
Or so I thought. There were contradictions I couldn't get around, first and foremost among them being "badge culture."
If you're not familiar with "badge culture," I could (and probably will) write a whole piece on it, but it boils down to this; every employee gets an ID card that must always be prominently displayed. The badge color indicates employment status: FTE (full-time employee), intern, contractor, service (cleaning crew, etc.). Based on that at-a-glance status, employees often behave very differently toward each other. It's not subtle at all. The FTE badge enjoys (and expects) deference; FTEs typically use each other's names but may refer to non-FTE employees by their badge. If there's confusion about an employee's status (is so-and-so FTE?), somebody may clarify by way of badge color (no, they're a [color] badge). Badges may not mix much (or at all) outside of an obligatory luncheon or some such. To quote a security guard who was addressing a new programming contractor, "Ah, a [color] badge...the lowest of the low."
Had I seen this behavior elsewhere? Sure, but those places made no bones about status expectations. Executives were supposed to look and behave like executives, and employees deferred to that status. Subordinates behaved subordinately. Fair or not, you signed on with that understanding.
Here I thought I saw something different. Inclusion was of first-order importance. Popularity, assertive authority, material prestige; the cultural architects supposedly weeded out these conservative corporate culture elements. But as far as I could see, they distilled it all down to playing card-sized colored badges.
That sort of thing always bothered me, and sometimes I didn't hide my irritation so well. A couple of years saw the blush fade and the Kool-Aid wear off. As it turned out, the Corporate Tower of Progressive Privilege and woke-in-the-workplace wasn't for me. The Bozo Bit was no longer an indicator of competency as a technologist; it was now an indicator of expected social and political disposition, and at that place, to my dismay, it had seemed possible that mine could get flipped.
I'd done good work there, so a relatively amenable exit allowed me to duck the new technology woke's sustainably sourced bullet. I got involved in long-term consulting elsewhere, and on it goes. But, the merit-based bastions with simple expectations for good work and reasonable conduct seem to be circling the wagons.
What do I think? I don't know. But I can say this:
I'm glad I'm not just getting started.