"If this one guy got hit by a bus, the world's software would fall apart."
(Funny? Yes. But the reality is far worse...)
April 04, 2024
  • How many critical software packages are maintained by a small, unpaid team (or, worse, a single person)?
  • What happens when that person gets bored with the project... or decides to do something malicious (as in the case with a recent backdoor in the XZ compression tool)... or... gets hit by a bus?

These are not only fair questions to ask... but critical as well.

The reality is that we're not simply talking about a handful of key software packages here -- the entirety of our modern computing infrastructure is built on top of thousands of projects (from software packages to online services) that are built, maintained, and run entirely by one person (or, when we're lucky, 2 or 3 people).

One wrong move and the Jenga tower that is modern computing comes crashing down.

(The comic in question: xkcd #2347, "Dependency" -- all of modern digital infrastructure propped up by "a project some random person in Nebraska has been thanklessly maintaining since 2003."  Source: xkcd)

Just to give you an idea of how widespread -- and dire -- this situation truly is, I would like to call your attention to two projects that most people don't even think about... but that are critical to nearly every computer system in use today.

The TZ Database

Dealing with timezones in software can be tricky.  Many rules, many regional details.  As luck would have it, a standard database -- the TZ Database (also known as tzdata, or the IANA Time Zone Database) -- was built to make it easier for software projects to get those details right.

And, every time those timezone details (across the world) are changed -- something which can happen several times per year, often with only a few days' notice -- that database needs to be updated.

What happens if those details are not updated... if the timezone data is incorrect?

At best?  A few minor scheduling inconveniences.  At worst?  Absolute mayhem... computer-wise.  Times can become significantly out of sync between systems.  Which can mess up not only scheduling (an obvious issue), but security features as well (as some encryption tools require closely synced time).
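
To make that concrete, here is a minimal sketch of how ordinary application code leans on that database without ever thinking about it.  (This uses Python 3.9+, whose zoneinfo module reads the system's copy of the TZ Database; the zone name and dates are just illustrative.)

    from datetime import datetime
    from zoneinfo import ZoneInfo  # resolves zone names using the TZ Database (tzdata)

    # "America/Chicago" is just a key into the TZ Database.  The UTC offsets and
    # daylight-saving rules behind that key come straight from the data those
    # maintainers publish.
    tz = ZoneInfo("America/Chicago")

    before = datetime(2024, 3, 10, 1, 30, tzinfo=tz)   # just before the 2024 spring-forward
    after  = datetime(2024, 3, 10, 3, 30, tzinfo=tz)   # just after it

    print(before.utcoffset())   # -1 day, 18:00:00  (UTC-6, CST)
    print(after.utcoffset())    # -1 day, 19:00:00  (UTC-5, CDT)

    # If the database were stale or wrong, those answers could silently be off by
    # an hour -- along with every schedule, log timestamp, and expiring credential
    # computed from them.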

To give you an idea of how widespread the TZ Database is, here is just a teeny tiny fraction of the number of software projects which rely upon it:

  • Every BSD system (FreeBSD, OpenBSD, NetBSD), plus Solaris
  • macOS & iOS
  • Linux
  • Android
  • Java, PHP, Perl, Ruby, Python, GCC, JavaScript
  • PostgreSQL, MongoDB, SQL Server

Yeah.  It's basically a list of "all software".  And that's just a sample of the software which heavily relies on the TZ Database for making sure timing (and everything that is time-critical) is correct.

Now.  With something this absolutely critical, surely a highly paid team of people -- from multiple companies -- is responsible for keeping it updated... right?

Oh, heavens, no.

Two people.  Two!

While the database itself has been officially published on IANA servers (the Internet Assigned Numbers Authority -- a department of ICANN) since 2011, only 2 people actually maintain the TZ Database.

SQLite

Did you know that SQLite is the most used database system in the entire world?  More than MySQL, MS SQL Server, and all the rest of them.  Good odds, SQLite is used on more systems than all other database systems in the world... combined.

In fact, SQLite is a critical component in the following systems:

  • Android, iOS, macOS, & Windows
  • Firefox, Chrome, & Safari
  • Most set top boxes and smart TVs
  • An absolutely crazy number of individual software packages (from Dropbox to iTunes)
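
A big part of why it is everywhere: SQLite is not a server you install, it is a library you embed, and the whole database lives in one ordinary file.  As a rough sketch (in Python, which bundles SQLite in its standard library -- the file name and table here are made up for illustration), embedding it takes only a handful of lines:

    import sqlite3  # SQLite ships inside Python's standard library -- no server, no setup

    # One ordinary file on disk (or ":memory:") is the entire database.
    con = sqlite3.connect("example.db")
    con.execute("CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)")
    con.execute("INSERT INTO notes (body) VALUES (?)", ("Maintained by three guys.",))
    con.commit()

    for row in con.execute("SELECT id, body FROM notes"):
        print(row)

    con.close()

That is more or less the entire deployment story -- which is exactly why it hides inside your browser, your phone, and your TV.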

Now, ready for the fact you knew was coming?

SQLite is maintained by... 3 guys.

Not "3 lead developers who oversee an army of open source contributors"... just 3 guys.  Total.  And they don't accept public patches or fixes.

"SQLite is open-source, meaning that you can make as many copies of it as you want and do whatever you want with those copies, without limitation. But SQLite is not open-contribution."

A piece of software that is practically the cornerstone of modern computing.  Trillions of dollars worth of systems relying upon it -- every second of every day.  3 guys.

Corporations rest on the shoulders of... a handful of volunteers

Add those two projects together.  5 guys, in total, are responsible for timezones and SQLite databases.  Software and data used on practically every computer on the planet.

And that's just the tip of the iceberg.  Critical projects -- often maintained by small teams of (more often than not) unpaid volunteers -- form the core of the vast majority of major software projects.  Including commercial ones.

ImageMagick?  XZ?  FFmpeg?

You'll find those at the heart of more systems than you can count.  Good odds you use all three, every day, and don't even notice it.

And, as the small team behind FFmpeg pointed out in a recent X post, getting those large corporations to contribute -- in any meaningful way -- can be like pulling teeth:

The xz fiasco has shown how a dependence on unpaid volunteers can cause major problems. Trillion dollar corporations expect free and urgent support from volunteers.

Microsoft / Microsoft Teams posted on a bug tracker full of volunteers that their issue is "high priority"

After politely requesting a support contract from Microsoft for long term maintenance, they offered a one-time payment of a few thousand dollars instead.

This is unacceptable.

We didn't make it up, this is what Microsoft actually did:
https://trac.ffmpeg.org/ticket/10341#comment:4

The lesson from the xz fiasco is that investments in maintenance and sustainability are unsexy and probably won't get a middle manager their promotion but pay off a thousandfold over many years.

But try selling that to a bean counter

In short: Microsoft wanted to benefit from the (free) work done by FFmpeg... but was only willing -- at most -- to toss a few peanuts at the team.  And, even then, that (mildly insulting) offer of meager support was only made when Microsoft needed assistance.

A few parting thoughts...

There are valuable lessons to be learned from all of this -- including the need for real, meaningful support (by large corporations) of the projects they rely so heavily upon.

But, for now, I'd like to leave you with a few observations.

  1. Corporations don't hesitate to throw large sums of money at Tech Trade Organizations (such as The Linux Foundation -- which brings in hundreds of millions of dollars every year from companies like Microsoft)... yet they are hesitant to provide significant funding to the projects they rely upon directly to ship their own, often highly profitable, products (see the projects listed earlier in this article).
  2. How many of these smaller projects -- which Linux desktops and servers rely entirely upon -- receive regular funding from The Linux Foundation (or companies which fund The Linux Foundation)?  I'll answer that question for you: Next to none.
  3. Even high profile Open Source projects -- such as KDE or GNOME -- struggle to bring in enough funding to afford two full time developers on payroll.
  4. We have avoided catastrophe, thus far, through dumb luck.  The recent XZ backdoor, for example, was found by a lone developer who happened to notice a half second slowdown... and happened to have the time (and interest... and experience) to investigate further.  The odds of that being discovered before significant harm was done... whew!... slim.  So much dumb luck.

Go take a look at that XKCD comic at the beginning of this article again.  Funny, right?  And it makes a solid point.

You know what's terrifying, though?  The reality is far more precarious. 

There's not simply one project -- by one guy -- holding all of modern computing up.

There are thousands of projects.  Each made by one guy.  And hundreds of those projects (at least) are load-bearing.

Dumb luck only lasts for so long.

The 1948 precursor to the hard disk
A rotating, magnetically-coated brass drum... inspired by a voice dictation machine.

Floppy disks. Zip disks. Hard disks.

These sorts of spinning, magnetic storage mediums have been critical to several decades of computers. It’s almost hard to imagine the computers of the 1970s, 80s, and 90s without floppy and hard disks (and other magnetic drives).

But how, exactly, did they come into existence?

Let’s take a quick look at the very first of such devices… and their inspiration.

1946 - The Mail-a-Voice

We’ll begin our journey with the 1946 release of the Brush Mail-a-Voice.

The Mail-a-Voice.  Be Honest.  You want one.

A truly fascinating device, the Mail-a-Voice looked like a phonograph… except it used a paper disc coated in magnetic material. You could then record up to 3 minutes of audio on a single paper disc (which would spin at 20 rotations per minute)… and then fold the paper disc up and mail it to someone inside a standard envelope.

Thus the “Mail-a-Voice”.

This device didn’t store computer data itself -- it was only for audio -- but it did inspire engineers who were working on cheap data storage for computers…

February, 1948 - Andrew Booth’s Spinning Drum

On a 1947 trip to the USA, Andrew Booth (who was working on his own computer design) had the chance to see the “Mail-a-Voice” in action.

Since Booth needed a good, inexpensive storage medium for his computer… he attempted to build a similar device using a flat, paper, magnetic disk. What was, essentially, a first attempt at what we would now call a “Floppy Disk”.

Unfortunately, it didn’t quite work. Booth found that he needed to spin the paper disk quite a bit faster in order to make it viable as a data storage mechanism… and he had a hard time keeping the paper disk flat.

In fact, as Booth upped the RPM to 3,000 — which is what he determined he needed — the paper disk itself started to disintegrate. Booth would later comment:

“I suppose I really invented the floppy disc, it was a real flop!”

So he abandoned that approach and, instead, decided to use a brass drum. Why brass? Because brass is a bit less likely to disintegrate than… paper.

It's brass, baby!  BRASS!

This system worked. His brass, rotating drum (with a magnetic coating) had a 2-inch diameter and could store 10 bits per inch.

Yeah. 10 bits. Per inch.

Not exactly massive amounts of storage. But, hey, it was a start! And it didn’t disintegrate! Huzzah!

Improving the magnetic drum

With the first prototype working, Booth set about improving his magnetic, rotating drum storage device. The final version ended up being able to store 256 words of either 20 or 21 bits each (different sources cite different values, and there does not appear to be consensus on whether the words were 20 or 21 bits).

In modern terms: This would be equivalent to roughly 5 kilobits of data storage.  Give or take.
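
(For the curious, the arithmetic: 256 words × 20 bits = 5,120 bits, and 256 × 21 = 5,376 bits -- either way, right around 5 kilobits, or roughly 640 to 670 bytes.)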

This storage drum was put to use on the ARC (the Automatic Relay Computer).

Booth working on the Automatic Relay Computer.

When all was said and done, the ARC could utilize that storage drum to handle 50 numbers and could load a program consisting of 300 individual instructions.

It wasn't exactly a “Hard Disk Drive”… more of a “Hard Drum Drive”.

Either way… Pretty darn cool for the 1940s.

1956 - The First “Hard Disk Drive”

Over the years that followed, this idea was refined and improved. The rotating drum was abandoned for hard, magnetic platters — ones sturdy enough to handle much higher RPMs (certainly much sturdier than paper!)... which allowed for faster data access.

These improvements eventually led to the 1956 release of the IBM Model 350 Disk Storage Unit for the IBM 305 RAMAC computer.

IBM Model 350 Disk Storage Unit

The Model 350 Hard Disk Drive, in a base configuration, could store roughly 3.75 MB — all contained in a cabinet 5 feet long, 2 1/2 feet deep, and 5 1/2 feet tall — with platters spinning at 1,200 RPM.

And all thanks to a voice dictation device built for mailing 3 minutes of audio on a folded-up piece of paper.

Who (really) created the "Byte"?
And what is the REAL definition of the term?

Kilobytes (KB). Megabytes (MB). Gigabytes (GB).

We use these storage measurements every single, gosh-darned day. And most of us feel like we know exactly what they mean. But do we really?

Do we really — truly — know what a “Byte” is… and its origin? I mean… who came up with the term “Byte”, anyway?

Let’s take a moment to look over the history of the term. If, for no other reason, than to feel smarter than most other nerds.

What is a “Byte”?

If you ask Mr. Google, a Byte is exactly 8 Bits.

Mr. Google wouldn't lie... right?

Ok. Great. 8 Bits = 1 Byte.

So what is a Bit?

That part is simple.

A Bit is the smallest unit of information for a digital computer. A Bit can have two possible values… 0 or 1. It is a boolean. A binary.

Many people believe “Bit” is short for “Bite”. You find this in many computer history books. This little tidbit has been repeated so often, many believe it. However, like many such oft-repeated anecdotes in computing… it’s hogwash.

In fact, “Bit” is a contraction of “Binary Information Digit”. Squish that phrase together and you get “Bit”.

Fun factoids about the origin of the “Bit”

The first usage of the word “bit”, when talking about a type of data in reference to computing, was by Vannevar Bush. He published an article entitled “Instrumental Analysis” in the October, 1936 issue of the “Bulletin of the American Mathematical Society”. In it he used the phrase “bits of information” when talking about punch cards.

However…

“Bit” was commonly used in Middle English to refer to “a mouthful” or a “morsel” of food. (This is the origin of why many believe “Bit” is short for “Bite”… even though it isn’t.) As such, Vannevar Bush may not have actually been thinking about a “Bit” as a “Binary digit”… instead he may simply have thought “this is a morsel of data”. Also worth noting… Bush never actually defines what a “bit” is. Making it likely that he was simply using the word “bit” in the Middle English way.

The first — distinctly verifiable — usage of “Bit” in this way is by John Tukey. From “A Mathematical Theory of Communication”, written by C. E. Shannon in 1948:

“The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.”

There you have it. More information about the origin of the term “bit” than you ever wanted to know.

You’re welcome.

Ok. Great.

So, in short, a Bit is a 0 or 1. And a Byte is a group of 8 Bits. Easy.

Not so fast there, sport!

While the Byte being 8 Bits is commonly accepted today… that was not always the case. Not by a long shot!

In fact, there are two competing stories for who created the term “Byte”… and neither of them were referring to a set of 8 Bits!

Seriously!

Werner Buchholz’s 6 Bits

The most often cited creator of the term “Byte” is Werner Buchholz — who used the term, in 1956, to refer to a grouping of 6 Bits while working on the IBM Stretch supercomputer.

Man sitting at IBM Stretch console. Image source: computer-history.info.

A “6 Bit” Byte was common in those days. In fact, Braille was a 6 Bit encoding of characters for the blind. And many of the early computers (from IBM and others) used 6 Bit groupings to encode character data.

6 Bits -- not 8 Bits -- per Byte.

However (you knew there had to be a “however”)…

Louis G. Dooley’s N Bits

Around that same time (1956 or so), Louis Dooley first used the word “Byte” to refer to an undefined grouping of “Bits”... though, typically, it referred to 4 Bits.

That's right.  Not 8 Bits.  Not 6 Bits.  But 4 Bits.

Dooley published the following letter in BYTE magazine:

“I would like to get the following on record: The word byte was coined around 1956 to 1957 at MIT Lincoln Laboratories within a project called SAGE (the North American Air Defense System), which was jointly developed by Rand, Lincoln Labs, and IBM. In that era, computer memory structure was already defined in terms of word size. A word consisted of x number of bits; a bit represented a binary notational position in a word. Operations typically operated on all the bits in the full word.

 

We coined the word byte to refer to a logical set of bits less than a full word size. At that time, it was not defined specifically as x bits but typically referred to as a set of 4 bits, as that was the size of most of our coded data items. Shortly afterward, I went on to other responsibilities that removed me from SAGE. After having spent many years in Asia, I returned to the U.S. and was bemused to find out that the word byte was being used in the new microcomputer technology to refer to the basic addressable memory unit.

 

Louis G. Dooley
Ocala, FL”

So… what the heck is a “Byte”?!

That’s right. We now have two very, very different definitions for the word “Byte”. Both creations of the word happened independently… and at almost the exact same moment in time.

  • The “Buchholz Byte” - A grouping of 6 Bits.
  • The “Dooley Byte” - A grouping of an undefined number of bits, less than a full word size. Often used to describe 4 Bits.

You’ll note that neither of these definitions — from the men who created the term — have the number “8” in them.

The shift towards 8 Bits per Byte started to happen in the 1970s… with the development and growing popularity of 8-Bit processors, such as the legendary Intel 8008.

A revision of the Intel 8008 CPU

Interestingly, some of those early 8-Bit CPU’s had specific functions for handling 4-Bit chunks of data. Because, up until that point, 4 and 6-Bit “Bytes” were incredibly common (including in the predecessor to the Intel 8008… the 4-Bit Intel 4004).

Fun Factoid: Nowadays a 4-Bit group is called a “Nibble”. Which is adorable.

For quite some time the term “octet” or “octad” was used to denote 8 Bit groups. At some point along the way, most people phased that out as well… simply referring to all “groups of bits” as a “Byte”. Though you will still find “octet” used here and there, especially when talking about various network protocols.
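
Just to make the jargon concrete, here is a tiny Python sketch (the value is arbitrary) of how one 8-bit byte -- an “octet” -- splits into two 4-bit nibbles:

    value = 0b1011_0100           # one 8-bit byte (an "octet"): decimal 180

    high_nibble = value >> 4      # upper 4 bits -> 0b1011 (11)
    low_nibble  = value & 0x0F    # lower 4 bits -> 0b0100 (4)

    print(high_nibble, low_nibble)                      # 11 4
    print(((high_nibble << 4) | low_nibble) == value)   # True -- two nibbles make one byte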

All of which means…

Dooley invented the modern “Byte”… not Buchholz

While many writers, enthusiasts, and computer historians are quick to say that Werner Buchholz coined the term “Byte”… they are obviously mistaken.

Besides the fact that it’s hard to discern who (Dooley or Buchholz) actually used the term first… the Buchholz definition is no longer used at all in modern computing.

The Buchholz definition is specific. 6 Bits. Which modern computing has determined is not the amount of Bits in a modern Byte.

The Dooley definition, on the other hand, allows for wiggle room. Which means that an 8 Bit “Byte” would fit the Dooley definition. But not the Buchholz.

The facts are clear: Louis G. Dooley created the word “Byte”. At least as it has been used for the last 40+ years.

But Buchholz — an absolute legend in the computing world — gets one heck of an Honorable Mention trophy.

The History of Ctrl-Alt-Delete
How the "Three Fingered Salute" came to be.

Ctrl-Alt-Del — sometimes known as “The Three Fingered Salute” — is among the most recognizable keyboard commands in the entire computer world, restarting (or logging into) countless computers since the 1980s.

But... what a peculiar combination of keys!  How, exactly, did it come to be?

Let’s take a tour through the history of this beloved / hated / mocked key combination.

Not the first multi-key reset

Control-Alt-Delete may be the most famous “reset this computer” key combination… but it wasn’t the first.

That honor goes to the Exidy Sorcerer in 1978. A Z-80 powered home computer that never saw the commercial success of its rivals.

The Exidy Sorcerer

Note the two “Reset” keys in the top right of the keyboard.

Ok, that simply is too small to make out.

ENHANCE!

RESET!  RESET!

Much better.

Here we see the two “Reset” keys.

How do you hard reset an Exidy Sorcerer? You guessed it -- press both of these keys at the same time.

In theory this was to make it harder to accidentally reset a machine... having a single "Reset" key would simply be too easy to tap without intending to.  But they put the two keys immediately next to each other.  And right next to "RETURN" -- which you would always be reaching for with a pinky.  Strange keyboard layout choice, right?

It's like putting a "Nuclear Self Destruct" button right next to the "Make a Cup of Coffee" button.

Regardless, the Sorcerer still wins the title of “first computer with a multi-key reset”. So it’s got that going for it.

The IBM 5150

Flash forward to 1981, in Boca Raton, Florida. A team of engineers was about to release the IBM 5150 (aka “The IBM Personal Computer”).

(Yes. The IBM PC was created in Florida. That random little tidbit doesn’t get talked about much.)

The IBM 5150 Personal Computer

One of the engineers working on the BIOS of the 5150, David Bradley, implemented a three-key reset for the team within IBM (and partners such as Microsoft) to use during development.

A convenience feature that was never intended to see the light of day. Three keys that would quickly reset the entire machine without needing to do a hard “Power off and Power back on”.

That three-key combination?

Control-Alt-Escape.

“Sorry, Lunduke. You wrote that wrong. It’s Control-Alt-Delete. Not Escape.”

Not at first. In those early days, the key combination was “Ctrl-Alt-Esc”. That’s how the IBM 5150 was originally reset.

That, right there, is a good looking keyboard.

However, all three of those keys being on the left hand side of the keyboard made it too easy to accidentally bump.  You might as well have two "RESET" keys right next to each other (how crazy would that be?).

So the lead programmer of the project, Mel Hallerman, suggested changing “Escape” to “Delete” (which was on the complete other side of the keyboard). Thus making it much harder to accidentally hit.

And, just like that, Control-Alt-Delete was born.

It was not supposed to ship

Considering how instantly recognizable the "Three Fingered Salute" is nowadays, it seems wild to think that it was never intended for the public to even know about -- it was strictly for internal development purposes.

In fact, it barely received any development time at all according to the man who developed it.

“It was five minutes, 10 minutes of activity, and then I moved on to the next of the 100 things that needed to get done.” - David Bradley

David Bradley, the father of Ctrl-Alt-Del.  Photo credit: AP

All that changed when someone included the details of “Ctrl-Alt-Del” in the technical manuals for the IBM Personal Computer.

Here you can see it documented in the “IBM 5150 Guide to Operations” (where it is detailed not once… but three times):

Source: IBM 5150 Guide

At which point… the cat was out of the bag. Ctrl-Alt-Delete was documented and publicly known (and used) by a commercially successful computer.

There was no turning back now. It was a standard. Even if it was never intended to see the light of day.

And, to think, we were this close to having Ctrl-Alt-Escape instead. (Let’s just thank heavens we didn’t get stuck with the double RESET keys…)
