Lunduke
Comedy • Gaming • News • Science & Tech
Microsoft executives encourage employees to commit felonies... and then they get promoted.
(Including Hate Crimes, and criminal sexualization & sterilization of minors)
April 14, 2024

Recently, we learned some incredibly troubling things about Microsoft.

Namely that the Redmond software giant:

  1. Encouraged employees to "transition" their children -- as young as 3 years old -- to different genders.
  2. Implemented a non-optional health plan which specifically covers "gender affirming care" for small children.
  3. Supports removing "age restrictions for gender-affirming treatments for children under 18".

Let's put aside -- just for a moment -- any personal opinions we may have about the idea of "gender transitioning for toddlers".  Because, regardless of our own feelings on the topic... it remains, in the United States, illegal.

In fact, 22 states have passed laws which reinforce the fact that "gender transitioning" of minors is a crime.

Which brings us to a question worth pondering:  How could Microsoft -- a company with a quarter of a million employees (across all 50 states) -- actively encourage blatant, criminal activity?

Any functional Human Resources department would put a stop to that.  The same goes for executives, board members, or internal legal teams.  Any company advocating for their employees to commit crimes (regardless of our personal opinions on said crimes) is heading for disaster.

That is, quite simply, obvious.

So how on Earth is this happening?  Is no manager, executive, board member, or part of the HR team objecting to this?  Is the Microsoft management team really pushing for criminal activity?

Well buckle up, buttercup.  Because it turns out that at least one Microsoft executive has been publicly advocating for people to commit crimes (in general) for years.

And the Microsoft HR team has not only allowed it... but promoted him.

The VP who calls for criminal activity

Over the years, Microsoft Vice President Scott Hanselman has publicly encouraged people (including Microsoft employees) to commit crimes in the name of "Diversity".

Which is an incredibly vague thing to say.  "What crimes, exactly?  Just crimes... in general?!"

Yes.  In general.  Just any... "Crimes".  Seriously.  It's strange, I know.

Case in point, back in 2018, this Microsoft VP (then an upper level manager) gave a keynote presentation where he encouraged people to commit criminal activity -- specifically against white people and men.

During that keynote, Hanselman made the point that it is not enough to be an "ally"... that one must be willing to be an "accomplice".  Saying, specifically, "Accomplices will go to jail with you."

Just to be sure there was no confusion about what he was encouraging -- he created a slide which read, "To create an inclusive tech environment we need accomplices more than allies."

Source: Twitter

I happened to be in the audience on that particular day -- where Hanselman repeated, multiple times, the need for people to be ready to "go to jail", to be an "accomplice", and to "commit crimes"... against white males, specifically.

No ambiguity.  Not a joke.  A serious request for Microsoft employees (and other tech workers in attendance) to commit crimes against people based on ethnicity and gender.

Just so we're all on the same page, here's the definition of "accomplice":

Accomplice. Noun.

A person who knowingly helps another in a crime or wrongdoing, often as a subordinate.

Example: “an accomplice in the murder”

This singular keynote -- where Hanselman was representing Microsoft -- was not an anomaly.  This is a message that Hanselman has pushed for years.  Including on Microsoft corporate podcasts, where he said the following:

"And even that term ally is loaded.  I think of it more as like, Advocate.  Or maybe when appropriate, accomplice.  You know what I mean?  How far are you willing to take this?  You know what I mean?"

Be an "accomplice."  "How far are you willing to take this."  "Crime."  "Willing to go to jail."

Vague, to be sure.  But encouraging employees to commit crimes -- even vague crimes -- is something which HR departments (and executives) tend to frown upon.  Especially when it is done in such a publicly visible way.

So how did the executive team and HR department at Microsoft respond to these calls for criminal activity (towards individuals based entirely on race and gender... which veers into "Hate Crime" territory)?

They rewarded him with a promotion.  They made him a Vice President.

What does this tell us?

While this is merely one example of a Microsoft executive... there is quite a lot we can learn from this (when combined with previous revelations) regarding what is going on within the company:

  1. The Human Resources department within Microsoft either actively supports (at least some instances of) encouraging criminal activity... or they are unable (or unwilling) to stop them.
  2. The executive team at Microsoft must, at least in part, support the idea of employees committing criminal acts.
  3. As of this moment, Microsoft executives and management have actively encouraged Microsoft employees to commit:
    1. "Hate crimes" against White people or men.
    2. Criminal "Gender Affirming Care" of minors (as young as 3 years old).

These are extreme actions, on the part of Microsoft.  A pattern of encouraging employees to commit felonies (an act which, itself, is criminal).  All well documented and irrefutable.

What's more, this provides a glimpse of the upper management organization within Microsoft: including their priorities... and determination to push certain causes (including "Diversity" and sex changes for toddlers).

In the words of one Microsoft Vice President, "How far are you willing to take this?  You know what I mean?"

The Lunduke Journal has repeatedly reached out to representatives of Microsoft for comment.

To date, no response has been given.

If Microsoft responds, The Lunduke Journal will publish that response in full.


The Lunduke Journal will continue publishing material -- including additional leaks from whistleblowers -- exposing the actions of Microsoft (and other Big Tech firms).

If you work for Microsoft (or another Tech firm), and have inside information that you feel should be shared with the public, here are instructions on how to anonymously become a whistleblower. The Lunduke Journal will always keep your identity confidential.

Thank you to the whistleblowers who have come forward so far -- and thank you to the supporters of The Lunduke Journal for making this work possible.

Interested in other recent Big Tech leaks?  Check out The Lunduke Journal's exclusive leaks from within Red Hat and IBM.

What else you may like…
The War for Linux

Widespread discrimination based on Ethnicity, Religion, & Politics across the Linux World. Red Hat, IBM, The Linux Foundation, GNOME, elementary, Linux Mint, and more are involved -- bullies working to exclude those they don't like.

They are at war against the very soul of the Linux and Open Source world.

This is the first part in a series of shows and articles. I'm going after these bullies.

On the Z-80 Holborn Computers

Remembering the (very) funky Holborn computers of the early 1980s.

The full article: https://lunduke.locals.com/post/5588902/1950s-sci-fi-style-computers-powered-by-a-z80-built-in-holland

On The History of Screensavers: 1961 - 1990

From Sci-Fi novels and Atari... to old Macs and Flying Toasters.

The full article: https://lunduke.locals.com/post/5588984/the-definitive-history-of-screensavers-1961-1990

November 22, 2023
The futility of Ad-Blockers

Ads are filling the entirety of the Web -- websites, podcasts, YouTube videos, etc. -- at an increasing rate. Prices for those ad placements are plummeting. Consumers are desperate to use ad-blockers to make the web palatable. Google (and others) are desperate to break and block ad-blockers. All of which results in... more ads and lower pay for creators.

It's a fascinatingly annoying cycle. And there's only one viable way out of it.

Looking for the Podcast RSS feed or other links? Check here:
https://lunduke.locals.com/post/4619051/lunduke-journal-link-central-tm

Give the gift of The Lunduke Journal:
https://lunduke.locals.com/post/4898317/give-the-gift-of-the-lunduke-journal

November 21, 2023
openSUSE says "No Lunduke allowed!"

Those in power with openSUSE make it clear they will not allow me anywhere near anything related to the openSUSE project. Ever. For any reason.

Well, that settles that, then! Guess I won't be contributing to openSUSE! 🤣

September 13, 2023
"Andreas Kling creator of Serenity OS & Ladybird Web Browser" - Lunduke’s Big Tech Show - September 13th, 2023 - Ep 044

This episode is free for all to enjoy and share.

Be sure to subscribe here at Lunduke.Locals.com to get all shows & articles (including interviews with other amazing nerds).

Buckle up...

Not really nerdy, but something I found historically interesting. Stopped at a Walmart that’s still in the 2000s-2010s design. It’s getting a remodel to be in line with the new layouts, but getting out of my car I noticed the paint was peeling off a part of the wall outside, revealing the old stripes from the 90s. Not sure why, but I find that kind of thing cool.


Fireship media predicts the name of the next MacBook chip.

Atari Coin Executive -- The Open Source Video Game Arcade management system... from 1982
Powered by an Atari 800. Plus a handheld 6507 computer. And, not kidding, it really was open source.

1982 was a big year for Atari video arcades — with the release of such classics as Gravitar, Millipede, and Space Duel (complementing the already massive number of popular Atari games filling video game arcades).

In order to make the management (and, primarily, the accounting) of video game arcades easier — and more future-y — Atari developed and released the “Atari Coin Executive”.

And it is incredibly cool.

I wouldn't mind having that desk.

The central brain of the Atari Coin Executive was an Atari 800 computer (with 48k of RAM) with a number of accessories, including:

  • 2 x Atari 810 Disk Drives

  • An Atari 850 Interface Module (which added RS232)

  • An Atari 825 printer

  • An Amdek 13 inch color monitor

The Atari 800. Ain’t she pretty?

How the Atari Coin Executive worked was both simple… and, at the same time, incredibly cool.

I kinda want to set up an arcade... just so I can use the Atari Coin Executive.

The basic process:

  1. A “Coin Monitor” is installed in the coin slot of every arcade game.

  2. Each Coin Monitor is connected back to the Atari Coin Executive workstation (that Atari 800) via “telephone type wiring”.

  3. The arcade manager can then use that Atari 800 to see how much each game is earning.
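The bookkeeping the Atari 800 did here is simple enough to sketch in a few lines of Python. To be clear, this is purely illustrative — the game names, coin counts, and price-per-play are all invented, not taken from the actual Coin Executive software:

```python
# Hypothetical sketch of the Coin Executive's bookkeeping: each Coin
# Monitor reports coin drops for one arcade machine, and the central
# workstation tallies earnings per game.

COST_PER_PLAY = 0.25  # assuming a quarter per credit -- illustrative only

def tally_earnings(coin_reports):
    """Sum reported coin drops per game into dollar earnings."""
    earnings = {}
    for game, coins in coin_reports:
        earnings[game] = earnings.get(game, 0) + coins * COST_PER_PLAY
    return earnings

reports = [("Millipede", 120), ("Space Duel", 95), ("Millipede", 40)]
print(tally_earnings(reports))  # {'Millipede': 40.0, 'Space Duel': 23.75}
```

The real system, of course, did this on 48K of RAM and a pair of floppy drives.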

Screenshot of the Coin Executive main menu

Fun fact: The Atari Coin Executive software was open source and written in a combination of BASIC and Assembly. Or, as Atari put it in 1982: “In Basic and 6502 Assembler - Source listings and manual supplied”.  You can find images of the Atari Coin Executive software over on the AtariAge Forum.

In addition to the above mentioned setup, the Atari Coin Executive also included a handheld computer called the “Data Recorder”.

It's a 1982 Atari handheld!  Sort of!

The “Atari Coin Executive Data Recorder” was powered by a MOS 6507 CPU with 16K of RAM (eight 2K chips), and communicated with the Atari Coin Executive computer via 300 baud serial. It even had a small built-in printer.

This allowed people to manage several arcades, in separate locations, by:

  1. Plugging the Data Recorder into each arcade machine equipped with a Coin Monitor.

  2. Then taking the Data Recorder back to the Coin Executive computer and downloading the data into the Coin Executive software.
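That collect-then-download workflow — gather readings offline at each location, then merge them into the master ledger back at the office — maps neatly onto a short Python sketch. Again, entirely hypothetical: the location names, game names, and data layout are invented for illustration:

```python
# Hypothetical sketch of the Data Recorder workflow: readings are
# collected offline at each arcade, then folded into the Coin
# Executive's master ledger when the recorder comes back.

def merge_readings(ledger, readings):
    """Fold one Data Recorder's per-machine coin counts into the ledger."""
    for (location, game), coins in readings.items():
        key = (location, game)
        ledger[key] = ledger.get(key, 0) + coins
    return ledger

ledger = {}
downtown = {("Downtown", "Millipede"): 200, ("Downtown", "Gravitar"): 150}
mall = {("Mall", "Millipede"): 310}
for trip in (downtown, mall):
    merge_readings(ledger, trip)
print(ledger)
```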

Finally, here’s a color picture of the whole setup — including the custom desk which was used for the Coin Executive.

Fern not included.
The 1948 precursor to the hard disk
A rotating brass drum with a magnetic coating... inspired by a voice dictation machine.

Floppy disks. Zip disks. Hard disks.

These sorts of spinning, magnetic storage media have been critical to several decades of computers. It’s almost hard to imagine the computers of the 1970s, 80s, and 90s without floppy and hard disks (and other magnetic drives).

But how, exactly, did they come into existence?

Let’s take a quick look at the very first of such devices… and their inspiration.

1946 - The Mail-a-Voice

We’ll begin our journey with the 1946 release of the Brush Mail-a-Voice.

The Mail-a-Voice.  Be Honest.  You want one.

A truly fascinating device, the Mail-a-Voice looked like a phonograph… except it used a paper disc that was coated in magnetic material. You could then record up to 3 minutes of audio on a single paper disc (which would spin at 20 rotations per minute)… and then fold the paper disc up and mail it to someone inside a standard envelope.

Thus the “Mail-a-Voice”.

This device didn’t store computer data itself -- it was only for audio -- but it did inspire engineers who were working on cheap data storage for computers…

February, 1948 - Andrew Booth’s Spinning Drum

On a 1947 trip to the USA, Andrew Booth (who was working on his own computer design) had the chance to see the “Mail-a-Voice” in action.

Since Booth needed a good, inexpensive storage medium for his computer… he attempted to build a similar device using a flat, paper, magnetic disk. What was, essentially, a first attempt at what we would now call a “Floppy Disk”.

Unfortunately, it didn’t quite work. Booth found that he needed to spin the paper disk quite a bit faster in order to make it viable as a data storage mechanism… and he had a hard time keeping the paper disk flat.

In fact, as Booth upped the RPM to 3,000 — which is what he determined he needed — the paper disk itself started to disintegrate. Booth would later comment:

“I suppose I really invented the floppy disc, it was a real flop!”

So he abandoned that approach and, instead, decided to use a brass drum. Why brass? Because brass is a bit less likely to disintegrate than… paper.

It's brass, baby!  BRASS!

This system worked. His brass, rotating drum (with a magnetic coating), had a 2 inch diameter and could store 10 bits per inch.

Yeah. 10 bits. Per inch.

Not exactly massive amounts of storage. But, hey, it was a start! And it didn’t disintegrate! Huzzah!

Improving the magnetic drum

With the first prototype working, Booth set about improving his magnetic, rotating drum storage device. The final version ended up being able to store 256 words of either 20 or 21 bits each (different sources cite different values, and there does not appear to be consensus on whether the words were 20 or 21 bits).

In modern terms: This would be equivalent to roughly 5 kilobits of data storage.  Give or take.
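That “roughly 5 kilobits” figure follows directly from the numbers above — a quick back-of-the-envelope check in Python:

```python
# Capacity of Booth's improved drum: 256 words of either 20 or 21 bits.
words = 256
for bits_per_word in (20, 21):
    total_bits = words * bits_per_word
    print(f"{bits_per_word}-bit words: {total_bits} bits")
# 20-bit words: 5120 bits
# 21-bit words: 5376 bits
```

Either way, it rounds to about 5 kilobits.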

This storage drum was put to use on the ARC (the Automatic Relay Computer).

Booth working on the Automatic Relay Computer.

When all was said and done, the ARC could utilize that storage drum to handle 50 numbers and could load a program consisting of 300 individual instructions.

It wasn't exactly a “Hard Disk Drive”… more of a “Hard Drum Drive”.

Either way… Pretty darn cool for the 1940s.

1956 - The First “Hard Disk Drive”

Over the years that followed, this idea was refined and improved. The rotating drum was abandoned for hard, magnetic platters — ones sturdy enough to handle much higher RPMs (certainly much sturdier than paper!)... thus allowing faster data access.

These improvements eventually led to the 1956 release of the IBM Model 350 Disk Storage Unit for the IBM 305 RAMAC computer.

IBM Model 350 Disk Storage Unit

The Model 350 Hard Disk Drive, in a base configuration, could store roughly 3.75 MB — all contained in a cabinet 5 feet long, 2 1/2 feet deep, and 5 1/2 feet tall — with platters spinning at 1,200 RPM.
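Those two capacity figures — the Model 350’s 3.75 MB and Booth’s roughly-5-kilobit drum — make for a fun comparison. A rough calculation (assuming 8 bits per byte for the MB figure, purely for illustration):

```python
# Rough comparison of the 1956 Model 350 to Booth's 1948 drum.
model_350_bits = 3.75e6 * 8   # 3.75 MB, assuming 8-bit bytes
booth_drum_bits = 256 * 20    # Booth's drum, ~5 kilobits

print(model_350_bits / booth_drum_bits)  # ~5859 -- nearly 6,000x the drum
```

In eight years, capacity grew by a factor of nearly six thousand.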

And all thanks to a voice dictation device built for mailing 3 minutes of audio on a folded-up piece of paper.

Who (really) created the "Byte"?
And what is the REAL definition of the term?

Kilobytes (KB). Megabytes (MB). Gigabytes (GB).

We use these storage measurements every single, gosh-darned day. And most of us feel like we know exactly what they mean. But do we really?

Do we really — truly — know what a “Byte” is… and its origin? I mean… who came up with the term “Byte”, anyway?

Let’s take a moment to look over the history of the term. If for no other reason than to feel smarter than most other nerds.

What is a “Byte”?

If you ask Mr. Google, a Byte is exactly 8 Bits.

Mr. Google wouldn't lie... right?

Ok. Great. 8 Bits = 1 Byte.

So what is a Bit?

That part is simple.

A Bit is the smallest unit of information for a digital computer. A Bit can have two possible values… 0 or 1. It is a boolean. A binary.
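A quick Python sketch makes the bit/byte relationship concrete (nothing here beyond standard Python):

```python
# A bit holds one of two values: 0 or 1.
# A modern (8-bit) byte is a group of 8 bits, so it can represent
# 2**8 = 256 distinct values (0 through 255).
BITS_PER_BYTE = 8

print(2 ** BITS_PER_BYTE)    # 256
print(format(255, '08b'))    # 11111111 -- all 8 bits set
print(format(5, '08b'))      # 00000101 -- the bits for decimal 5
```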

Many people believe “Bit” is short for “Bite”. You find this in many computer history books. This little tidbit has been repeated so often, many believe it. However, like many such oft-repeated anecdotes in computing… it’s hogwash.

In fact, “Bit” is a contraction of “Binary Digit”. Squish that phrase together and you get “Bit”.

Fun factoids about the origin of the “Bit”

The first usage of the word “bit”, when talking about a type of data in reference to computing, was by Vannevar Bush. He published an article entitled “Instrumental Analysis” in the October, 1936 issue of the “Bulletin of the American Mathematical Society”. In it he used the phrase “bits of information” when talking about punch cards.

However…

“Bit” was commonly used in Middle English to refer to “a mouthful” or a “morsel” of food. (This is the origin of why many believe “Bit” is short for “Bite”… even though it isn’t.) As such, Vannevar Bush may not have actually been thinking about a “Bit” as a “Binary digit”… instead he may simply have thought “this is a morsel of data”. Also worth noting… Bush never actually defines what a “bit” is. Making it likely that he was simply using the word “bit” in the Middle English way.

The first — distinctly verifiable — usage of “Bit” in this way is by John Tukey. From “A Mathematical Theory of Communication”, written by C. E. Shannon in 1948:

“The choice of a logarithmic base corresponds to the choice of a unit for measuring information. If the base 2 is used the resulting units may be called binary digits, or more briefly bits, a word suggested by J. W. Tukey. A device with two stable positions, such as a relay or a flip-flop circuit, can store one bit of information.”

There you have it. More information about the origin of the term “bit” than you ever wanted to know.

You’re welcome.

Ok. Great.

So, in short, a Bit is a 0 or 1. And a Byte is a group of 8 Bits. Easy.

Not so fast there, sport!

While the Byte being 8 Bits is commonly accepted today… that was not always the case. Not by a long shot!

In fact, there are two competing stories for who created the term “Byte”… and neither of them referred to a set of 8 Bits!

Seriously!

Werner Buchholz’s 6 Bits

The most often cited creator of the term “Byte” is Werner Buchholz — who used the term, in 1956, to refer to a grouping of 6 Bits while working on the IBM Stretch supercomputer.

Man sitting at IBM Stretch console. Image source: computer-history.info.

A “6 Bit” Byte was common in those days. In fact, Braille was a 6 Bit encoding of characters for the blind. And many of the early computers (from IBM and others) used 6 Bit groupings to encode character data.

6 Bits -- not 8 Bits -- per Byte.

However (you knew there had to be a “however”)…

Louis G. Dooley’s N Bits

Around that same time (1956 or so), Louis Dooley first used the word “Byte” to refer to an undefined grouping of “Bits”. But it was typically used to mean 4 Bits.

That's right.  Not 8 Bits.  Not 6 Bits.  But 4 Bits.

Dooley published the following letter in BYTE magazine:

“I would like to get the following on record: The word byte was coined around 1956 to 1957 at MIT Lincoln Laboratories within a project called SAGE (the North American Air Defense System), which was jointly developed by Rand, Lincoln Labs, and IBM. In that era, computer memory structure was already defined in terms of word size. A word consisted of x number of bits; a bit represented a binary notational position in a word. Operations typically operated on all the bits in the full word.

We coined the word byte to refer to a logical set of bits less than a full word size. At that time, it was not defined specifically as x bits but typically referred to as a set of 4 bits, as that was the size of most of our coded data items. Shortly afterward, I went on to other responsibilities that removed me from SAGE. After having spent many years in Asia, I returned to the U.S. and was bemused to find out that the word byte was being used in the new microcomputer technology to refer to the basic addressable memory unit.

Louis G. Dooley
Ocala, FL”

So… what the heck is a “Byte”?!

That’s right. We now have two very, very different definitions for the word “Byte”. Both creations of the word happened independently… and at almost the exact same moment in time.

  • The “Buchholz Byte” - A grouping of 6 Bits.
  • The “Dooley Byte” - A grouping of an undefined number of bits, less than a full word size. Often used to describe 4 Bits.

You’ll note that neither of these definitions — from the men who created the term — has the number “8” in it.

The shift towards 8 Bits per Byte started to happen in the 1970s… with the development and growing popularity of 8-Bit processors, such as the legendary Intel 8008.

A revision of the Intel 8008 CPU

Interestingly, some of those early 8-Bit CPUs had specific functions for handling 4-Bit chunks of data. Because, up until that point, 4 and 6-Bit “Bytes” were incredibly common (including in the predecessor to the Intel 8008… the 4-Bit Intel 4004).

Fun Factoid: Nowadays a 4-Bit group is called a “Nibble”. Which is adorable.
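The nibble shows up naturally in code whenever you split a byte into its high and low 4-bit halves — a minimal Python illustration using shifts and masks:

```python
# Split an 8-bit byte into its two 4-bit nibbles.
def nibbles(byte):
    high = (byte >> 4) & 0x0F   # top 4 bits
    low = byte & 0x0F           # bottom 4 bits
    return high, low

print(nibbles(0xA7))  # (10, 7) -- the nibbles 0xA and 0x7
```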

For quite some time the term “octet” or “octad” was used to denote 8 Bit groups. At some point along the way, most people phased that out as well… simply referring to all “groups of bits” as a “Byte”. Though you will still find “octet” used here and there, especially when talking about various network protocols.

All of which means…

Dooley invented the modern “Byte”… not Buchholz

While many writers, enthusiasts, and computer historians are quick to say that Werner Buchholz coined the term “Byte”… they are obviously mistaken.

Besides the fact that it’s hard to discern who (Dooley or Buchholz) actually used the term first… the Buchholz definition is no longer used at all in modern computing.

The Buchholz definition is specific. 6 Bits. Which modern computing has determined is not the number of Bits in a modern Byte.

The Dooley definition, on the other hand, allows for wiggle room. Which means that an 8 Bit “Byte” would fit the Dooley definition. But not the Buchholz.

The facts are clear: Louis G. Dooley created the word “Byte”. At least as it has been used for the last 40+ years.

But Buchholz — an absolute legend in the computing world — gets one heck of an Honorable Mention trophy.
