Chris X Edwards

--------------------------

My Crazy Microsoft Assertion

2016-08-18 09:15

I have previously written about Microsoft’s excellent transformation into a respectable member of our technological civilization. The trend continues with the news that "Microsoft Is Open-Sourcing PowerShell for Linux, Macs". Once again, I congratulate the Nadella Dynasty Microsoft for being intelligent and absolutely doing the right thing. Good job!

The technical details of Microsoft’s latest display of humility are not something I need to elaborate on. Today’s news provides a good perspective for understanding a radical assertion I’ve made now for at least a decade.

Microsoft has retarded computer technology by 20 years.

Because this is so counterintuitive to most people and the explanation is complex, I’ll try to make my case for this idea here so that I can avoid future long discussions on the topic which, frankly, I’m as tired of as my regular readers must be.

One reason my premise is so shocking is that most people believe the complete opposite. I think this has a lot to do with the cult of Microsoft’s original Khan. When I hear Bill Gates praised as a great personage I cringe. Bank robbers can be philanthropists too, often on the advice of their tax accountants. I think Bill Gates needs to be judged on what he did to get the money, not how he’s spending it.

This completely typical assessment of Bill Gates lists the following "five greatest achievements".

  1. Inspiring the era of the home computer

  2. Commercializing the operating system

  3. Windows

  4. Becoming the richest man in the world

  5. Giving his money away

Although I consider it more cancerous than praiseworthy, number 4 is correct enough. But with Gates still the richest man in the world today by far, number 5 looks a bit dubious.

I am horrified by the shockingly entrenched but false idea of number 1. In my opinion Bill Gates inspired nothing having to do with home computers. If you want the person who inspired the era of home computing, Steve Wozniak owns that like Gates owns the Forbes list of billionaires. Even Steve Jobs deserves way more credit. Woz demonstrated that home computers were technically possible and Jobs correctly sensed that the devastating social stigma associated with them could be scraped off in time. I don’t even give Gates credit for "inspiring the era of the office computer". Clearly that was IBM’s inspiration all along.

It is true that Bill Gates was at the helm when Microsoft produced Windows. But is this genius? Or even benign? Despite having no preconceived bias at the time, I found Windows (3.1) to be a complete usability nightmare. My epiphany came when I realized that with Windows, I couldn’t do extremely basic things that I could do 15 years earlier using a Wozniak-designed computer. A poignant example for me was that Windows lacked a native (to the OS) way to display graphical images. The list of frustrations was long, and with Windows 95 the list was enlarged as much as it was reduced. But does Microsoft Windows really explain why computers are advanced now in a way that they otherwise wouldn’t have been? As much as I hate hokey 2-D GUI metaphors based on office equipment, it certainly isn’t right to let Microsoft have any credit for pioneering them. Those terrible anachronistic metaphors are office related because they came from an office equipment company, Xerox. Note that Alan Kay and Xerox created their Alto GUI computer 19 years before Windows 3.1 was released. See what I mean about 20 years? As with almost everything Microsoft did, their GUI OS interface was not some kind of inspired protean innovation. It may be remembered as such because it was ultimately so dominant and destructive to other potential innovations.

The really important item on this list of Bill Gates' alleged accomplishments is the commercialization of the operating system. But "commercialization" is a bit too euphemistic. Red Hat and others have commercialized a free and open source operating system, but that business model hardly allows all competition to be utterly obliterated. What Microsoft did that was truly historic, and this returns to my original premise, is that they seized almost total control over how humanity created and exchanged information. Whether Microsoft understood that themselves or not, they have certainly come much closer to achieving that goal than any entity in history. Computers have become the dominant communication tool of our species. With exclusive unilateral control of and access to the system that manages the computer itself, Microsoft came dangerously close to ruling all information.

And this was bad. I tend to agree with Lord Acton who believed, "Despotic power is always accompanied by corruption of morality." Microsoft may have thought of itself as a benevolent dictator with the best of intentions, but after some serendipitous success, once network effects had eliminated their competition, I feel they focused primarily on stifling positive innovation that could have threatened their dominance. And they were good at that. An example: there used to be dozens of word processors in what had been a competitive market. The way humanity luckily escaped this oppression was by moving most business from the direct control of the OS to the mediated control of the web browser, a move Microsoft strongly opposed at the time.

Not only was Microsoft working hard to limit computing choices but those choices were degenerating too. Since every person who needed to communicate would need a computer and every person who needed a computer would need a Microsoft OS, the main focus of Microsoft that might generously be seen as prosocial involved introducing more ordinary people to computers. Since they didn’t have to compete with any better approaches they decisively moved toward the lowest common denominator. As my computer agenda became more technical and serious, Microsoft was doing everything possible to infantilize computer use. In that they definitely succeeded.

It is for these reasons that I believe Microsoft has set us back twenty years. The reason I’m writing this today is to justify the full two decades. OS X was released 15 years ago. I consider this a milestone indicating Apple’s return to competent computing and a competitive marketplace. OS X has always had Unix. There was a choice of shells (the original default was tcsh), SSH client/server, and all the wholesome Unix tools that every sane system should include. Microsoft is just announcing that they’ve seen the light and, 15 years later, they’re working on catching up. Maybe in a couple of years they will. That still leaves a few years of my twenty unaccounted for. Well, OS X may have demonstrated that a mainstream popular OS could be competent 15 years ago, but Linux was comfortably doing it many years earlier. I’m standing by my time frame.

Calculus Is A Weird Anachronism

2016-08-09 19:18

Here’s a parody of a calculus problem for you.

dQ/dt = du/dt - di/dt + M

I don’t know how to solve it but I know enough to know it’s not really a proper calculus problem. In this equation Q is quality of life, u is the utility of calculus, and i is the investment one makes in developing a calculus proficiency sufficient for u. M is the intrinsic motivation to learn and be knowledgeable about calculus; mine is used up! Although this equation is quite silly, it parallels all real-world textbook problems involving calculus by distorting the situation into an absurd simplification.

I consider this equation a very dubious justification for the extraordinary emphasis placed on calculus in the educational system I was (and still am) a part of. For me, calculus was no small investment (i). Between high school calculus and a university engineering degree, I was studying calculus for about 3 solid years. And although I don’t have a typical engineering career, I am horrified that I have put my calculus training to good practical use exactly zero times in my life (u). The reason for this, perhaps the reason I intuitively let myself not be as proficient at calculus as possible, is that I believe that calculus is never essential if you have access to a computer. And if you do have access to a computer (which includes everybody reading this), calculus is actually irritatingly counter-intuitive because it implies some wrongish things about how to best model the world (an assumption of an analog world, for example). I’m not talking about using a computer instead of calculus to "get answers" the way a pocket calculator (app?) can do basic arithmetic. A pocket calculator does not replace the need to understand arithmetic but numerical solutions with a computer do obviate the need to understand calculus.
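To make this concrete, here is a minimal sketch (my own illustration with an arbitrarily chosen function, not anything from a textbook): a dozen lines of plain Python differentiate and integrate a function numerically, no calculus proficiency required.

  # Numerically differentiate and integrate f(x) = x*x with no
  # symbolic calculus at all. Plain Python, no libraries.
  def f(x):
      return x * x

  def derivative(f, x, h=1e-6):
      # Central difference approximation of f'(x).
      return (f(x + h) - f(x - h)) / (2 * h)

  def integral(f, a, b, n=100000):
      # Midpoint rule approximation of the area under f from a to b.
      dx = (b - a) / n
      return sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx

  print(derivative(f, 3.0))     # ~6.0; the exact answer (2x) is 6
  print(integral(f, 0.0, 3.0))  # ~9.0; the exact answer (x**3/3) is 9

These are the crudest possible numerical methods and they still nail both answers to many decimal places.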

Hokey religions and ancient weapons are no match for a good blaster.

Apologists argue that calculus may not be super useful, but that the equation above comes out close enough to even that a radical restructuring of education wouldn’t be worth the quibble. The problem with this equation, however, is that it is missing a very important term. Here’s the better version.

dQ/dt = du/dt - di/dt - dc/dt + M

Here c represents opportunity costs. In less economic terms, critics of calculus must answer the question, "What should we be teaching/learning instead?" There is no argument that calculus is useful. It is, just not very. This implies there are better ways to spend our time. There are, many. Besides potential engineers and physicists, I can’t think of any reason for high school students to learn calculus that is better than the reasons to learn the following things.

  • Vocational (machining, welding, plastics, ag) - Even if you aren’t going to fix your own car or work in a factory, understanding the foundations of the most real parts of our civilization can’t possibly be a complete waste of time. If I had to relinquish either my machine shop apprenticeship or my university engineering education, I would jettison the latter.

  • Language History - This is commonly called "Latin", but really learning some Old English and Latin and how English came to be the way it is (throw in King James, Shakespeare, etc.) turns out to be extremely useful in building solid communication skills. Many non-English-speaking cultures now teach their kids English well enough to smoothly participate in the Anglosphere. Language history would help English speakers preserve an edge. For the same reason, I would mention enhanced grammar study as being much more useful than calculus, but I understand that, unlike the colorful history of language, it’s only slightly more interesting.

  • Art - Did you know that the BLS predicts that by 2024 "Arts, design, entertainment, sports, and media occupations" will increase by 4.1% (source)? Despite wasting so much of my life learning calculus I have enough math sense to notice that because the percent increase in total occupations is 6.5%, this is actually a per capita net loss. So we really should be focusing on engineering, right? Uh… No. That is projected to grow at only 2.7%. That’s 50% more of a reason to support the arts over engineering. The US is a world leader in design, fashion, fine art, graphic art, digital art, performing art, movies, video games, photography, cuisine, typography, and sports. All despite calculus. We could reinvest in our waning art culture or leave it to other cultures to take over.

  • Music - And if American art has been a grand success, America has been to music what the 13th century Mongols were to Asia. Calculus is probably best learned well into adulthood if the need arises. Music is best learned when young. We neglect our true cultural legacy at our peril.

  • Home Economics - Are our real problems today a lack of people who can derive formulae for ballistics trajectories using 18th century techniques? Or is it that as a species we’re all becoming depressingly unhealthy and fat? Empowering high schoolers to make better decisions about food would surely pay society back far more than calculus. Or maybe offer kids the option of an hour of running or an hour of calculus lessons - obesity epidemic cured!

  • Personal Finance - How about helping hapless high school seniors out with the facts of life about debt before they take on those predatory student loans? And credit card debt, payday loans, adjustable rate mortgages, etc. To give them calculus instead is a shameful dirty trick.

  • Statistics - Proponents claim calculus is good mental exercise for later skills in technical fields that are essential. They also say the history of calculus is important for understanding modern technology. That’s fine, but the warm-up doesn’t also have to be useless and disorienting. I have a lot of problems with statistics as it’s normally taught (I had almost 2 solid years of that), but even if it’s all completely bogus, it’s still topical and essential for engaged discourse. Who knows, maybe if we treat it seriously we’ll produce an Einstein-type figure who will revolutionize the field and create a paradigm shift appropriate for the modern uses (e.g. quantum physics) which Bernoulli, Laplace, Gauss, and other early pioneers had no intention of contributing to.

  • Linear Algebra - "Think of the engineers!" cry the calculus apologists. Surely they need all this "math" just for good practice and it just might be useful. Bollocks. If we care about that, linear algebra is the way to go. If you’re an engineer in 2016 and you’re using Newtonian calculus way more than linear algebra, you’re doing it wrong. In fact, if you’re doing something serious and you’re not using linear algebra to do your Newtonian calculus, you’re probably doing that wrong (see the first sketch after this list). Linear algebra has the delightful bonus property that it teaches itself to many adolescents (and adults) - "Hey, who’d like to make a 3D video game?"

  • Numerical Analysis - Less interesting than linear algebra but way more useful than calculus is numerical analysis. If we’re compulsively fetishizing the cult of personality of Sir Isaac then numerical analysis is ideal.

  • Information Theory - "No, no, we need calculus because $BetterAlternative is too easy." If this is your feeling, that high school kids need to be tormented with a weird subject that is incomprehensible to normal people, then information theory fits the bill; it is, ironically, hard to communicate to students. Honestly, it’s not a great idea to add information theory to a standard high school curriculum but it would be much better (useful/interesting) than calculus! If we could just broadly teach people that "password" is not a good password, it’d be a good trade.

  • Computer Science/Programming - It may not have been obvious in 1953 when the first transistorized computers were built that we, as a society, should immediately scrap calculus and instead focus on these miraculous new tools. But it is obvious now! You may never need to know how a linked list works, but if you’re reading this, you’re using a profusion of them right now (see the second sketch after this list). And that’s esoteric computer science mumbo jumbo that may or may not be useful. The starkly obvious real-world potential utility of computers that goes untapped because of a lack of education is pathological.

  • Philosophy/Ethics - A common justification for calculus (here’s one) is that it helps "teach people to think", including logic, problem solving, etc. I believe that if we have to disguise the study of philosophy as calculus (Newton and Leibniz did the reverse by the way) then that itself is terrible philosophy and proof that we have a problem. Just teach philosophy! It’s worth it!
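A couple of the items above beg for something concrete, so here are the two promised sketches (my own minimal illustrations with arbitrary sizes and test functions, not anything canonical). First, linear algebra doing Newtonian calculus: differentiating a sampled function is nothing more than a matrix-vector product.

  # Linear algebra doing calculus: differentiation of a sampled
  # function as a matrix-vector product.
  import numpy as np

  n, h = 100, 0.01
  x = np.arange(n) * h   # sample points 0.00, 0.01, 0.02, ...
  y = np.sin(x)          # the sampled function

  # Central difference differentiation matrix (interior points only).
  D = np.zeros((n, n))
  for i in range(1, n - 1):
      D[i, i - 1] = -1.0 / (2 * h)
      D[i, i + 1] = 1.0 / (2 * h)

  dy = D @ y             # all the "calculus" happens right here
  # Interior entries of dy match the true derivative, cos(x):
  print(np.allclose(dy[1:-1], np.cos(x)[1:-1], atol=1e-4))  # True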
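Second, the linked list, the entirety of which fits in a few lines (the Node class and its values are toy names and data of my own choosing):

  # A complete singly linked list: each node holds a value and a
  # reference to the next node.
  class Node:
      def __init__(self, value, next=None):
          self.value = value
          self.next = next

  head = Node(1, Node(2, Node(3)))  # build the chain 1 -> 2 -> 3

  node = head
  while node is not None:           # walk the chain, printing values
      print(node.value)
      node = node.next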

Just as we have stopped beating kids with the lash (not sure about Texas), sometimes a society needs to accept that it’s on the wrong track and abandon the cultural script that mindlessly prescribes suboptimal practices. Although there is gathering momentum for calculus reform, by speaking out against this educational hazing I’m doing what I can to break with our obsolete past. In an ideal world everyone would learn calculus - right after the thousand other worthier subjects.

Review: Misbehaving

2016-07-15 16:57

Review of the book "Misbehaving - The Making of Behavioral Economics" by Richard H. Thaler.

I have complex feelings about the field of economics. Note that I didn’t say the "science of" economics. I pretty clearly don’t believe that word is appropriate, "dismal" or not. Formal study of economics was a requirement for my university degree, and since those innocent beginnings I’ve had serious philosophical doubts about the field. I remember thinking at the time that I should be able to convert the knowledge I was receiving in an economics class directly into cash. Either that or what exactly is the point? I have taken engineering classes which I felt I could indeed convert directly into a conceptual design for a reasonably safe bridge. I don’t feel this way about economics at all. At its best I feel there may be elements of truth in the field’s orthodoxy but to me it mostly seems like astrology, which also contains elements of truth and also has not made me a penny richer.

I remember sitting in those economics classes looking at supply/demand curves etc. and getting an unsettling sense that they were tacitly doing something the culture of physics is quite honest about: they were talking about spherical cows in a vacuum. I think that physicists don’t mind this because such simplifications often lead to a deeper understanding of the situation. Physicists gloat over the fact that real cows falling through the earth’s dense atmosphere can actually be approximated by an amalgamation of spherical cows in a vacuum. This sort of thing is true for computer science too.

Unfortunately for economists who styled their art after such thinking, this is not how economics works. Useful economics must primarily ask questions like, "Why the hell would someone throw a cow from an airplane?" Only in the last few decades has economics started to turn to such questions. It’s been a radical transformation. So radical that economics has bifurcated into "economics" and "behavioral economics". I believe it’s only a matter of time before "behavioral economics" changes its name to "economics" and the old economics is forgotten like the old geology that couldn’t acknowledge plate tectonics for half a century despite overwhelming evidence.

This book is basically the inside story of this transformation from the perspective of one of its leading revolutionaries. Thaler does seem about as well qualified as anyone to tell the tale. First, not only can he appreciate that real human beings are generally the participants in economic activity, he also knows that real humans do not like reading the kind of prose normally written by economics professors.

Richard Thaler also seems to be one of the most important economists in the field of behavioral economics. If any one person created the field it would be him. (Of course I just read a history of the subject written by him so maybe I’m biased.) The only people perhaps more important to the field are Tversky and Kahneman who, despite Kahneman’s Nobel prize in economics, are ostensibly and revealingly not economists.

The book is a superb introduction to behavioral economics. Beyond the interesting history of this new field, it contains many fascinating insights into how markets behave. Or don’t. It contrasts the new thinking with old ideas like the "efficient market hypothesis" of traditional economics. Here are Thaler and the EMH’s creator, Eugene Fama, having an excellent discussion about it. The EMH basically says if you see $10 lying on the ground it must be an optical illusion because if it were real, someone would have already picked it up. I am sympathetic to the idea that beating the market (i.e. reliably finding unclaimed money) is extremely hard and perhaps impossible. But I believe that whether it’s possible or impossible, the reasons mostly involve the problems of irrational thinking. It’s difficult for me to conceive of how a steaming load of irrational thinking can create efficient markets (as Fama allows may be possible).

I think what I like about this book is that it really nicely chronicles the field of economics going through a serious philosophical rethink. I am more interested in the philosophy of the topic and I see many parallels with my main philosophical interest, probability. Not only are humans easily tricked into believing in obviously incorrect future outcomes, but how we even think about the concept of predicting future outcomes is extremely tenuous. This business about economics (behavioral or Newtonian) lacking predictive power is obviously important. It may be that economics can be defined as the philosophy of looking at a world in which, whenever a reliable rule to predict the future is discovered, the rules instantly change. I have to say I lean that way, but obviously not entirely. If I did, it would be utterly pointless to even think about economics.

Microsoft Linux 2

2016-07-07 09:05

A while back, I pointed out that Microsoft had quietly created their own Linux distribution (as an offered VM image in Azure). Although I’m a little late noticing this, it seems Microsoft has doubled down and truly created their own distribution which El Reg rightly calls Microsoft Sonic Debian.

This Linux system is basically a special setup to control switches and network technicalities in their Azure cloud service. It is apparently not available to the public but it is used by Microsoft internally.

I’ve always pointed out the irony of a company not using the popular and capable operating system it developed itself. I think Microsoft choosing to not use a Microsoft operating system for its needs goes beyond mere irony. It’s more like the pope borrowing a rhetorical point from Richard Dawkins.

If Microsoft doesn’t automatically think that Windows is the right OS, you probably shouldn’t either.

To Whom It May Concern

2016-07-07 10:26

I’ve previously mentioned that I’m somewhat of a grammar enthusiast. I’m more of a language and communication enthusiast really but this does come with a bit of grammar pedantry, usually just enough to embarrass myself when I reread old things I’ve written.

For future reference, I wanted to clarify my thoughts about a particular relic of English grammar, the crusty old dative pronoun whom. I quite like this word and occasionally enjoy having some fun with it. I personally cannot help but mentally copy-edit text I hear (Bo Diddley - "Whom Do You Love?", AC/DC - "Who Made Whom?", Ghostbusters - "Whom you gonna call?", etc…)

But here’s the thing: as those examples illustrate, real English speakers are usually perfectly OK with not using whom "correctly". My personal rules are simple.

  1. If you definitely know what you’re doing and you want to use whom and it is 100% correct, use it.

  2. If you’re writing for a prestigious, widely read publication that has a well-established history of punctilious grammatical perfection, then learn the rules well enough to apply #1.

  3. If you’re using a set piece that most everyone knows, don’t go mucking that up by changing it. In other words, don’t use something like "to who it may concern" or "for who the bell tolls".

  4. If none of the previous rules apply, don’t worry about whom! Use who always. No one (you care about) will mind.

If you follow these rules you’ll not run afoul of what I do consider a serious English faux pas: using whom when it is not correct. It is always better to err on the side of not using it. To add a superfluous whom says, loudly, that you tried to be a fancy, sophisticated communicator of English and failed.

Here’s an example of the usually immaculate Economist violating this dictum.

"However, Brexiteers have jumped on anyone whom they think is exaggerating the economic impact of the decision to Leave."

This type of construction is very tricky and catches otherwise careful writers off guard. The mistake is easy to see if you remove the (almost parenthetical) "they think" clause.

"Brexiteers have jumped on anyone whom is exaggerating the impact."

The sentence could have clumsily been stuffed with a correct whom if it were rearranged into something like this: "However, Brexiteers have jumped on anyone whom they think of as exaggerating the economic impact of the decision to Leave." In that arrangement, it’s the object of a preposition, one of the surest signs that whom is correct (as in the examples of rule 3).

As I said, just be cool about it and if you take the trouble to learn and internalize the whom rules, consider personal grammar editing a fun little hobby that you can (usually!) keep to yourself. Otherwise treat it like quantum physics, i.e. assume that some people must know how it all works but you don’t have to.

[Image: WhoRescuedWho]

--------------------------

For older posts and RSS feed see the blog archives.
Chris X Edwards © 1999-2016