TheJach.com

Jach's personal blog

(Largely containing a mind-dump to myselves: past, present, and future)
Current favorite quote: "Supposedly smart people are weirdly ignorant of Bayes' Rule." William B Vogt, 2010

Notes from Probability Theory Chapter 2

Chapter 2, The Quantitative Rules, begins with a quote:
Probability theory is nothing but common sense reduced to calculation. --Laplace, 1819

In my quotes file I have it in the "original?" French: La théorie des probabilités n'est que le bon sens réduit au calcul. --Pierre Simon, Marquis de Laplace

This chapter is devoted to the mathematical deduction of the quantitative rules of inference that follow from the three desiderata of last time: 1) representation of degrees of plausibility by real numbers; 2) qualitative correspondence with common sense; 3) consistency.
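
For reference, and as a sketch of where the chapter is headed rather than its actual derivation, those desiderata end up forcing the familiar product and sum rules, here in standard notation:

    % The two rules Chapter 2 derives from the desiderata:
    p(AB \mid C) = p(A \mid BC)\, p(B \mid C) = p(B \mid AC)\, p(A \mid C)  % product rule
    p(A \mid B) + p(\bar{A} \mid B) = 1                                    % sum rule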

See Full Post and Comments

Rich fool buys something frivolous, therefore...

Tax the rich! An annoying motif of progressive and liberal news outlets lately: they'll find something they take issue with and always conclude with "tax the rich". What saves them from being as bad as conservatives is that they don't additionally say "you shouldn't be allowed to do it." Here's one example:

[embedded video]

They later had an update because apparently the house wasn't up to building codes and was ant/termite infested. They were still hammering the tax-the-rich point and didn't apologize, though. They were also very skeptical about the excuse, which is fine; it's good to be skeptical of people's proclaimed motives. Here's a more recent video.

See Full Post and Comments

Notes from Probability Theory Chapter 1 continued

Last time I covered some notes on the first section of the first chapter; today we'll go a little further.

1.2 - Analogies with physical theories

A quote that directly precedes this section, given here in its expanded form:
"A mathematician is a person who can find analogies between theorems; a better mathematician is one who can see analogies between proofs and the best mathematician can notice analogies between theories. One can imagine that the ultimate mathematician is one who can see analogies between analogies."
--Stefan Banach

See Full Post and Comments

Output

One of the interesting things about having a blog is that it lets me track some of my own written output over time. I have a super-secret (okay, you can access it if you guess the URL) page that displays my total word count to date, my average post length, and my two longest posts. When I want more data I'll just ssh into the server and run some MySQL queries. As of today, I've written 183,498 words over 3 years. Not very much, relatively. I'm still 816,502 words short of my goal of one million words.
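
For the curious, the queries are nothing fancy. Here's a sketch of the word-count one in Python, where the pymysql client, the posts table, and its body column are all my own stand-ins rather than this blog's actual setup:

    # Hypothetical schema: a posts table with a body text column.
    import pymysql

    conn = pymysql.connect(host="localhost", user="jach",
                           password="...", database="blog")
    with conn.cursor() as cur:
        # Approximate words per post as one more than the number of spaces.
        cur.execute(
            "SELECT SUM(LENGTH(body) - LENGTH(REPLACE(body, ' ', '')) + 1)"
            " FROM posts"
        )
        (total_words,) = cur.fetchone()
        print(total_words)
    conn.close()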

Why one million? Part of the reason for starting this blog was to reach one million words in an easily traceable way. My written output didn't start becoming significant until 2003 or so, so there are 6 years of "lost" output I'm not counting, but I figure I can write it off as statistical error in the end. So again, why one million? I believe that everyone's first million written words are crap. If you care about writing, you should try to get those million words out of you as fast as you can.

I saw this notion expressed recently, but for artists. Lauren Faust mentioned it here:
"Look for guidance and art education wherever you can, but the biggest, most important thing of all is to draw, draw, draw and never stop drawing. Imagine you have several thousand crappy drawings you have to get out of your system before your[sic] any good and try your damnedest to get those crappy drawings out as fast as you can."

See Full Post and Comments

An explanation and example of Naive Bayes

Here I embark on a slow, carefree explanation of Naive Bayes, but if you're just interested in code, or the single line of math Naive Bayes takes, then skip to the bottom. Naive Bayes is an algorithm for classifying things. It's probably most popular for its use in spam classification, but you can use it for pretty much anything else, and it's somewhat embarrassing how successful it's been all these years compared to anything else. Moving away from Naive Bayes to, say, a fully Bayesian system carries a large computational cost for frequently little benefit on the types of problems Naive Bayes is good at.
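
That single line of math is just the decision rule: label an item with whichever class c maximizes p(c) * prod_i p(w_i | c), where the w_i are the item's features (words, for spam) and the "naive" part is assuming they're independent given the class. Below is a minimal sketch of that rule in Python, my own illustration rather than this post's code, with add-one smoothing so an unseen word can't zero out a class:

    import math
    from collections import Counter, defaultdict

    class NaiveBayes:
        def __init__(self):
            self.doc_counts = Counter()              # documents seen per class
            self.word_counts = defaultdict(Counter)  # word frequencies per class
            self.vocab = set()

        def train(self, words, label):
            self.doc_counts[label] += 1
            self.word_counts[label].update(words)
            self.vocab.update(words)

        def classify(self, words):
            total_docs = sum(self.doc_counts.values())
            best_label, best_score = None, float("-inf")
            for label in self.doc_counts:
                # log p(c): the class prior
                score = math.log(self.doc_counts[label] / total_docs)
                denom = sum(self.word_counts[label].values()) + len(self.vocab)
                for word in words:
                    # log p(w|c) with add-one (Laplace) smoothing
                    score += math.log((self.word_counts[label][word] + 1) / denom)
                if score > best_score:
                    best_label, best_score = label, score
            return best_label

    nb = NaiveBayes()
    nb.train("buy cheap pills now".split(), "spam")
    nb.train("lunch meeting notes attached".split(), "ham")
    print(nb.classify("cheap pills here".split()))  # prints: spam

Working in log space avoids underflowing to zero when multiplying many small probabilities together.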

Allow me a brief biblical tangent. In Genesis, God created Adam, the first man. God saw it was not good for the man to be alone, so he decided to create a helper for Adam. At the same time, he had Adam name, i.e. classify, all the creatures brought before him:

And out of the ground the LORD God formed every beast of the field, and every fowl of the air; and brought them unto Adam to see what he would call them: and whatsoever Adam called every living creature, that was the name thereof.
And Adam gave names to all cattle, and to the fowl of the air, and to every beast of the field; but for Adam there was not found an help meet for him.
--Genesis 2:19-20 (KJV)


See Full Post and Comments