Number Crunching

Let Me Guess — Your PIN is 1234, right?

As easy as 1234… Or 2580!

If a random sample of the population is reading this post, then more than 10% of you are saying “Why, yes. Yes it is!”

What about the other 90% of you?  Hmmm, let me see here — How about 1111? 0000? 1212? 7777?

Those next four numbers should account for another 10% of you, so those five account for one in every five (20%) of all four-digit passwords. That is, the numbers that protect your savings and checking accounts, among other things, from would-be interlopers.
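
For the back-of-the-envelope crowd, here is the arithmetic behind that claim as a quick sketch. The per-PIN shares below are approximate, rounded figures from the Data Genetics analysis, quoted from memory, so treat them as illustrative rather than gospel:

# Approximate share of all four-digit passwords held by the five most
# common PINs (rounded figures attributed to the Data Genetics analysis;
# illustrative only).
top_five = {
    "1234": 10.7,   # roughly one in ten of all passwords
    "1111": 6.0,
    "0000": 1.9,
    "1212": 1.2,
    "7777": 0.7,
}

total_share = sum(top_five.values())
print(f"The top five PINs cover about {total_share:.1f}% of all passwords")
# Prints roughly 20%, i.e. one in every five.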

Jeepers.

It turns out that only 426 codes account for more than half of all passwords. If passwords were distributed randomly, it would take 5,000, of course, since there are 10,000 possible four-digit combinations. So someone who nicks your debit card has a pretty good chance of cracking your code just by going through a list of “easy to remember” numbers.
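
If you ever get your hands on a similar dump and want to check that kind of claim yourself, the computation is simple: tally the codes, sort them by frequency, and count how many it takes to cover half of all occurrences. Here is a minimal Python sketch, assuming a plain-text file with one four-digit code per line (the filename pins.txt is just a placeholder):

from collections import Counter

# Hypothetical input: one four-digit code per line, e.g. from a breached database.
with open("pins.txt") as f:
    pins = [line.strip() for line in f if line.strip()]

counts = Counter(pins)
total = sum(counts.values())

# Walk down the list from most to least common until half the dataset is covered.
covered = 0
codes_needed = 0
for pin, count in counts.most_common():
    covered += count
    codes_needed += 1
    if covered >= total / 2:
        break

print(f"{codes_needed} codes cover 50% of the {total} passwords")
print("A uniform distribution would need 5,000 of the 10,000 possible codes.")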

How do I know all of this?

The crew over at the Data Genetics blog got a hold of 3.4 million passwords from breached databases, and took a look at the frequency of various numbers in a mind-boggling array of ways. Very cool post.  And, for most of you, probably time to change your password.

For those 0.000744% of you with 8068 — the least common password — it’s probably time to change your password, too.  Once people see that it is the least common, they will pick it and it won’t be the least common any more. Oh, 2727.

2580 comes in at #22.  Can you see why?

More on Teacher Performance

We recently posted a piece on the controversy surrounding the publication of teacher performance evaluations in Los Angeles.  There are a couple of interesting follow-ups circulating in our trusted news sources.  The first piece is from the always contrarian and sometimes cantankerous Jack Shafer of Slate.com.

Nobody but a schoolteacher or a union acolyte could criticize the Los Angeles Times’ terrific package of stories—complete with searchable database—about teacher performance in the Los Angeles Unified School District.

You probably don’t need to be a communications major to figure out where that piece is headed.

Shafer rightly applauds the LA Times FAQ page on what value-added analysis is, its strengths and weaknesses, and other, well, FAQs.

A second piece plucked from the New York Times compares the pros and cons of value-added analysis in a more straightforward fashion.  On the one hand:

“If these teachers were measured in a different year, or a different model were used, the rankings might bounce around quite a bit,” said Edward Haertel, a Stanford professor who was a co-author of the report. “People are going to treat these scores as if they were reflections on the effectiveness of the teachers without any appreciation of how unstable they are.”

On the other hand:

William L. Sanders, a senior research manager for a North Carolina company, SAS, that does value-added estimates for districts in North Carolina, Tennessee and other states, said that “if you use rigorous, robust methods and surround them with safeguards, you can reliably distinguish highly effective teachers from average teachers and from ineffective teachers.”

Certainly, the two sides each have a point.  Even as a man of numbers (or perhaps, especially as one), I worry that people put too much faith in a quantitative rating.  That said, it seems the cat has squirmed its way out of the bag on this one, and it is going to be difficult for opponents to get it back in.