Wednesday, October 29, 2014

Thoughts on Success/Failure

Consider a continuum of things that you can do, ordered by difficulty. You can get three kinds of results:
1. For some part of it, you will always succeed,
2. For some part of it, it's a mix of success and failure, and
3. For some even more difficult part, you will always fail.
Now, to succeed at something consistently, you need to have mastered it, which means the things you always succeed at are things you have already fully mastered. Any knowledge or skills left to gain in those areas amount to minuscule improvements at best.
Then there's the part where you succeed sometimes and fail sometimes. But each time you fail, you gain more data on how you failed, and you think about how you can fix those points. You're learning, and you're improving.
Now there are also tasks at which you'll consistently fail. Passing the bar exam as a programmer with no preparation would be a good example. Since you have no successes to compare against in that area, you wouldn't learn much retaking the bar exam, no matter how often you did it. You lack the required knowledge and skills, and trying to perform at that level doesn't do you any good.
Now, the ratio of failures to successes is more of a rule of thumb than an ironclad model. The reason the common adage is "If you're failing, you're doing something right", at least so I believe, is that "If you're failing some and succeeding some, you're closer to your optimum failure/success ratio for growth than if you're at either extreme" simply doesn't roll off the tongue as nicely. You can't put that on the cover of a self-help book. And since almost everyone is inclined to default to the "success" side of the continuum (colloquially known as your comfort zone) instead of the "failure" side, it makes a lot more sense to tell someone to fail more often. It's not a curse; it's a call for more ambitious projects. In this specific case, it's a comforting acknowledgment that they're trying hard enough to fail at something. And I think that is commendable.

Risk of failure should be weighed against the consequences of failure.
If I'm writing code for robots as a hobby and my robots behave exactly as I intended all of the time, then I'm probably not learning anything, and I should try to make the robots do more sophisticated tasks. The consequences of failure are minimal, so the optimum failure rate is high.
If I'm at work writing avionics code, the cost of failure is astronomical. It's nice to push boundaries and learn things, but it's better to avoid plane crashes. The consequences of failure are high, so the optimum failure rate is low.

Friday, October 03, 2014

Nice quote


    不闻不若闻之,      Not having heard is not as good as having heard,
    闻之不若见之,      having heard is not as good as having seen,
    见之不若知之,      having seen is not as good as mentally knowing,
    知之不若行之;      mentally knowing is not as good as putting into action;
    学至于行之而止矣    true learning is complete only when action has been put forth
           -- 荀子                                    -- Xunzi

Sunday, September 07, 2014

Humor

https://www.facebook.com/video.php?v=1485192401724091&fref=nf

Wednesday, August 27, 2014

Reasons why Math/Statistics is losing to Computer Science

https://news.ycombinator.com/item?id=8228978

From one of the comments:

Here is a math major's perspective (I also have a C.S. master's from Stanford, a decent school for computer science).
The real reason statistics is losing to machine learning is M&M: money and marketing.
Money is a huge factor when kids choose majors in college. Many of them have student loans to pay off, and for many of them, getting a high-paying job after college is a serious consideration. In this regard, statistics is a great major, but computer science is flat-out ridiculous. When big tech companies are offering a Stanford graduate with only a few summers of programming internships under their belt $150k/year, naturally a lot of kids are lured into computer science.
Then, once they start studying computer science, they discover this thing called machine learning. While I do think there is a difference in emphasis between stats and machine learning, the fundamentals are the same, except the nomenclature sounds much cooler in machine learning. Nonparametric inference sounds esoteric and cryptic, but unsupervised learning sounds futuristic and cool.
The biggest problem with both statistics and machine learning education, I would say, is their lack of emphasis on mathematical foundations. When I was in college, CS229 (Introduction to Machine Learning) was touted as the hardest class at Stanford. Having helped my friends wade through CS229 problems (a lot of which come down to wading through linear algebra and multivariable differential calculus), I do not think this is remotely true: CS229 is hard because it attracts students who do not have the requisite mathematical maturity to learn statistics/machine learning in a serious way (I am not even talking about the real-analytic foundations of probability and calculus, but rudimentary linear algebra and the chain rule).
Also, I do think CS229@Stanford is a great class that brings theory and practice together =)
As for R/Python/Matlab/etc., I will let the zealots argue what's best. To me, they are like statistics and machine learning: similar with different emphasis.

A reply from another commenter (1971genocide):

I agree with your points so much.
I am currently an applied maths student, and when looking for internships I noticed 80% of them are for CS-related work. Imagine being me (19 years old), offered a 15K/year job in my first year of university. My best job up until then was cleaning floors at a mall. I have studied an insane amount of mathematics all my life (proofs after proofs); how much longer do I need to wait until I am able to do something of value for society?
Compare that to the amazing 14-week internship I did at a startup churning out frontend JavaScript. Those 14 weeks were the best weeks of my life; I never felt more confident or more valued. My depression vanished overnight, I found a new aim in life, and I treated coding the same way nations treat nuclear weapons.
I learnt more things of value in those 14 weeks than I did in the previous 19 years. I didn't need to set an alarm clock to wake up in the morning. There was no hand-holding, because I was getting paid. I spent the rest of my waking hours churning out code with no need for external motivation.
I got really angry at society for misleading me for such a long time, given how easy it was for me to finally do something useful. I see my friends struggling with the same problems and feel that so much of young people's time is wasted; in my mind it's a crime.
I think, as always, mathematicians fail to grasp the human side of the argument. Why are they complaining about a lack of interest in statistics? It's their own fault this problem exists. There is no Neil deGrasse Tyson for statistics, there is no Mark Zuckerberg for stats, and you don't need to hire a PR department. Just show us the wonderful magic that you speak of, and we are smart enough to figure out why we should dedicate our lives to your cause.
This is why, if some statistical axiom or theory doesn't have an applied CS component, I become really skeptical of taking that course.

Thursday, August 07, 2014

Timeless stuff

http://www.kalzumeus.com/2011/10/28/dont-call-yourself-a-programmer/

Focus on the company's bottom line and customer satisfaction. That is your job, not making pretty code.
To get a [good] job, you have to quite literally get your foot in the door: you must be connected to some person who works at that company.
The amount of money you get is based on the size of the company and your perceived worth. Ask for more than you think you're worth.
Make yourself look good. Talk up your cool projects, wax philosophical about your ideal development methodology, then hammer home how you're focused on the customer, efficiency, and the business's bottom line.
Focus on the key elements in any situation, don't get distracted by minutiae, be assertive. Your motto is "restrained, confident professionalism."
People are "hilariously easy to hack" in the way you present yourself, so use all the hacks available. Be the James Bond of CRUD apps.
Things not mentioned in the post:
Startups will hire you for positions you might not be qualified for and pay you peanuts for it, but it'll create experience/value/resume filler.
Academic achievement is only significant to people who prize intelligence over productivity. Just know how to get the work they need done and show them that.
Networking in person isn't necessary. Just build relationships.
If you're new, add [real] stuff to your resume to make it seem like you have relevant business experience, and be able to back up whatever you put down.
Don't "work your way up" through some corporate ladder. Just apply for the job you want. At some point you'll get it.

Wednesday, July 09, 2014

How to be a great software developer

http://peternixey.com/post/83510597580/how-to-be-a-great-software-developer

Summary:
Start with something ugly but functional, then apply and reapply yourself to that ugly and misshapen solution, refactoring it back into its simplest form. Simplicity comes far more reliably from work than from brilliance. It comes more predictably from code written than from thought expended. It comes from effort.

Name your functions and variables well

....
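
A tiny Java sketch of the two points above (the scenario and the names are invented for illustration): the first method works but hides its intent; the refactored version does the same thing with a simpler shape and names that explain themselves.

import java.util.Arrays;
import java.util.List;

public class RefactorSketch {

    // First pass: functional, but the name and the shape obscure what it does.
    static double c(List<Double> l) {
        double t = 0;
        for (int i = 0; i < l.size(); i++) { t = t + l.get(i); }
        return l.size() == 0 ? 0 : t / l.size();
    }

    // After a few more passes: same behavior, clearer names, simpler form.
    static double averagePrice(List<Double> prices) {
        if (prices.isEmpty()) {
            return 0;
        }
        double total = 0;
        for (double price : prices) {
            total += price;
        }
        return total / prices.size();
    }

    public static void main(String[] args) {
        List<Double> prices = Arrays.asList(3.0, 4.0, 5.0);
        System.out.println(c(prices));            // 4.0
        System.out.println(averagePrice(prices)); // 4.0
    }
}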

Tuesday, May 13, 2014

Coding Style

Legibility and long-term readability are the dominant considerations in any coding style. This is really subjective, but saving on the characters you have to type only to make the code a lot harder to read seems penny-wise, pound-foolish.
It's also a lot easier to review code when there are no hidden features (e.g., operator overloading) that get in the way of comprehension.
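
A small, made-up Java example of the trade-off: both methods find the largest element of a non-empty array, but the first saves a few keystrokes at the cost of making every future reader decode it.

public class ReadabilityDemo {

    // Terse: fewer characters to type, more work for the reader.
    static int f(int[] a) { int r = a[0]; for (int x : a) r = x > r ? x : r; return r; }

    // Readable: a few more characters, and the intent is obvious at a glance.
    static int largestOf(int[] values) {
        int largest = values[0];
        for (int value : values) {
            if (value > largest) {
                largest = value;
            }
        }
        return largest;
    }

    public static void main(String[] args) {
        int[] sample = {3, 9, 2, 7};
        System.out.println(f(sample));          // 9
        System.out.println(largestOf(sample));  // 9
    }
}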

Wednesday, March 19, 2014

Java: Pass By Value notes

The Java spec says that everything in Java is pass-by-value. There is no such thing as "pass-by-reference" in Java.

The variable declared as
Dog myDog;
is not a Dog; it's actually a pointer (a reference) to a Dog.

What that means, is when you have

Dog myDog = new Dog("Rover");
foo(myDog);
you're essentially passing the address of the created Dog object to the foo method.

Suppose the Dog object resides at memory address 42. This means we pass 42 to the method.

If the foo method were defined as

public void foo(Dog someDog) {
    someDog.setName("Max");     // AAA
    someDog = new Dog("Fifi");  // BBB
    someDog.setName("Rowlf");   // CCC
}

then the parameter someDog is set to the value 42.

At line "AAA"
someDog is followed to the Dog it points to (the Dog object at address 42), and that Dog (the one at address 42) is asked to change its name to Max.
At line "BBB"
a new Dog is created. Let's say it lives at address 74; we assign the parameter someDog the value 74.
At line "CCC"
someDog is followed to the Dog it points to (the Dog object at address 74), and that Dog (the one at address 74) is asked to change its name to Rowlf. Then we return.

What happens outside the method:

Did myDog change?

Keeping in mind that myDog is a pointer, and not an actual Dog, the answer is NO. myDog still has the value 42; it's still pointing to the original Dog (whose name is now "Max").

It's perfectly valid to follow an address and change what's at the end of it; that does not change the variable, however.
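
Putting the whole example into one small runnable program makes this concrete (the Dog class below is a minimal stand-in, since its definition never appears above):

class Dog {
    private String name;
    Dog(String name) { this.name = name; }
    void setName(String name) { this.name = name; }
    String getName() { return name; }
}

public class PassByValueDemo {
    static void foo(Dog someDog) {
        someDog.setName("Max");     // AAA: renames the Dog that the caller's variable also points to
        someDog = new Dog("Fifi");  // BBB: rebinds only the local copy of the reference
        someDog.setName("Rowlf");   // CCC: renames the new Dog, which the caller never sees
    }

    public static void main(String[] args) {
        Dog myDog = new Dog("Rover");
        foo(myDog);
        System.out.println(myDog.getName()); // prints "Max" -- myDog still points to the original Dog
    }
}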

Monday, March 10, 2014

Who is a developer?

A developer. Not a C programmer, not a Rubyist, nor any other subset of software development. Know how to program, know how to solve problems, and know how to learn new languages, libraries, and frameworks. Sticking to your guns in an industry as ever-changing, for better or worse, as software engineering is a bad idea.

Wednesday, January 22, 2014

Credit Card Chargeback vs. Disputing a Charge

In order to get a chargeback for a charge that you did not authorize, you have to tell your credit card company that you specifically want a chargeback. Do not tell them that you are disputing the charge. If you do not use the term "chargeback" they will merely contact the vendor who will then advise that the charge is valid. Under Visa/MC rules, if you ask for a chargeback using that language they have to give it to you.

Tuesday, January 07, 2014

Human beings capability

A line from one of Robert A. Heinlein's books:
“A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects.”