Archive for March 2009

Open Source Is The Pinnacle Of The Free Market

March 31, 2009

Though I am not going to advocate laissez-faire economics, I do want to point out that the open source world is as close as you can get to a pure free market. The reason is that if you make a product in the open source world, anybody is able to study it, modify it, redistribute it, and even sell it without many restrictions.

If person A delivers a great product, but person B is able to study the product and make it better in a cheaper way, the free market has done its job. This can happen in the open source world but has a hard time happening in the proprietary one.

Current Patent Laws Prevent True Market Freedom
Take the iPod. Steve Jobs admits the iPod has been patented through and through. If somebody has the ability to deliver the same product, just better and cheaper, his or her hands are tied.

We will never know whether Apple is the company that delivers the iPod at the greatest value for customers, since nobody is legally allowed to try.

Proves Red Hat Is The Best
This is not true of Red Hat. Red Hat makes billions, and if someone were able to take Red Hat and make it better and cheaper in a way that pleases customers, there is nothing stopping them.

Guess what: Oracle recently tried and failed. Despite the open possibility, nobody can deliver Red Hat software at as great a value as Red Hat can. This proves Red Hat is truly the best at what they do.

If You Really Provided Your Product At The Best Value You Wouldn’t Need Patent/Proprietary Protection
Look again at Red Hat. They don’t need such protection, and they still make billions. I have a feeling Apple and Microsoft would run scared stiff if you took their patents and proprietary licenses from them, for the risk of someone doing it better and cheaper would be extremely high.

Edit: When I said Red Hat makes billions, I meant they have made billions in total over the years.

Fingerprints Shouldn’t Be Used In Court

March 27, 2009

I’m sure everyone has seen some Law and Order or CSI type show where finding a suspect’s fingerprints at a crime scene seals the case.

However, it appears scientists, including members of the National Academy of Sciences, are questioning the validity of fingerprints, especially partial fingerprints that have been smudged. From the LA Times:

In 2007, a Maryland judge threw out fingerprint evidence in a death penalty case, calling it “a subjective, untested, unverifiable identification procedure that purports to be infallible.”

The ruling sided with the scientists, law professors and defense lawyers who for a decade had been noting the dearth of research into the reliability of fingerprinting. Their lonely crusade for sound science in the courtroom has often been ignored by the courts, but last month it was endorsed by the prestigious National Academy of Sciences.

The question is not whether fingerprints are unique — most scientists agree they probably are, though that assumption remains largely unstudied. The issue is whether the blurry partial prints often found at crime scenes — what Faulds called “smudges” — are sufficient to identify someone with any reliability.

The answer: No one knows. There are no national standards for declaring a fingerprint “match.” As a result, fingerprint identifications are largely subjective.

I will say that if someone has to go to jail for the rest of their life over evidence with no scientific credibility, that is a bad thing. I’m not going to be able to watch these crime shows now without thinking, “man, I wish I could be the defense attorney.”

Innovation Sparks Jealousy

March 26, 2009

(Full disclosure: I use, and contribute to, both Fedora and Ubuntu)

I’m seeing increased jealousy toward Fedora every day, especially when someone points out how well Fedora is innovating.

Case in point: take this article I just read, Ubuntu 9.04 vs Fedora 11: A lot can change in one month! The article concludes:

Ubuntu, as usual, has been rock stable for me…

But considering the differences – Fedora 11 seems to be a full 6 months ahead of Ubuntu….

Ubuntu sure has some catching up to do. When Ubuntu 9.10 releases, I can’t even begin to imagine how far ahead Fedora 12 will be!

Now look at the comments:
First, take Inconsiderate Clod:

Fedora is a (stupidly) aggressive development distro which regularly causes major malfunctions to all it’s rawhide users as well as it’s more ‘conservative’ users…  the world should be happy and thankful with all the Fedora users who unwittingly offer themselves up to be ginny pigs for the greater good of FOSS.

Ouch! I guess I am an unwitting Fedora guinea pig. Do I hear a little jealousy?

Or maybe RALF:

You sir, are an idiot.

Firefox 3.1 isn’t stable yet. OpenOffice 3.1 isn’t stable yet. Plymouth only works with Intel hardware, Ubuntu too will use gnome-media, Thunderbird 3 isn’t stable yet.

Double standard, RALF? Ubuntu releases their LTS release with Firefox Beta, and RALF is complaining that a Fedora Alpha release has some Beta software? Interesting. Not to mention the incorrect statement about Plymouth coming from someone calling another an idiot. I found this amusing.

The development of each Ubuntu version lasts 6 months. In those months, they lock down the version and keep fighting bugs until the deadline.

Because Fedora doesn’t try to fight bugs until its deadline?

That’s why people can actually _use_ Ubuntu. Fedora is more like ‘what’s next?’

Again, by Canonical’s versus Fedora’s own numbers, more people use Fedora than Ubuntu, so this “usability” argument is a little weak.

Personally, I think both Canonical and Fedora deserve praise, not attacks due to jealousy. Canonical has brought Linux to millions of users who arguably needed something like Ubuntu to get started. Likewise, Fedora’s innovation always keeps it a good 6 months ahead of the pack without the luxury of having an upstream distro do the majority of the heavy lifting. For these reasons we need to have more praise and less jealousy.

The Responsibility of Intellectuals

March 23, 2009

I don’t consider myself an intellectual, but perhaps for moral reasons I should.  First, I am interested in many intellectual things.  Second, I have a unique ability to do some intellectual things. (Otherwise, I would not have been successfully admitted to a PhD program in physics.) Third, I came across an interesting article named The Responsibility of Intellectuals by Noam Chomsky, which has got me thinking.

The article is an anti-war article, but that’s not what is on my mind.  What interests me is the larger image he seems to paint when you read it.

First, I’ll grossly paraphrase the article, then “liken it” 🙂 to my own situation.  He points out (and Albert Einstein said the same thing) that one major reason it is so easy for countries to go to war is that governments spread propaganda while intellectuals who know better, or should know better, remain silent due to political pressure, personal gain, or plain laziness.  He expresses his opinion that if intellectuals would stick up for the truth, there would be much less bloodshed.  Again, Einstein was emphatic about the same point, as I saw on a Science Channel documentary the other day.  Einstein saw intellectuals all over Germany fueling the government’s war fire even though they knew better.  It was all about reaping personal gain.

Now, as I already stated, this isn’t trying to be an anti-war post.  It is more to point out the obvious: atrocities of many sorts can happen if intellectuals remain silent for personal gain, political pressure, or because they are lazy. Intellectuals are in a unique position to bring many truths to the forefront that would prevent many tragedies.  Chomsky feels it is the responsibility of intellectuals to use their talents to enlighten the world so as  to prevent these atrocities.

I agree, and to what extent I can, I will do better.  (Especially in the “promoting good science” realm.)

Krugman

March 23, 2009

Krugman’s posts always worry me. For those who do not know him, he has won the Nobel Prize for economics, is one of the 50 most cited economists of all time, writes for the New York Times, and his blog is considered one of the most influential in the world.

Furthermore, he has traditionally been spot on when it comes to predicting the future of the economy. Take this talk he gave at Google in 2007 and decide for yourself how accurately he described our current state today. It is like he came back from the future.

Unfortunately, he is predicting even further financial catastrophe, since he feels the government isn’t being aggressive enough. He feels strongly that we need to nationalize the banks and that the government’s investment has to be even bigger to stop this beast. Read his current post about how he feels the current program won’t be enough. Here’s his recipe for success:

As economic historians can tell you, this is an old story, not that different from dozens of similar crises over the centuries. And there’s a time-honored procedure for dealing with the aftermath of widespread financial failure. It goes like this: the government secures confidence in the system by guaranteeing many (though not necessarily all) bank debts. At the same time, it takes temporary control of truly insolvent banks, in order to clean up their books.
That’s what Sweden did in the early 1990s. It’s also what we ourselves did after the savings and loan debacle of the Reagan years. And there’s no reason we can’t do the same thing now.

But the Obama administration, like the Bush administration, apparently wants an easier way out.

Now, I know some readers will jump on the anti-government-intervention/anti-nationalization ideology, blah, blah, blah, but here’s a fact: Krugman almost always gets the economy right, and that scares me. (That also means more to me than ideology.)

Finally Learned MPI

March 18, 2009

I have used several codes that use MPI, but I had never written one from scratch; I had only added to codes others had done the MPI work for. To be honest, I had a hard time reading the online documentation. However, a postdoc showed me a code (below) that opened my eyes to what is going on. I am posting it in case it could be of some help to others:

To compile, run:
mpicc testmpi.c

To run, type:
mpirun -np 4 ./a.out

The trick is to understand that each of the 4 processors runs the whole code but is assigned a rank. As you can see in the for loop and print statement, a conditional on rank lets each processor know which part of the work is “its” job. First you have to call MPI_Init and get a rank and a size (the size comes from the -np 4 passed above). The MPI_Barrier command pauses the program until all the processors have caught up with one another, and the MPI_Reduce command takes the values each processor has computed and feeds them back to the master processor. As you can see, I take each partial sum, combine them with MPI_SUM as MPI_DOUBLEs, and feed the result to the master processor (rank 0) in the variable Sum.

I then make the master processor print the total Sum.

#include <stdio.h>
#include "mpi.h"

int main(int argc, char** argv)
{
    int rank, size;
    long i;
    double sum = 0, Sum;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank sums its own chunk of the range [0, 1000000000).
       Use long arithmetic so rank*1000000000 cannot overflow an int
       (for rank 3, 3*1000000000 exceeds the 32-bit int maximum). */
    for (i = (long)rank*1000000000L/size; i < ((long)rank + 1)*1000000000L/size; i++) {
        sum += 1.0;
    }
    printf("%d gets sum = %g\n", rank, sum);

    /* Wait until every rank has finished its chunk. */
    MPI_Barrier(MPI_COMM_WORLD);

    /* Combine each rank's partial sum into Sum on the master (rank 0). */
    MPI_Reduce(&sum, &Sum, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Total sum %g!\n", Sum);

    MPI_Finalize();
    return 0;
}

Finally Picked Up My Master’s Degree

March 10, 2009

As some readers may know, I am a PhD candidate in physics at UC Irvine.  However, last September I was awarded a master’s degree in physics since I automatically qualified.  Today I finally went to the registrar’s office to pick it up, so now I have a shiny new diploma to hang on a wall. 🙂  In a few years I will hopefully have another.