At Slate.com (here), Daniel Engber has a terrific piece on the hackneyed phrase "correlation does not imply causation." He examines the origins of the phrase, its ever-increasing popularity among pop-statisticians, and its real limitations as a device for debunking non-parsimonious conclusions.
In many cases, those who casually respond to an analysis with "correlation does not imply causation," as if that ends all argument, are being presumptuous. In science (including social science) all knowledge is probabilistic; scientific theory (including claims of causation) is always underdetermined by the empirical evidence. It is possible, but extremely unlikely, that cigarette smoking does not substantially increase the risk of lung cancer and other diseases. After all, "correlation does not imply causation." The fact is, however, that the empirical evidence in that case, and many others, is highly suggestive of causation.

Even if empirical evidence can never establish causation with 100% certainty, it remains important because it provides a basis for potential or probabilistic causation. And the stronger the fit between evidence and theory (the greater the "confidence level"), the higher the probability of a cause-and-effect relation, especially as other potential causes are either ruled out or shown by the evidence to be much less probable.
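The kernel of truth in the phrase is easy to demonstrate with a small simulation (a hypothetical sketch of my own, not from Engber's piece): two variables that share a hidden common cause will correlate strongly even though neither causes the other. That is exactly the scenario the slogan warns about, and it is also why ruling out such confounders raises our confidence in a causal reading.

```python
import random

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(42)
# z is a hidden confounder; x and y each depend on z but not on each other
z = [random.gauss(0, 1) for _ in range(10_000)]
x = [zi + random.gauss(0, 0.5) for zi in z]
y = [zi + random.gauss(0, 0.5) for zi in z]

r = pearson(x, y)
print(f"r = {r:.2f}")  # strong correlation despite no causal link from x to y
```

With these (made-up) noise levels the sample correlation comes out around 0.8, which a naive reader might take as strong evidence that x drives y. The skeptic's slogan is right that it isn't, on its own; the point of the paragraph above is that once the plausible confounders are measured and controlled for, the same correlation becomes genuinely probative.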