About Rationally Speaking
Rationally Speaking is a blog maintained by Prof. Massimo Pigliucci, a philosopher at the City University of New York. The blog reflects the Enlightenment figure Marquis de Condorcet's idea of what a public intellectual (yes, we know, that's such a bad word) ought to be: someone who devotes himself to "the tracking down of prejudices in the hiding places where priests, the schools, the government, and all long-established institutions had gathered and protected them." You're welcome. Please note that the contents of this blog can be reprinted under the standard Creative Commons license.
Thursday, May 31, 2012
Book Review: Too Big to Know
by Greg Linster
How do you know what you think you know? What counts as knowledge and what doesn’t? These questions speak to a deep semantic problem, i.e., trying to define what ‘knowledge’ is. Studying the nature of knowledge falls within the domain of a branch of philosophy called epistemology, which is largely the subject matter of David Weinberger’s book Too Big to Know.
According to Weinberger, most of us tend to think that there are certain individuals — called experts — who are knowledgeable about a given topic and actually possess knowledge of it. Their knowledge and expertise are thought to derive from their ability to correctly interpret facts, often through some theoretical lens. Today, like facts, experts too have become ubiquitous. It seems we are drowning in a world with too many experts and too many facts, or at least unable to pick out the true experts and the important facts.
Most of us are appalled, for instance, when we hear the facts about how many people are living in poverty in the United States. However, these facts can be misleading, and most people don’t have enough time to think critically about the facts hurled at them every day. There may indeed be “X” number of people living in poverty in the United States, but did you know that someone with a net worth north of one million dollars can technically be living in poverty? How the government defines poverty is very different from the connotation many of us attach to the word. Income is the sole factor used to determine whether someone is “living in poverty,” but this bit of information seldom accompanies the facts about how many people are “living in poverty.”
I recently posed a question on Facebook asking my subscribers whether a fact could be false. To my surprise, there was much disagreement over this seemingly simple question. Weinberger reminds us that facts were once thought to be the antidote to disagreement, yet the more facts are available to us, the more disagreements we seem to have, even if those disagreements are meta-factual.
It’s unquestionable that today’s digitally literate class of people have more facts at their fingertips than they know what to do with. Is this, however, leading us any closer to Truth? Well, not necessarily. This is because not all facts are created equal, and not all facts are necessarily true. Facts are statements about objective reality that we believe are true. However, while a fact can be false, truth is such regardless of our interpretation of it — we can know facts, but we can’t necessarily know Truth.
In the book, Weinberger draws an important distinction between classic facts and networked facts. The late U.S. Senator Daniel Patrick Moynihan famously said: “Everyone is entitled to his own opinions, but not to his own facts.” What he meant by that was that facts (what Weinberger calls classic facts) were thought to give us a way of settling our disagreements. Networked facts, however, open up into a network of disagreement depending on the context in which they are interpreted. “We have more facts than ever before,” writes Weinberger, “so we can see more convincingly than ever before that facts are not doing the job we hired them for.” This seems to be true even amongst people who use a similar framework and methodology for arriving at their beliefs (e.g., scientists).
One of Weinberger’s central arguments is that the Digital Revolution has allowed us to create a new understanding of what knowledge is and where it resides. Essentially, he claims that the age of experts is over, the facts are no longer the facts (in the classical sense), and knowledge actually resides in our networks. While this is an interesting idea, I’m not sure it’s entirely true.
Knowledge is a strange thing since it depends on the human mind in order to exist. I have a stack of books sitting on my desk, but I don’t point to them and say there is a stack of knowledge sitting on my desk. I simply don’t yet know if there is any knowledge to be gleaned from those books. For this reason, I don’t think knowledge can exist on networks either. Knowledge requires human cognition in order to exist, which means that it only exists in experience, thus giving it this strange ephemeral characteristic. I cannot unload my knowledge and store it anywhere, then retrieve it at a later date. It simply ceases to exist outside of my ability to cognize it.
Knowledge, Weinberger argues, exists in the networks we create, free of cultural and theoretical interpretations. It seems that he is expanding on an idea from Marshall McLuhan, who famously said, “The medium is the message.” Is it possible, then, that knowledge is the medium? The way I interpret his argument, Weinberger seems to be claiming that the medium also shapes what counts as knowledge. Or, as he himself puts it, “transform the medium by which we develop, preserve, and communicate knowledge, and we transform knowledge.” This definition of knowledge is, however, problematic if one agrees that knowledge can only exist in the mind of a human (or comparable) being. To imply that a unified body of knowledge exists “out there” in some objective way and that human cognition isn’t necessary for it to exist undermines any value the term has historically had. Ultimately, I don’t agree with Weinberger’s McLuhanesque interpretation that knowledge has this protean characteristic.
In a recent essay in The Atlantic, Nicholas Carr posed the question: “Is Google Making Us Stupid?” His inquiry spawned a flurry of questions about our intelligence and the Net. Although Weinberger has high hopes for what the Net can do for us, he isn’t overly optimistic either. In fact, he claims that it’s “incontestable that this is a great time to be stupid” too. The debate over whether the Internet makes us smarter or dumber seems silly to me, though. I cannot help but conclude that it makes some people smarter and some people dumber — it all depends on how it is used. Most of us (myself included) naturally like to congregate in our digital echo chambers and rant about things we think we know (I suspect this is why my provocative “Who Wants to Maintain Clocks?” essay stirred up some controversy — most RS readers don’t usually hear these things in their echo chambers).
Weinberger also argues that having too much information isn’t a problem, but actually a good thing. Again, I disagree. In support of this claim, he piggybacks off of Clay Shirky, who tells us that the ills of information overload are simply filtering problems. I, however, don’t see filtering as a panacea because filtering still requires the valuable commodity of time. At some point, we have to spend more time filtering than we do learning. An aphorism by Nassim Taleb comes to mind: “To bankrupt a fool, give him information.”
Overall, Weinberger does a nice job of discussing the nature of knowledge in the Digital Age, even though I disagree with one of his main points that knowledge exists in a new networked milieu. The book is excellent in the sense that it encourages us to think deeply about the messy nature of epistemology — yes, that’s an opinion and not a fact!
Labels: book review, David Weinberger, epistemology, internet, knowledge
Greg,
Re: “This is because not all facts are created equal, and not all facts are necessarily true. Facts are statements about objective reality that we believe are true. However, while a fact can be false, truth is such regardless of our interpretation of it — we can know facts, but we can’t necessarily know Truth.”
Truth, as commonly conceived, is a relationship between language and an extra-linguistic reality. Thus, 'Belfast is in Northern Ireland' is true because of certain objective social and geographical arrangements that exist in the British Isles.
Given this, we can crudely identify two characteristics of truth: truth-bearers and truth-makers. Certain linguistic entities (e.g. propositions) are truth-bearers- that is, propositions are either true or false (sometimes both & sometimes neither). Truth-makers, naturally, are those extra-linguistic things that make truth-bearers either true or false. So, e.g., we can identify facts, states of affairs, or physical arrangements as truth-makers.
In this light, the assertion that facts can be false is nonsensical: Facts are not truth-bearers, but rather truth-makers.
Lastly, I have no idea what 'Truth' is such that it can or cannot be known, considering that to know something is to have (at least) a justified true belief.
"Certain linguistic entities (e.g. propositions) are truth-bearers- that is, propositions are either true or false (sometimes both & sometimes neither)."
Have you ever heard of degrees of probability? Otherwise you've been rather badly describing truth as tautological.
Probabilities are not truth-functional, which is to say nothing more than that probability assignments do not assign degrees of truth to propositions; rather, they assign likelihoods of propositions being true (or events occurring).
If we want to talk of degrees of truth, we have to turn to fuzzy logics.
I wonder how my description of truth is tautologous? Some proposition (or some other preferred linguistic entity) is true if and only if some truth-maker (facts, etc.) makes it true. So, e.g., 'Belfast is in Northern Ireland' is true because Belfast is in Northern Ireland. This is neither tautologous nor circular since 'Belfast is in Northern Ireland' identifies a proposition and Belfast is in Northern Ireland identifies a certain sociological and geographical state of affairs.
'Belfast is in Northern Ireland' is true because Belfast is in Northern Ireland.
Sorry, but that's a perfect example of tautology:
• a phrase or expression in which the same thing is said twice in different words.
• Logic a statement that is true by necessity or by virtue of its logical form.
And as to degrees of truth being a matter of fuzzy logic rather than probability: fuzzy logic is itself a form of many-valued logic or probabilistic logic.
Roy,
Re: Fuzzy logic and probabilities.
Allow me to add this clarification: In simple terms, fuzzy logics are concerned with set inclusion, and the assignment of real numbers in [0,1], inclusive, to the degree to which an object can be included within a set (1 being assigned if an object is entirely included within a set and 0 assigned if an object is entirely excluded from a set). Probabilities are about the long-term proportion with which an event will occur in situations with short-term uncertainty (and/or with degrees of belief if one is a subjectivist).
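To make that contrast concrete, here is a small sketch in code (illustrative only; the 'tall' membership function and its cut-off numbers are invented for the example, not taken from the thread or the book): a membership grade measures how much an object belongs to a fuzzy set, whereas a probability measures how likely an event is.

```python
# Illustration only: the 'tall' membership function and its cut-offs are made up.

def tall_membership(height_m: float) -> float:
    """Degree (in [0, 1]) to which someone of this height belongs to the fuzzy set 'tall'."""
    if height_m <= 1.60:
        return 0.0                   # entirely excluded from the set
    if height_m >= 1.90:
        return 1.0                   # entirely included in the set
    return (height_m - 1.60) / 0.30  # partial membership in between

print(round(tall_membership(1.75), 2))  # 0.5: half-way a member of 'tall'

# A probability of 0.5 for 'heads' says something different: each toss is
# entirely heads or entirely tails, and 0.5 is the long-run proportion of
# heads (or a degree of belief, on a subjectivist reading).
```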
Roy,
Re: Fuzzy logics
My doctoral research is in monoidal t-norm based fuzzy logics (in particular concerning issues relating to decidability). Probability theory and fuzzy logics are not equivalent. Though fuzzy truth values and probability assignments extend over real numbers in [0,1], the former are truth-functional; that is, the value of a logical compound (connecting propositional parameters, or variables, via logical connectives) is determined by the values of its parts (the truth values of the variables), whilst probabilities are not. So, e.g., given a coin, let H be 'heads' and T be 'tails'. If P(H) is the probability of H, P(H&H) = P(H) = .5, but P(H&T) = 0, even though P(H) = P(T).
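To restate that coin example in code, here is a rough sketch (illustrative only; the function name and the coin numbers simply mirror the example above): a fuzzy conjunction such as the minimum t-norm is truth-functional, i.e. fixed by the component values alone, whereas the probability of a conjunction also depends on how the events are related.

```python
# Illustration only: contrasting a truth-functional fuzzy conjunction with probability.

def fuzzy_and(a: float, b: float) -> float:
    """Gödel (minimum) t-norm: the value of A & B is fixed by the values of A and B."""
    return min(a, b)

# Fuzzy truth values: equal inputs always yield the same conjunction value.
print(fuzzy_and(0.5, 0.5))  # 0.5, whatever propositions the two 0.5s attach to

# Probability: one fair coin toss, with P(H) = P(T) = 0.5, yet
# P(H & H) = P(H) = 0.5 while P(H & T) = 0 (a toss cannot be both).
# The marginal values alone do not determine the value of the conjunction.
p_H, p_T = 0.5, 0.5
p_H_and_H = p_H   # 'H & H' is just 'H'
p_H_and_T = 0.0   # impossible on a single toss
print(p_H_and_H, p_H_and_T)  # 0.5 0.0: same inputs (0.5, 0.5), different conjunctions
```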
Re: Truth
When logicians and philosophers of language discuss truth we often employ the distinction between an object language and a meta-language. An object language is, roughly, a language with a well-defined vocabulary and grammar, such as English. The sentences in the object language normally pick out certain extra-linguistic things, such as physical arrangements or facts (e.g. blue jays, the color blue) and relations between these things (e.g. blue jays are blue), whilst the meta-language is a formal language distinct from the object language that is used to talk about the object language (e.g. a meta-linguistic sentence would be 'Blue jays are blue').
In nuce, a sentence surrounded by ' ' refers to the object language sentence, whilst the object language sentence refers to the physical (or non-physical) things it is about.
Eamon,
If your initial purpose in dissecting this post in terms of the (allegedly) finer points of academic logic was to clarify any part of the discourse, you failed. Not only as to the tautologies involved, but in your lack of understanding that probability is the closest we can ever get to truth.
You wrote: "If we want to talk of degrees of truth, we have to turn to fuzzy logics."
No, we don't "have" to do that at all, since in the end we are still dealing with degrees of probability.
Definition:
probability |ˌpräbəˈbilətē|
noun ( pl. -ties)
the extent to which something is probable; the likelihood of something happening or being the case : the rain will make the probability of their arrival even greater.
• a probable event : for a time, revolution was a strong probability.
• the most probable thing : the probability is that it will be phased in over a number of years.
• Mathematics the extent to which an event is likely to occur, measured by the ratio of the favorable cases to the whole number of cases possible : the area under the curve represents probability | a probability of 0.5.
Your definition: "Probabilities are about the long-term proportion with which an event will occur in situations with short-term uncertainty (and/or with degrees of belief if one is a subjectivist)."
Mathematical probabilities are not equivalent to intuitively predictive forms. We don't think mathematically. (Although you, exceptionally, may think that you do.)
Read this: Interpretations of Probability
http://plato.stanford.edu/entries/probability-interpret/
I have decided not to read your review out of fear it might overload my brain. But thanks for the RCS headline.
ReplyDeleteRoy,
Re: my "lack of understanding that probability is the closest we can ever get to truth."
I discussed how logicians et al. formally understand & analyze the concept of truth. I made no mention -- at all, in no way, shape or form -- of having or not having indubitable or apodeictic knowledge. (FYI: I do believe we can have certain knowledge: e.g., I am certain that, if A, then B, and given A, then B.)
So, Roy, what does it mean for some sentence to be true?
Again, your 'certain knowledge' is mathematically tautological, i.e., true by necessity or by virtue of its logical form.
There is otherwise no certain knowledge of the truth. The best degree of reliability that suits our particular purposes is the most that we can probabilistically hope for.
I agree with everything you just wrote.
And yet as I write this I know with an absolute certainty that I am not yet definably dead.
DeleteI could summarize this in one sentence:
"facts are facts regardless of how you feel about them".
As to the nonsense at the end of the article about the sheer volume of knowledge and filtering issues, I definitely disagree with the author. There IS no such thing as too much knowledge. The proper amount of knowledge to determine what is factual is however much it takes. Thus filtering. Thinking is not done for you. You must do it yourself. Rational people will recognize actual facts given the same data.
This is not to say that everything can be broken out into a series of facts. Actual science (the repeatable, quantifiable kind) lends itself to summarization into a bunch of facts. These facts will be the same for all investigators. Non-science won't work like that. If the "facts" are personally interpretable in an area then they aren't actual facts. There is nothing wrong with this. Some things are supposed to be personal.
More interesting in my opinion is the confluence of a wider availability of information coupled with a more concerted effort to mislead than we have ever experienced as a species.
I begin to think that one is necessarily driven from the other.
The ability to transfer and share massive amounts of data should be expanding our understanding, as we can more easily distribute both the process for arriving at a conclusion, as well as the underlying data, than ever before.
We should be realizing the same sort of benefit that businesses did in the last 20 years in terms of productivity and efficiency. There was a component of increased communication from the use of networks, but it was coupled with increased accountability, in that there was a heightened responsibility to share the data supporting your recommendation or conclusion, along with a reviewable record of the event.
The net effect was an ongoing process of loose peer review that weeded out those who weren't using data effectively, or weren't reliable in either their selection or interpretation of that data.
Perhaps that process is working its way out in some tectonic fashion as we speak. Were Time, Newsweek, CBS News, CNN and the NY Times as consistently flawed in their process as they appear now? Or have they doubled down on their efforts in recent years as information has become more accessible?
Would the problems with climate data and the flawed processes driving climate research have gone undiscovered given the small number of gatekeepers in a pre-digital era?
The genie is out of the bottle. We can start with the assumption that there is an over-abundance of data and that finding fact is merely a function of filtering, but it is much more likely that we are also dealing with the deliberate effort to promote the idea that we are incapable of individually discerning fact, just as so much data has become available.
Knowledge is the product of an ongoing process of refinement: from observations to hypotheses, by any reasoning (e.g. probability), and back to observations by strict deduction, to test the hypotheses. Knowledge is never perfect or true, but the process towards it might be (near enough).
I read that Lakatos considered Darwinism non-scientific as it does not predict. I would say it hypothesizes from the evidence that nature has selected over eons in particular ways and therefore will continue to do so, but we are unable to say what any of the particular ways will be, only that they will be selected in some way.
It is weakly scientific. It provides no forward-moving basis for evolution except under a general principle of "survival of the selected" (which, by the way, is not a tautology when "fittest" is substituted for "selected," but is a proposition based on Darwin's discovered relation between organism and environment). That is a very, very weak prediction, being made only under the most general, albeit correct, principle.
This problem can be corrected by taking a massive forward step, theoretically and metaphorically, by saying that DNA mutates and constructs using the available chemicals of the environment, so that our anatomy, for example, is the literal chemical embodiment of the environment, or an aspect of it, and likewise for other species. Predict from chemical construction using a specific landscape of chemicals there to build with, and you have strong scientific prediction.
PS. Convergence provides a vague accessory principle to selection, extending it to say there are standard environmental and anatomical matches that survive better due to selection, thus Dawkins' "different trades". Convergence is consistent with evolution driven by chemical construction using environmental chemicals. My theory, being driven from below and also selected, would drive (create) particular anatomies toward standard structures that are also selected as their "final filter", and thus strongly predictable.