Several readers sent me emails asking whether my paper in Skeptic (v. 11, No. 4) is available online. Until now, it was not. However, the editor of Skeptic, Michael Shermer, has kindly given permission to post it to the internet. While I have submitted it to Talk Reason, its managing editor is currently taking a few days off, so the paper's placement on TR has been delayed (hopefully it will appear there no later than tomorrow). In the meantime, for those impatient to see the paper in question, I have posted it to my personal site (see here).
The <i>Skeptic</i> paper online
37 Comments
Moses · 18 August 2005
Salvador T. Cordova · 18 August 2005
Mark Claims:
"Dembski's definition of information is
I(E) = -log2 p(E)"
He says he got it from page 127 of Dembski's book. Well, what does it say on page 127 of Dembski's book?
"the amount of information in an arbitrary event E as I(E) = -log2p(E)"
Ahhhhh, but there is a subtle sleight of hand here by Mark Perakh. Dembski does not define information as I(E), but the AMOUNT of information.
For example, there is a difference between the information in a file and the amount of information in the file, usually specified in bytes (8 bits = 1 byte). Say a text file contains 80,000 bits of information. The AMOUNT of information in the file is not the same as the information in the file. This should be plainly obvious.
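The formula both sides are quoting is straightforward to evaluate. A minimal Python sketch of I(E) = -log2 p(E), with illustrative probabilities only:

```python
import math

def amount_of_information(p):
    """Amount of information (surprisal) of an event with probability p, in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# A fair coin flip carries 1 bit; one face of a fair die about 2.585 bits.
print(amount_of_information(0.5))     # 1.0
print(amount_of_information(1 / 6))   # ~2.585
```

Rarer events score higher: the formula measures how surprising an event is, not what the event says.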
Yet Perakh tries to imply that Dembski is confusing the AMOUNT of information with information itself. That is not true as can be seen by Dembski's own words.
Perakh has thus inaccurately represented Dembski, and set up a strawman argument. Now, let's see how many critics out there are going to call Perakh on his inaccurate representation???
Salvador
Salvador T. Cordova · 18 August 2005
I would like to amend my wording when I said "sleight of hand". I don't mean to imply Mark deliberately made such an obviously egregious error. He made a mistake and stated something that was not accurate about Bill Dembski's claims.
Mark does not have to agree with Dembski's claims, but if he wishes to refute Dembski's claims, it's a good idea to state them accurately first.
However, it was a mistake nonetheless and something that negates a substantial portion of his argument.
Salvador Cordova
Steviepinhead · 18 August 2005
Hi, Sal!
As long as you're back for your next inevitable comeuppance, are you ready to talk about, um, bamboo? You know the structural grass with which multistory buildings get built?
Ready--at long last--to start in on Lenny's very basic questions?
Oh, and how about addressing the enormous gaping holes in Dembski's logic now on display in the "What else to expect from Dembski" thread?
Rilke's Granddaughter · 18 August 2005
Unfortunately, Sal, it negates nothing. Dembski has no argument whatever. Would you care to deal with the fact that he has
1) never demonstrated the actual existence of CSI in anything?
2) objected to Mark's failure to deal with his mathematics when it is the applicability of the mathematics which is in question?
3) continued to ignore any and all criticism of his work in favor of sniping attacks at his critics over credentials?
As it stands, Dembski has no credibility to speak on this topic whatever: his ignorance of biology is profound (almost as great as yours) and his inability to understand that his mathematics is irrelevant to his argument is glaringly obvious.
Rilke's Granddaughter · 18 August 2005
Oh, and Sal, you might also deal with the fact that Dembski's mathematics has so little credibility that no one doing actual information science pays any attention to it?
Not a really good endorsement for the "Newton" of information science.
But of course, you would have to display some actual intellectual integrity to deal with these issues. I won't hold my breath.
Russell · 18 August 2005
steve · 18 August 2005
Speaking of 'distinction without a difference', I would love to see a writeup about Dembski's moronic claim that algorithms can introduce CSI, but it's fake CSI.
steve · 18 August 2005
Russell · 18 August 2005
Sal: as the self-professed armor bearer for Sir Bill, surely you can be counted upon to provide a list of favorable reviews of Dembski's work by qualified academics.
BC · 18 August 2005
Dembski maintains that average information = entropy.
...
I see no way to reconcile Dembski's LCI with the second law of thermodynamics.
Isn't entropy the opposite of average information - i.e. entropy is a measure of randomness or chaos? So, entropy increases as information decreases, and vice versa. I'm unsure here whether Perakh is confused about what Dembski thinks, whether he (Perakh) is confused about what entropy means, or both. Perakh seems perfectly happy with the idea that entropy is information, so I think perhaps he is confused about what entropy means. Given that, Perakh's claim that "I see no way to reconcile Dembski's LCI with the second law of thermodynamics" is false if you say that entropy is the opposite of information. That isn't to say that Dembski is right. I'm saying that Perakh's criticism of Dembski on this issue is the result of a misunderstanding.
Stephen Erickson · 18 August 2005
Russell · 18 August 2005
Stephen Erickson · 18 August 2005
Stephen Erickson · 18 August 2005
To me it seems the crux of the problem is this:
Information and specification are somewhat opposite concepts. Information refers to a lack of uniformity, while specification refers to limitations. Yet, as Perakh quite convincingly shows, Dembski boils down both concepts to issues of improbability.
Somebody correct me if I'm wrong.
Stephen Erickson · 18 August 2005
Just a quick aside:
How does Miller-Urey fold into Dembski's arguments? Do the results fall out of his explanatory filter? Does the basic result (simple molecules + energy -> complex molecules) contradict his mathematical pie-in-sky?
snaxalotl · 18 August 2005
it's pretty common to use the expression "X" to represent "metric of X". Barring some error that can be pointed out in formal calculations, this is no worse than the possibility that someone will confuse the phrase "one kilogram of steel" with one kilogram of steel
BC · 18 August 2005
In other words, implicitness or explicitness of that negative sign does not affect the argument.
But it does because when he says that "in a closed system entropy cannot spontaneously decrease", is he talking about entropy=information (which is how he's been talking about entropy in the paper, but isn't a correct statement about the second law of thermodynamics), or is he talking about entropy=disorder (which is the only way that statement is true)?
When the second law of thermodynamics states that the total entropy of an isolated system can never decrease, it is saying that the total disorder of an isolated system can never decrease, which is saying that the total order of an isolated system can never increase, which is saying that the total information of an isolated system can never increase. It sounds to me like Dembski is pretty much just restating the second law of thermodynamics in different words. I don't quite understand how Perakh can say Dembski's law is in opposition to the second law of thermodynamics. Where Dembski is wrong is in claiming that information must always be decreasing, even in subsections of a system. In reality, one area can increase in information/decrease in entropy (earth) if another is offsetting it (the sun is losing energy).
Salvador T. Cordova · 18 August 2005
steve · 18 August 2005
Sir_Toejam · 18 August 2005
"... abstractly defined communication channel. Neither does it give one license to equivocate the meaning of "information measure" with CSI, as Mark has done."
er, you amaze me. don't you see that the creation of an abstract definition that isn't based on any realistic or supported basic assumptions is EXACTLY what Dembski has done?
Besides which, you can't use your current missives to try and claim innocence about the reasons why the epithets listed are rather appropriately hurled in your direction. anybody can simply look back at your unending and mindless support of someone who has so often been proven wrong, has lied (well documented, and you know it), and rarely if ever actually addresses legitimate criticisms of his "work" to see where the epithets came from to begin with.
do you now deny your past behavior? I have not seen the epithets used loosely, even around PT. are you even able to look at yourself and see how odd your behavior is?
you still make the best poster boy i could imagine to best describe the incomprehensible behavior exhibited by those who claim to be ID "supporters". you constantly end up being the best argument against ID ever being viewed by real scientists as anything but crankish dogma used to mask the ulterior desires of the terminally delusional.
keep it up, i guess, tho I do pity you like i did JAD.
steve · 18 August 2005
Jim Anderson · 19 August 2005
steve, I think you misunderstood Salvador (as do we all, mostly due to his battles with his own communicative entropy). When he said "reconsider calling Electrical Engineers...," he meant "think about broadening your epithets to include Electrical Engineers"--an attempt at a rejoinder, not a claim about past action.
Mark Perakh · 19 August 2005
Re: comment 43869 by BC. The statement "average information = entropy" is found in Shannon's seminal paper that started information theory, and Dembski has taken it directly from Shannon. It is a commonly accepted definition that follows directly from Shannon's formula for entropy, which is in fact equivalent to the Boltzmann-Gibbs equation except for a constant.
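For readers who want to check the "except for a constant" claim, the standard correspondence between the two formulas can be written out (this is textbook material, not from the paper itself):

```latex
% Shannon entropy (information theory; base-2 logarithm, units of bits):
H = -\sum_i p_i \log_2 p_i

% Gibbs entropy (statistical mechanics; natural logarithm, Boltzmann's constant k_B):
S = -k_B \sum_i p_i \ln p_i

% Since \ln p = (\ln 2)\,\log_2 p, the two differ only by a constant factor:
S = (k_B \ln 2)\, H
```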
Since the comment's writer using the moniker BC seems to be a little uncertain regarding this point, perhaps he (she) would be interested to look up this where some of the related points are discussed.
Moses · 19 August 2005
Mr. Cordova,
I have noticed that other forms of creationism, such as Young Earth and Flat Earth, have testable hypotheses. Sure, they were proven wrong, but hypotheses nonetheless.
With Young Earth, the earth was between 6,000 & 10,000 years old. A little bit of radiometric dating, and "poof" that was thrown in the rubbish bin.
With Flat Earth, just some basics of geometry. All we had to do was remember what Pythagoras figured out 2,500 years ago. And if we didn't remember that, Ptolemy's 8-volume Geography clearly indicated a curved (round) earth. Working off those works, in 1492, Columbus proved, to most, that the Earth was round. And, if that wasn't enough, we had Magellan's expedition circumnavigate the globe in the early 1500s.
But why is it that Intelligent Design predicts nothing and formulates no testable hypotheses? The very basis of any scientific theory, as I have been taught many, many, many times, is that it has to predict something testable. And if it doesn't, it's not science, but philosophy.
So, why are you not spending your time working productively on elevating "Intelligent Design" from philosophy to science instead of trolling message boards? Your behavior is unproductive to the advancement of your theory and smacks of missionary work.
What does Intelligent Design predict and how do we test it? Can you answer that? Can you even point to some time in the near future that someone might actually do that very basic act of science?
Because until ID has a theory and a testable hypothesis, it is just a philosophy. And whether you dress that pig up in circular mathematical arguments or continue the pernicious misrepresentation of the Theory and Fact of Evolution while trolling message boards or going to religious conferences, you've still got a pig.
Russell · 19 August 2005
Oops. Mea culpa. Having reviewed Dr. Perakh's critique, I see that the positive/negative sign in the entropy=information statement is, indeed, important to the argument.
Apologies to BC and Salvador Cordova. I should either read before I write or stay out of math discussions altogether.
But I probably won't. Sal, how's that list of positive academic reviews of Dembski's oeuvre coming along?
steve · 21 August 2005
wildlifer · 23 August 2005
Anyone notice how Salvador's such a fucking beggar, that he's always asking for "charitable" interpretations of his hero's blather?
Pole · 25 August 2005
As it stands, Dembski has no credibility to speak on this topic whatever: his ignorance of biology is profound (almost as great as yours) and his inability to understand that his mathematics is irrelevant to his argument is glaringly obvious.
Interested Reader · 25 August 2005
Wildlifer,
why do you let your feelings get hurt so easily?
Dry those eyes and maybe you'll be able to make better sense of what you're reading.
Interested Reader · 25 August 2005
Perakh,
could you please respond to Sal's critique?
I liked your book very much, and your on-line articles. But I think Sal has a point. Could you please explain your initial point, or at least clarify it?
Thank you.
Mark Perakh · 25 August 2005
Dear "Interested Reader":
Given the very large number of comments on many threads on PT, I usually do not read all of them, the more so because many of them are off-topic. Among the comments in this thread I have not read are those by Salvador Cordova. From previous experience I knew there was a very slim chance anything he said would justify the time spent reading his rants.
Since I let Cordova's comments pass without reading them, naturally I did not respond to them. In my view the usual lack of substance in Cordova's comments speaks for itself anyway. Another reason for not engaging with Cordova's comments is the expectation that, regardless of how well substantiated a rebuttal may be, he will most probably respond with more verbose rants, prolonging an unnecessary discussion, which seems to be his passion.
However, since you request that I clarify the matter, I looked up Cordova's comments in the initial part of this thread. As expected, his comment wherein he claimed to have found an error in my post turned out to be preposterous. Quoting Dembski, I reproduced Dembski's definition of information as I=-log (P) where log is to the base of 2. In Cordova's opinion I committed a grave mistake (either as a deliberate "sleight of hand" or as an inadvertent mistake) because in Dembski's book the quantity I is referred to not as "information" but as an "amount of information," which, asserts Cordova, is a substantial difference. In a subsequent comment, Cordova referred to his original comment in question, asserting that I "mangled" this matter.
Such comments make one shrug.
Just a few examples.
Look up the article "Information Theory" in Van Nostrand's Scientific Encyclopedia. It was written by George R. Cooper, a Purdue University professor and renowned expert in information theory. On page 1355 (fifth edition) Cooper gives the same expression I=-log(P). Nowhere does Cooper use the expression "amount of information." When first introducing this quantity, Cooper refers to it as "informational contents." Continuing, Cooper refers to that quantity as simply "information" (as in the expression "units of information"). Perhaps Cordova should repudiate Cooper for an improper use of terms --- given Cordova's amusing self-confidence, such a repudiation would be in line with his ridiculous "critique" of Elsberry and Shallit's paper and of my essay referred to in this thread.
Look up the well-known standard textbook on information theory by Richard E. Blahut ("Principles and Practice of Information Theory," Addison-Wesley, 1990 edition). On page 55 Blahut introduces the same formula I=-log(P) and refers to it as "amount of information." However, just two lines further down the page he refers to the same quantity as simply "information." These two terms are interchangeable without causing any confusion (except in Cordova's mind?).
In the textbook on information theory by Robert M. Gray ("Entropy and Information Theory," 1991 edition) the expression "amount of information" is not used, while the same quantity I is referred to as "self-information." In many other papers and books, too numerous to be listed here, the same quantity is often referred to as "surprisal." Hence, while the term "amount of information" is legitimate, it is not the only choice of a term for the quantity I. It can be referred to as simply "information" equally legitimately if it is interpreted in a quantitative sense. In my essay it has been stated directly in the text that the term "information" was indeed used in a quantitative sense, i.e. tantamount to "amount of information." The "error" in using the term "information" instead of "amount of information" exists only in Cordova's imagination and has perhaps been caused by Cordova's overarching need to find errors in any critique of his master Dembski, at any cost, even where there is none.
On the other hand, just a couple of pages after introducing the "amount of information" I, Dembski refers to the same I as ... "complexity"! But of course, according to Cordova, Dembski's inconsistencies should always be approached "charitably" and Dembski, if we follow Cordova's charitable approach, never commits errors.
Having read these two comments by Cordova, I did not feel I needed to read the rest of his rants. I have no intention to curtail in any way Cordova's freedom to post here any comments of his choice --- let him expose himself to readers. I have no intention, either, to engage in any further discussion of Cordova's comments. If you, "Interested Reader," insist on some additional clarification, please email me personally outside this thread, which has become overloaded with comments and with every passing week attracts less public attention anyway. Cheers, Mark Perakh
Salvador T. Cordova · 26 August 2005
ts (not Tim) · 26 August 2005
ts (not Tim) · 26 August 2005
BTW, "average information" gets 24,500 Google hits (a bit more than Dembski got for his alleged Schopenhauer quote).
SEF · 26 August 2005
Paul Flocken · 26 August 2005
Salvador,
I've been jumping around between here(PT), ARN, and UncommonDescent to read everything you have written in the last two weeks. I think I can almost wrap my brain around this idea but I did not know the last question I needed to ask to complete the understanding. With what you wrote above you have given me that.
Your two strings of coins above:
"H H H H H T T T T T" and "T T H H H T H T H T"
have identical amounts of information present, but the information is different. I see that. Using your shoebox example, the first string could have been laid down by a human being in a row across the centerline of the box, and the second string is what happens when the box is shaken violently. What I want to know is: how do you quantify the actual information in the two strings, as opposed to just the amount of information? What do you do that distinguishes the two strings? Is this where specification comes in? And is this what Dembski's big breakthrough is about? Can he actually distinguish between the two strings with his mathematics? There is a quote from some scientist about how until you can put a number on something you don't really know anything about it, so I pretty much don't think one has something unless it can be described with numbers. To me, numbers take the subjectivity out of existence and put the objectivity into it.
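Assuming ten independent fair flips, the -log2 p formula does assign both strings the same score, which is the point about "amount" versus content. A Python sketch (illustrative only, not Dembski's own calculation):

```python
import math

def bits_for_sequence(seq, p_per_symbol=0.5):
    """Amount of information (in bits) of one specific sequence of
    independent, equiprobable symbols: each symbol adds -log2(p)."""
    return -len(seq) * math.log2(p_per_symbol)

ordered  = "HHHHHTTTTT"
shuffled = "TTHHHTHTHT"

print(bits_for_sequence(ordered))   # 10.0 bits
print(bits_for_sequence(shuffled))  # 10.0 bits: same amount, different content
```

The formula depends only on the sequence's probability, so anything that distinguishes the two strings has to come from somewhere else (e.g. a specification).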
Instead of using random coin flippings let's use a real world example and take this jpg of a bacterial flagellum and some secretory systems:
http://www.pandasthumb.org/archives/images/Archea_flag.jpg
One of the proteins in that picture is PilQ*. The amino acid sequence for it is:
MLEESAVTRGKWMLAAAWAVVLVGARVHGAELNTLRGLDVSRTGSGAQVV50
VTGTRPPTFTVFRLSGPERLVVDLSSADATGIKGHHEGSGPVSGVVASQF100
SDQRASVGRVLLALDKASQYDVRADGNRVVISVDGTSQSVDAKRAETPAR150
TERMTASVEAKPHPVAAQAPAKVVKAESAAVPKAALPENVVAAEADEREV200
SNPAQHITAMSFADDTLSIRADGDIARYEVLELADPPRLAVDLFGVGLAT250
RAPRVKSGALRDVRVGAHADKVRLVLDVRGTMPAYRVDRANRGLEVVLGR300
AVARTWRRPLRPRAVVASVAEVEPLRQTPVKSDASPVVEVKDVRFEESSS350
GGRIVMKLSGTSGWKVDRPDPRSAVLTLDNARLPKKFERSLDTSALDTPV400
KMISAFSVPGAGGKVRLVVAADGAIEEKVSQSAGTLSWRLDVKGVKTEEV450
AVAQRTAGFTTEAPAYAAEGAPQQARYRGKRVSFEFKDIDIQNLLRVIAE500
ISKKNIVVADDVSGKVTIRLRNVPWDQALDLVLRTKALGKEEFGNIIRIA550
PLKTLEEEARLRQERKKSLQQQEDLMVNLLPVNYAVAADMAARVKDVLSE600
RGSVTVDQRTNVLIVKDVRSNTERARSLVRSLDTQTPQVLIESRIVEANT650
SFSRSLGVQWGGQARAGQATGNSTGLIFPNNLAVTGGVTGTGAGLPDNPN700
FAVNLPTGTGQGVGGAMGFTFGSAGGALQLNLRLSAAENEGSVKTISAPK750
VTTLDNNTARINQGVSIPFSQTSAQGVNTTFVEARLSLEVTPHITQDGSV800
LMSINASNNQPDPSSTGANGQPSIQRKEANTQVLVKDGDTTVIGGIYVRR850
GATQVNSVPFLSRIPVLGLLFKNNSETDTRQELLIFITPRILNRQTIAQT900
L901 (I presume that is standard FASTA format, but I didn't have time to go over the whole website to double-check that is what they were using)
The DNA sequence which codes for that amino acid sequence is:
ATG CTA GAA GAG AGC GCT GTG ACA CGC GGA AAA TGG ATG TTA GCA 15
GCT GCC TGG GCG GTT GTC CTC GTC GGA GCG CGA GTG CAC GGG GCA 30
GAA CTG AAC ACG CTT AGG GGC TTG GAC GTA AGT AGA ACC GGC TCA 45
GGT GCC CAA GTA GTT GTT ACT GGA ACC CGA CCG CCA ACA TTT ACG 60
GTA TTC AGA CTC TCG GGA CCC GAG AGG CTG GTG GTC GAC CTA TCT 75
AGC GCC GAT GCA ACA GGC ATA AAA GGC CAC CAT GAA GGG AGT GGT 90
CCT GTC TCC GGG GTG GTA GCG TCA CAA TTC TCC GAC CAA CGT GCT 105
AGT GTG GGG AGG GTG CTC CTT GCA CTA GAT AAA GCT AGT CAG TAC 120
GAT GTT AGG GCC GAC GGA AAC CGC GTA GTT ATA TCG GTC GAC GGC 135
ACG TCT CAG TCA GTG GAC GCG AAA AGA GCA GAG ACC CCT GCT CGA 150
ACA GAG AGA ATG ACT GCT AGC GTT GAG GCC AAG CCA CAC CCG GTC 165
GCT GCC CAA GCA CCA GCC AAA GTG GTA AAG GCG GAA AGC GCA GCG 180
GTC CCC AAG GCC GCA CTG CCC GAG AAT GTA GTC GCG GCA GAA GCG 195
GAT GAA CGG GAA GTA TCC AAT CCA GCA CAG CAT ATT ACA GCC ATG 210
AGT TTT GCG GAC GAT ACT CTA TCA ATA CGG GCT GAT GGT GAT ATC 225
GCC CGA TAT GAG GTA TTG GAA CTA GCG GAT CCC CCT AGG CTT GCG 240
GTA GAC TTG TTC GGG GTG GGA CTC GCA ACC CGT GCA CCC CGA GTC 255
AAG TCT GGT GCC TTA CGC GAC GTT CGC GTG GGC GCT CAC GCT GAC 270
AAG GTA AGG CTG GTG CTC GAC GTA CGA GGA ACA ATG CCG GCA TAC 285
AGA GTC GAC CGC GCA AAC CGT GGC CTA GAG GTT GTG TTA GGG AGA 300
GCC GTT GCT AGG ACC TGG AGA CGG CCA CTG CGG CCA AGG GCT GTC 315
GTT GCG AGC GTT GCC GAA GTC GAA CCC CTT CGT CAA ACG CCT GTG 330
AAA TCG GAT GCG TCA CCG GTA GTC GAG GTA AAA GAT GTC AGA TTC 345
GAG GAA AGT AGC TCC GGT GGG AGA ATC GTA ATG AAA CTC TCT GGC 360
ACG AGT GGA TGG AAA GTA GAC CGT CCA GAT CCC CGG TCG GCC GTT 375
CTC ACG TTG GAC AAC GCC CGA CTG CCG AAG AAA TTT GAA AGA AGT 390
CTG GAC ACC TCA GCC CTT GAT ACA CCA GTC AAG ATG ATC TCC GCT 405
TTT TCT GTG CCT GGC GCT GGG GGT AAG GTA CGA CTT GTT GTC GCG 420
GCT GAT GGG GCC ATA GAG GAA AAG GTG AGC CAA TCA GCC GGA ACT 435
TTG TCC TGG CGC CTA GAC GTC AAG GGC GTC AAA ACT GAG GAA GTT 450
GCT GTT GCG CAG CGT ACA GCG GGT TTT ACC ACG GAA GCA CCG GCG 465
TAT GCC GCT GAG GGG GCA CCC CAA CAG GCA AGA TAC CGC GGA AAA 480
CGC GTA AGC TTC GAA TTC AAG GAC ATC GAT ATT CAG AAT CTA TTA 495
AGG GTA ATT GCA GAG ATT TCG AAG AAA AAC ATA GTA GTG GCA GAC 510
GAT GTG AGC GGC AAA GTC ACC ATA AGG CTT CGG AAT GTT CCT TGG 525
GAC CAA GCG CTG GAT CTC GTG TTA CGA ACA AAG GCG CTA GGA AAA 540
GAA GAG TTC GGT AAC ATT ATC AGG ATA GCA CCA TTG AAA ACT CTG 555
GAA GAG GAA GCT AGG TTG CGT CAG GAA CGA AAG AAA AGT CTG CAG 570
CAA CAG GAA GAC CTT ATG GTG AAC TTA CTT CCC GTA AAT TAC GCG 585
GTA GCT GCG GAT ATG GCT GCG CGC GTC AAG GAC GTC CTG TCC GAG 600
CGG GGC AGC GTT ACC GTG GAT CAA AGA ACT AAC GTG TTA ATC GTT 615
AAA GAC GTA AGG TCC AAT ACT GAA CGA GCA CGT AGC CTA GTT AGA 630
TCT TTA GAC ACC CAG ACA CCT CAG GTG CTG ATA GAG TCG CGG ATT 645
GTG GAA GCT AAC ACC TCT TTT AGT CGC TCA CTA GGG GTA CAA TGG 660
GGG GGT CAA GCG AGG GCG GGA CAA GCA ACC GGC AAT AGC ACA GGC 675
CTT ATA TTT CCA AAC AAT TTG GCC GTT ACT GGC GGT GTC ACA GGA 690
ACA GGA GCC GGA CTA CCT GAT AAC CCA AAC TTC GCA GTT AAT TTA 705
CCC ACC GGG ACG GGC CAG GGT GTA GGA GGT GCT ATG GGG TTC ACC 720
TTT GGG AGT GCA GGG GGA GCA CTC CAG CTT AAC CTC CGA TTG TCG 735
GCA GCC GAA AAC GAG GGC TCC GTC AAG ACG ATA TCA GCC CCG AAA 750
GTA ACA ACT CTC GAT AAT AAC ACG GCC CGC ATC AAT CAA GGT GTC 765
TCG ATC CCG TTC AGC CAA ACT AGT GCC CAG GGA GTG AAT ACG ACA 780
TTC GTA GAG GCG AGA CTA TCT CTC GAG GTT ACG CCC CAC ATT ACG 795
CAA GAC GGT TCA GTC TTA ATG AGC ATT AAC GCA AGC AAC AAT CAG 810
CCA GAT CCG TCG AGT ACG GGA GCT AAT GGG CAA CCC TCT ATA CAA 825
AGG AAA GAA GCC AAC ACC CAG GTT CTC GTG AAA GAT GGC GAC ACA 840
ACT GTC ATA GGG GGT ATA TAC GTG CGC CGT GGC GCA ACC CAA GTA 855
AAC TCC GTC CCA TTC TTG AGT CGG ATT CCC GTA CTT GGA CTA CTG 870
TTT AAG AAC AAT TCA GAG ACA GAC ACA AGA CAG GAA CTG CTC ATT 885
TTC ATC ACT CCT CGA ATC CTA AAT AGA CAG ACG ATC GCG CAA ACC 900
CTT901
Can you show me what the math Dembski uses has to say about these strings? I have other questions but I think this is all I need to get started.
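One number the I = -log2 p formula does yield directly for a sequence like PilQ requires choosing a chance hypothesis first. Under a uniform, independent hypothesis (an assumption made here purely for illustration, not necessarily Dembski's), the calculation is just length times log2 of the alphabet size:

```python
import math

def uniform_sequence_bits(length, alphabet_size):
    """-log2 of the probability of one specific sequence under a uniform,
    independent chance hypothesis: length * log2(alphabet_size) bits."""
    return length * math.log2(alphabet_size)

# PilQ above is 901 amino acids drawn from a 20-letter alphabet.
print(uniform_sequence_bits(901, 20))  # ~3894 bits
```

Note that any other specific 901-residue sequence gets exactly the same score under this hypothesis, which is why the "amount" alone cannot distinguish a functional protein from a random string.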
By the way, I have ordered NFL from my bookstore and will read it as soon as it arrives. Is it sufficient and stand-alone all by itself, or should I read any of his other books first or afterwards?
Sincerely,
Paul
*I chose this protein because I was able to find it quickly. Sorry it is so long.