Sunday, May 22, 2011

Majority Text: (X): Carson's Objections Reexamined

Let's look at D.A. Carson's first objection to the Majority Text Probability Model:

He suggests historical factors skewed the results, supposedly making minority readings (errors/edits) become majority readings (overcoming the Alexandrian and other text-types).  What mechanical model does he offer?

He cites:
(1) the influence of Chrysostom, 
(2) the restriction and displacement of the Greek language.   
Because of this, he argues, the Byzantine text probably doesn't represent the original text.

Chrysostom comes onto the scene too late to explain the creation and existence of the Byzantine text.  Born in 349 A.D., he was not even baptized until 368 or 373.  He lived as a hermit from 375-377.  He did not become a deacon until 381, and became a priest only in 386.  Between 386 and 398 he became popular as a speaker and commentator, apparently in Antioch.  He was ordained archbishop of Constantinople in 398.  Because of his attempted reforms he was attacked and banished in 403, but was later reinstated.  He was banished again toward Georgia, but died on the way in 407.

Chrysostom used the Byzantine text, but this was already the popular text in Constantinople by this time.  He does not seem responsible for popularizing it himself.  Jerome had used the Byzantine text for his Vulgate translation in 392, and had judged this text to be older than the three current recensions of  
(a)  Lucian          (Antioch, 270-310)    
(b)  Hesychius    (Egypt, c. 320-350?)
(c)  Origen         (Caesarea, c. 200).
Even granting the popularity of Chrysostom's writings long after his death, this alone is simply not enough to cause or explain the dominance of the Byzantine text-type in the Eastern Empire.  Hort had posited two 'recensions': first the Lucianic (assuming this was the pre-Byzantine text), and then a second 'recension' in the 4th century.  But who carried out this second major 'recension', and who imposed it upon the entire Greek-speaking world?  Why was there no resistance, and more importantly, no historical record of it?  Chrysostom is supposed to fit the bill, but there is no evidence that any of this ever happened.  Chrysostom was a controversial figure in his own lifetime, unpopular with the Emperor and many other bishops, and could not have imposed a uniform text on the East.


What about Carson's second idea? 
(2) the restriction and displacement of the Greek language.   

How can this mechanism reverse the position of majority and minority readings?  In fact, it can't.  It is no mechanism at all.  Of course the Latin language finally dominated Western Europe, while Greek, formerly the dominant international language, faded from the stage.  But this provides no mechanism to flip readings upside-down:

Certainly the percentage of total manuscripts changed, with Latin manuscripts gradually outnumbering and even overwhelming Greek manuscripts:
[Chart: Greek and Latin MSS]

However, both copying-streams were essentially separate, and both streams were normal.    There is nothing in this situation that can explain a reversal of minority and majority readings.  

Certainly the Latin MSS (Old Latin / Vulgate) already had essentially the same text as the Greek Byzantine MSS.   This means that changes in the relative percentage of the MSS would not significantly affect the attestation of majority and minority readings. 

The only thing that could happen from the change-over from Greek to Latin, is that a few of the so-called 'Western' readings would gain some support from the overwhelming numbers of Latin MSS.   But this has nothing to do with the Byzantine readings in the Greek transmission stream.  Very few if any Greek MSS in the Eastern Byzantine empire would ever have been corrected using Latin MSS!   The Greeks couldn't care less about the Latin texts being used in the West.

Nor can the "shrinking" of the Greek language and influence cause any reversals between minority and majority readings.  This is pure nonsense.  The raw manuscript count went continually upward, as churches, congregations, and demand expanded.  What decreased here was the rate of increase, not the number of manuscripts!
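The distinction between a falling growth *rate* and a falling *count* can be made concrete with a toy calculation (the numbers below are purely illustrative, not historical data):

```python
# Illustrative only: a copying stream whose growth rate shrinks each
# century still accumulates more manuscripts each century, never fewer.
growth_rates = [0.20, 0.10, 0.05, 0.02]  # hypothetical per-century growth, declining

count = 100.0  # hypothetical starting stock of Greek MSS
history = [count]
for r in growth_rates:
    count *= (1 + r)  # the stock still grows, just more slowly
    history.append(count)

# The count rises monotonically even as the rate of increase falls.
assert all(later > earlier for earlier, later in zip(history, history[1:]))
print([round(c) for c in history])
```

A shrinking rate of increase, in other words, never subtracts a single manuscript from the totals already attesting the majority readings.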

A 'shrinking' of influence would in any case cause a decrease in the score for Byzantine readings, not an increase!  D.A. Carson's logic is what is really upside-down here, not the minority/majority readings.

Nazaroo

(to be continued...)

Thursday, May 19, 2011

Majority Text: (IX): Analyzing Terms and Claims

When the Probability Arguments in favor of the majority readings were first described in detail by Hodges (Pickering, Identity of the NT Text, Appendix C), they were attacked by D. A. Carson and others, who essentially abandoned any precise Divine Preservation of the NT text. 



What is 'Normal' Transmission?

Carson adopted the 19th cent. materialist/rationalist view, that there was nothing miraculous in the textual transmission process: there was no special 'Divine control' over copying, i.e., no supervision, influence, or interference by God to protect the exact wording throughout the ages.

Because nothing in the copying process immediately stood out as supernatural, 19th century critics were convinced there was no such influence.  God was an 'unnecessary hypothesis' for a "scientific" description of textual transmission.  For these investigators, the existence of copying errors and corrections in all manuscripts was taken as evidence against any supernatural intervention.  The copyists were on their own.


Materialism Remains Unproven

The anti-supernatural attitude was prevalent throughout the late 19th and early 20th century.  But in spite of the failure of scientific methods to detect non-material effects, the question of supernaturalism vs. materialism has proven to be a most difficult if not insoluble philosophical problem.  The caution is this: just because something is not obvious, observable, or easy to detect doesn't mean it has no existence.  The same 19th century skepticism would also have rejected radio communication and atomic bombs.  Finally, anti-supernaturalism itself has no place in Christian faith systems.  Belief in an invisible God who intervenes in history is fundamental to both Christianity and Judaism.


The Meaning of 'Normal' in the Probability Model

While textual critics have used the word 'normal' in the sense described above, we must note that it has an entirely different meaning in discussions of the Majority Text Probability Argument.  In this context, 'normal' just means an average process, following a predictable pattern with expected results.  'Abnormal' would not mean 'supernatural', but would rather describe any unusual process or anomaly which resulted in an unexpected outcome.

The Probability Model does not address the question of 'supernaturalism' vs. 'materialism'.  It is not concerned with causes at all.  It is strictly a descriptive model that makes only basic mechanical assumptions about the process, such as the limits of time-direction, the consequences of ordinary transcriptional probabilities, and the effects of processes on statistical results.  As such, the Probability Model is not a 'supernatural' theory, and it makes the same assumptions about the ordinary world that every other scientific model does.  For purposes of analysis, the Probability Model assumes that errors are 'random', undirected events, just as other scientific models would.  But "undirected" here simply means that a process is not under the control of a person or cause which would unnaturally skew ordinary physical events.  So this model is not any kind of argument in favor of supernaturalism; instead it allows the same variety of world-views that other models do.

Because of this approach, the Probability Model cannot offer direct 'proof' of God's providence or Divine Preservation.  It can only offer objective evidence which proponents of such philosophical positions can find either compatible or incompatible with their system of philosophy.  So it is not the responsibility of proponents of the Probability Model to defend supernaturalism, or even to interpret its findings in the light of various world-views.  That must be left to others: theologians, philosophers, and investigators of the supernatural.

What the Probability Model can do is offer a coherent and rational description of the copying process, and from this, evaluate various text-types within a history of textual transmission and also assist in the reconstruction of the original text(s).

(to be continued...)

Sunday, May 8, 2011

Majority Text (VIII): Cross-Pollination - Correction and Mixture

It has been claimed in the past that the problem of "mixture" (the correction of manuscripts and the copying of readings across genealogical lines) negates or destroys any genealogical arguments and claims.

This is simply not true, and shows a poor understanding of the real effects of such activity.  Consider first of all the simple act of double-checking: proof-reading a copy against its own master-copy.  This action will very rarely introduce further errors, but will most often simply correct copying mistakes from the 'first pass'.  The effect of error-checking and correction is quite predictable: the rate of accumulation of errors is drastically reduced.

Error-correction has the main effect of severely retarding any corruption over copying generations, and greatly extending the staying-power of the original readings;  error-checking always increases the percentage score of any and every majority reading (i.e., correct reading).
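A toy simulation makes this effect concrete.  The parameters below (error rates, the fraction of fresh mistakes a proof-reader catches) are purely illustrative assumptions, not claims about actual scribal practice:

```python
import random

def copy_chain(generations, error_rate, correction_rate, trials=2000, seed=1):
    """Average number of accumulated errors after `generations` copies.

    error_rate: expected new errors introduced per copying pass.
    correction_rate: fraction of fresh errors caught when the copy is
    proof-read against its own master (0.0 = no checking at all).
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        errors = 0
        for _ in range(generations):
            # new errors this generation (roughly binomial around error_rate)
            new = sum(1 for _ in range(10) if rng.random() < error_rate / 10)
            # proof-reading against the master removes most fresh mistakes
            survived = sum(1 for _ in range(new) if rng.random() >= correction_rate)
            errors += survived
        total += errors
    return total / trials

unchecked = copy_chain(20, error_rate=2.0, correction_rate=0.0)
checked   = copy_chain(20, error_rate=2.0, correction_rate=0.9)
print(unchecked, checked)  # checking drastically slows error accumulation
```

With 90% of fresh errors caught, the accumulated total after twenty generations falls to roughly a tenth of the unchecked figure, which is the "severe retarding" of corruption described above.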

What happens, however with true "mixture", where readings cross into parallel transmission lines?  The answer is similar, but has an added complexity:

[Diagram: Transmission Model with Mixture]
"Mixture" occurs when a manuscript is corrected from some other copy not involved in or descended from its own transmission branch.   This happens just as in ordinary correction, but now, readings not found in the master-copy can enter into the manuscript and continue in the copying stream.

In the above diagram, blue lines indicate "successful" corrections, that is, cases where the corrector was himself correct in making the change.  Red lines show places where an incorrect reading was copied into a manuscript that originally had the correct one.   We have allowed that good corrections will occur slightly more often than bad ones, which is a reasonable expectation.

On the top-right, a copy containing the Green Packet gets corrected from a very old copy, and has its "Green" errors removed (it now becomes white).  This is one of the most likely scenarios, since early copyists will naturally assume older manuscripts are more accurate (just as modern critics do!).  As a result, the "Green Packet" loses many votes that would have accumulated from this copy.  The errors become even smaller minority readings.  Now we allow a Yellow Packet copy to be 'corrected' by a faulty Green Packet copy, so that it now carries both Yellow and Green errors.  This will not compensate for the loss of the earlier Green Packet, because it comes later.  It fathers instead a peculiar minority 'text-type' or family with mixed readings.

Now on the left side, an early mistake is made: a Yellow Packet copy is 'corrected' by an Orange Packet copy, resulting in a boost of Orange Packet readings.  The Yellow Packet readings are unaffected.  Even if this corrupted copy is recopied twice more (not shown), the Orange Packet manuscripts will only amount to 10 copies out of 26 (38%, up 5% from 33%), remaining minority readings.

Correcting a Red Packet copy using an Orange Packet copy does nothing for Orange readings, however!  In this case the Red Packet readings decrease, but the Orange readings were already in this copy, so there are no gains.  It is only the Red readings that get corrected.  Since this is more likely than not (similar copies will be found in similar geographic regions), minority readings will lose out more than half the time.  In this case, "Mixture" has only purified the transmission stream, and this is actually the most common scenario, even when correcting from diverse copies.

Again, when an Orange Packet copy is corrected by a Yellow Packet, the only net result is purification of the copying stream.   The errors in the Yellow Packet are already present in the Orange copy and no correction is made there.  Only Orange readings are removed.   It is perfectly reasonable and effective to correct a copy from another copy with errors.   The average result will not be any increase in errors, but usually only an exchange, with as many Error Packets getting corrected as there are Error Packets getting perpetuated.

The error-count within an Error-Packet is not relevant here (i.e., the 'size' of the Error Packet).   Of course Error Packets can be of different sizes and degrees of seriousness.    But they can only be transmitted manuscript to manuscript in groups, and each act of copying a manuscript must be treated as a single discrete event.  We cannot switch back and forth between Error-Packets and errors within a packet indiscriminately, as this would violate proper analysis of the error transmission process.

Again as in the non-Mixture model, varying copying rates only moderately affect minority readings, mostly in a random fashion and not with the consistency needed to cause minority readings to become majority readings.

(to be continued...)

Nazaroo

Monday, May 2, 2011

Chronology of Printed GNTs: 1600 - 1700


The 17th Century (1600 - 1700)
The 17th century is characterized by increased interest and activity in collating existing and available manuscripts.  In part this was driven by doctrinal disputes between Protestants and Roman Catholic authorities, and in part by a desire to establish an authoritative text for the Reformers, as against temporal authorities like church organizations.  Awareness of and interest in textual variants grew, especially as discrepancies arose between the popular texts and those of a few very ancient copies, such as Codex D (Bezae), Codex A (Alexandrinus), and Codex B (Vatican MS 1209).  Readings from these MSS as well as others were gathered into the evolving 'apparatus' of critical notes.

Theodore Beza, as noted in the previous post, began to gather and critically assess the readings of fresh MSS, as well as the most ancient in his possession, namely Codex Bezae Cantabrigiensis (Gospels and Acts, 5th cent.) and Codex Claromontanus (Pauline Epistles, 6th cent.).

Both Beza's and Stephen's texts were used, in combination with earlier translations, to produce the King James Bible, which became the de facto standard in English, supplanting the Bishops' Bible and other earlier works.

Brian Walton (1657) next published a polyglot (multi-language NT) with variants, incorporating readings from Codex Alexandrinus ("A" - the oldest MS then known), as well as collations of 16 new MSS supplied by Archbishop Ussher.

John Fell (1675) then added a small edition with further collations and citations of the Memphitic (Lower Egyptian) and the ancient Gothic versions, the latter made soon after the time of Constantine (4th century).  This was the first time a translation other than the Latin was used critically.

John Mill (1707), assisted by Bishop Fell (d. 1686), now produced the crowning achievement of the century.  Mill still followed closely the text of Stephen, but with printing and other corrections.  Dr. Scrivener (1881) describes the work: "Of the criticism of the NT in the hands of Dr. John Mill it may be said, that he found the edifice of wood, and left it marble."
Mill was aware of the danger of rash judgements, and like Maestricht, was more focussed on the classification of documents, a necessary preliminary to future critical editing.

The beginning of the 18th century also saw one bizarre and unfortunate additional publication, an attempt which would be repeated in subsequent centuries:

Nicholas Toinard (1707), a Roman Catholic priest from Orleans, published simultaneously alongside Mill.  M. Vincent (1899) tells us:
"Toinard was the first Roman Catholic since Erasmus, and the last before Scholtz (1830), who undertook a critical edition. In his Prolegomena he announces that he has made a Greek Testament according to the two oldest Vatican codices and the Old Latin Version, where it agreed with them. He was thus working on the same principle afterward proposed by Bentley." 

Toinard however had also previously published a 'Harmony' of the Gospels:
"Locke's interest in the harmony of the Gospel narrative was quickened during his travels in France, when he met Nicolas Toinard. In December 1678, Toinard presented him with the sheets of his Harmony of the Gospels; and in the same year, Locke inscribed in a notebook a fragment of a harmony of the life of Jesus.39 This chronology of the history of Jesus, from the annunciation of the birth of John the Baptist to Jesus’ baptism by John, follows Toinard. Noteworthy in the sequence of texts is the location of the prologue to St John’s Gospel. Locke places it, following Toinard, after the baptism. While the relocation of the prologue may raise suspicions of Socinianism, Toinard’s accompanying comment, that the prologue, even although relocated in the history of the Gospel, signifies ‘the eternal and divine origin of the word, that is, of Jesus Christ’, offers a ready, although perhaps insufficient, assurance of orthodoxy."
(Christianity, Antiquity, and Enlightenment: Interpretations of Locke, V. Nuovo, (Springer, 2011), Ch. 2: Locke's Theology)
This same Toinard, then, seems to be the "M. Toinard" [bad scan?] mentioned but misnamed in Illustrations of Biblical Literature, ....Rev. James Townley, (1833?) p. 28:
"In some parts of the Pentateuch, transpositions appear to have taken place, by which the chronological order is interrupted; ... Father Simon, and Dr. A. Clarke suppose, that by being inscribed upon leaves, or portions of bark or papyrus, the [pages] were very liable to be deranged, especially as [they lacked pagination].  But Dr. Kennicott conjectures, that many of the first manuscripts were upon skins sewed together; and that these transpositions were occasioned by the skins being separated, and afterward misplaced; and finds a singular instance in a roll preserved in the Bodleian library, at Oxford.   Mr. Whiston and M. Toinard have attempted to prove similar transpositions in the NT, from the same cause; but have been successfully refuted by the Rev. Jeremiah Jones, in his  Vindication of the former part of St. Matthew's Gospel, ch. xiv."

The key point here is that long before there were any coherent theories or developed practices (i.e., a scientific methodology), we have a Roman Catholic priest proposing to abandon all evidence except the two oldest manuscripts (conveniently owned by the Vatican) and the Old Latin.  As Vincent noted, this is the same proposal as Richard Bentley's (1716), only it clearly originates nearly 10 years earlier with a Roman Catholic committed to reversing the Protestant Reformation.  Toinard's wild ideas concerning the drastic rearrangement of John's Gospel are also a serious red flag, indicating typical Roman Catholic flights of fancy, later to be picked up and carried out by Bultmann (1941).

mr.scrivener