Interlanguage Development: Some Evidence


As a follow-up to my two previous posts, here’s some information about interlanguage development.

Doughty and Long (2003) say

There is strong evidence for various kinds of developmental sequences and stages in interlanguage development, such as the well known four-stage sequence for ESL negation (Pica, 1983; Schumann, 1979), the six-stage sequence for English relative clauses (Doughty, 1991; Eckman, Bell, & Nelson, 1988; Gass, 1982), and sequences in many other grammatical domains in a variety of L2s (Johnston, 1985, 1997). The sequences are impervious to instruction, in the sense that it is impossible to alter stage order or to make learners skip stages altogether (e.g., R. Ellis, 1989; Lightbown, 1983). Acquisition sequences do not reflect instructional sequences, and teachability is constrained by learnability (Pienemann, 1984).

Let’s take a look at the “strong evidence” referred to, beginning with Pit Corder and error analysis.


Pit Corder: Error Analysis

Corder (1967) argued that errors were neither random nor simply the product of L1 transfer; rather, they were evidence of learners’ attempts to figure out an underlying rule-governed system. Corder distinguished between errors and mistakes: mistakes are slips of the tongue, whereas errors are indications of an as yet non-native-like, but nevertheless systematic, rule-based grammar. Interesting and provocative as this was, error analysis failed to capture the full picture of a learner’s linguistic behaviour. Schachter (1974) compared the compositions of Persian, Arabic, Chinese and Japanese learners of English, focusing on their use of relative clauses. She found that the Persian and Arabic speakers made far more errors, but when she went on to look at total production she found that the Chinese and Japanese students produced only half as many relative clauses as the Persian and Arabic students did. Schachter then looked at the students’ L1s: Persian and Arabic relative clauses resemble English ones in that the relative clause is placed after the noun it modifies, whereas in Chinese and Japanese the relative clause comes before the noun. She concluded that Chinese and Japanese speakers of English use relative clauses cautiously but accurately because of the distance between the way relative clauses are formed in their L1 and in English. So things are not so straightforward: one needs to look at what learners get right as well as what they get wrong.


The Morpheme Studies

Next came the morpheme order studies. Dulay and Burt (1974a, 1974b) claimed that fewer than 5% of errors were due to native language interference and that errors were, as Corder suggested, in some sense systematic; something akin to a Language Acquisition Device seemed to be at work not just in first language acquisition, but also in SLA.

Brown’s (1973) morpheme studies led him to claim that L1 learners acquire the morphemes below in the following order:

1 Present progressive (-ing)

2/3 in, on

4 Plural (-s)

5 Past irregular

6 Possessive (-’s)

7 Uncontractible copula (is, am, are)

8 Articles (a, the)

9 Past regular (-ed)

10 Third person singular (-s)

11 Third person irregular

12 Uncontractible auxiliary (is, am, are)

13 Contractible copula

14 Contractible auxiliary

This led to studies in L2 by Dulay and Burt (1973, 1974a, 1974b, 1975) and Bailey, Madden and Krashen (1974), all of which suggested that there was a natural order in the acquisition of English morphemes, regardless of L1. This became known as the L1 = L2 Hypothesis, and further studies by Ravem (1974), Cazden, Cancino, Rosansky and Schumann (1975), Hakuta (1976) and Wode (1978) all pointed to systematic, staged development in SLA.

Some of these studies, particularly those of Dulay and Burt, and of Bailey, Madden and Krashen, were soon challenged, but over fifty L2 morpheme studies have since been carried out using more sophisticated data collection and analysis procedures, and the results of these studies have gone some way to restoring confidence in the earlier findings.


Selinker’s Interlanguage

The third big step was Selinker’s (1972) paper, which argues that L2 learners have their own autonomous mental grammar (which came to be known as interlanguage grammar), a grammatical system with its own internal organising principles. One of the first staged sequences of this interlanguage to be identified was that for ESL questions. In a study of six Spanish-speaking students over a 10-month period, Cazden, Cancino, Rosansky and Schumann (1975) found that the subjects produced interrogative forms in a predictable sequence:

  1. Rising intonation (e.g., He works today?)
  2. Uninverted WH (e.g., What he (is) saying?)
  3. Overinversion (e.g., Do you know where is it?)
  4. Differentiation (e.g., Does she like where she lives?)

Then there was Pica’s (1983) study, which suggested that learners from a variety of L1 backgrounds go through the same four stages in acquiring English negation:

  1. External (e.g., No this one. / No you playing here)
  2. Internal, pre-verbal (e.g., Juana no/don’t have job)
  3. Auxiliary + negative (e.g., I can’t play the guitar)
  4. Analysed don’t (e.g., She doesn’t drink alcohol)

Apart from these two examples, we may cite the six-stage sequence for English relative clauses (see Doughty, 1991 for a summary) and sequences in many other grammatical domains in a variety of L2s (see Johnston, 1997).


Pienemann’s 5-Stage Sequence

Perhaps the most extensive and best-known work in this area has been done by Pienemann, whose Processability Theory started out as the Multidimensional Model, formulated by the ZISA group, based mainly at the University of Hamburg, in the late 1970s. One of the first findings of the group was that all the child and adult learners of German as a second language in the study adhered to the five-stage developmental sequence shown below:

Stage X – Canonical order (SVO)

die kinder spielen mim ball //// the children play with the ball

(Romance learners’ initial SVO hypothesis for German as a second language word order is correct in most German sentences with simple verbs.)

Stage X + 1 – Adverb preposing (ADV)

da kinder spielen //// there children play

(Since German has a verb-second rule, requiring subject-verb inversion following a preposed adverb (there play children), all sentences of this form are deviant. The verb-second (or ‘inversion’) rule is only acquired at stage X + 3, however. The adverb-preposing rule itself is optional.)

Stage X + 2 – Verb separation (SEP)

alle kinder muss die pause machen //// all children must the break have

(Verb separation is obligatory in standard German.)

Stage X + 3 – Inversion (INV)

dann hat sie wieder die knoche gebringt //// then has she again the bone brought

(Subject and inflected verb forms must be inverted after preposing of elements.)

Stage X + 4 – Verb-end (V-END)

er sagte, dass er nach hause kommt //// he said that he home comes

(In subordinate clauses, the finite verb moves to final position.)

Learners did not abandon one interlanguage rule for the next as they progressed; they added new ones while retaining the old, and thus the presence of one rule implies the presence of earlier rules.

A few words about the evidence. There is the issue of what it means to say that a structure has been acquired, and I’ll just mention three objections that have been raised. In the L1 morpheme studies, a structure was assumed to be acquired when it was supplied in at least 90% of obligatory contexts in three successive samples. The problem with such a measure is, first, how one defines an “obligatory” context, and second, that by dealing only with obligatory contexts it fails to look at how the morphemes might occur in incorrect contexts. The second example is that Pienemann takes acquisition of a structure to be the point at which it emerges in the interlanguage, its first “non-imitative use”, which many say is hard to operationalise. A third example is this: in work reported by Johnson, statistical measures have been used with an experimental group of L2 learners and a control group of native speakers; the performance of both groups is measured, and if the L2 group’s performance is not significantly different from the control group’s, the L2 group can be said to have acquired the structure under examination. Again, one might well question this measure.

To return to developmental sequences, by the end of the 1990s, there was evidence of stages of development of an interlanguage system from studies in the following areas:

  • morphemes
  • negation
  • questions
  • word order
  • embedded clauses
  • pronouns
  • references to the past



Together these studies lend very persuasive support to the view that L2 learners follow a fairly rigid developmental route. Moreover, it was seen that this developmental route sometimes bore little resemblance to either the L1 of the learner, or the L2 being learnt. For example, Hernández-Chávez (1972) showed that although the plural is realised in almost exactly the same way in Spanish and in English, Spanish children learning English still went through a phase of omitting plural marking. It had been assumed prior to this that second language learners’ productions were a mixture of both L1 and L2, with the L1 either helping or hindering the process depending on whether structures are similar or different in the two languages. This was clearly shown not to be the case. All of which was taken to suggest that SLA involves the development of interlanguages in learners, and that these interlanguages are linguistic systems in their own right, with their own sets of rules.

There are lots of interesting questions and issues that I haven’t even mentioned here about interlanguage development in general and about orders of acquisition in SLA in particular. It’s worth pointing out that Corder’s and Selinker’s initial proposal of interlanguage as a construct was an attempt to explain the phenomenon of fossilisation. As Tarone (2006) says:

Second language learners who begin their study of the second language after puberty do not succeed in developing a linguistic system that approaches that developed by children acquiring that language natively. This observation led Selinker to hypothesize that adults use a latent psychological structure (instead of a LAD) to acquire second languages.  

The five psycholinguistic processes of this latent psychological structure that shape interlanguage were hypothesized (Selinker, 1972) to be (a) native language transfer, (b) overgeneralization of target language rules, (c) transfer of training, (d) strategies of communication, and (e) strategies of learning.

It wasn’t long before Krashen’s Monitor Model claimed that there was no evidence of L1 transfer in the morpheme studies, denied the central role of L1 transfer which the original Interlanguage Hypothesis gave it, and also denied that there were sensitive (critical) periods in SLA. Generativist studies of SLA also minimised the role of L1 transfer. And there have been some important updates on the interlanguage hypothesis since the 1980s, too (see Tarone (2006) and Hong and Tarone (2016) for example).

My main concern in discussing interlanguage development, as you must be all too well aware by now, is to draw attention to the false assumptions on which coursebook-based ELT is based. Coursebooks assume that structures can be learned on demand. If this were the case, then acquisition sequences would reflect the sequences in which coursebooks present them, but they do not. On the contrary, the acquisition order is remarkably resilient to coursebook presentation sequences. Long (2015, p. 21) gives some examples to demonstrate this:

… by Pica (1983) for English morphology by Spanish-speaking adults, by Lightbown (1983) for the present continuous -ing form by French-speaking children in Quebec being taught English as a second language (ESL) using the Lado English series, by Pavesi (1986) for relative clauses by children learning English as a foreign language (EFL) in Italy and Italian adults learning English naturalistically in Scotland, and by R. Ellis (1989) for English college students learning word order in German as a foreign language.

Long goes on to point out that accuracy orders and developmental sequences found in instructed settings match those obtained for the same features in studies of naturalistic acquisition, and that the striking commonalities observed suggest powerful universal learning processes are at work. He concludes (Long, 2015, p.23):

… instruction cannot make learners skip a stage or stages and move straight to the full native version of a construction, even if it is exclusively the full native version that is modelled and practiced. Yet that is what should happen all the time if adult SLA were a process of explicit learning of declarative knowledge of full native models, their comprehension and production first proceduralized and then made fluent, i.e., automatized, through intensive practice. One might predict utterances with occasional missing grammatical features during such a process, but not the same sequences of what are often completely new, never-modelled interlingual constructions, and from all learners.

While practice has a role in automatizing what has been learned, i.e., in improving control of an acquired form or structure, the data show that L2 acquisition is not simply a process of forming new habits to override the effects of L1 transfer; powerful creative processes are at work. In fact, despite the presentation and practice of full native norms in focus-on-forms instruction, interlanguages often stabilize far short of the target variety, with learners persistently communicating with non-target-like forms and structures they were never taught, and target-like forms and structures with non-target-like functions (Sato 1990).



That’s a taste of the evidence. We can’t conclude from it, as a few insist, that there’s no point in any kind of explicit teaching, but it does mean that, in Doughty and Long’s words (2003):

The idea that what you teach is what they learn, and when you teach it is when they learn it, is not just simplistic, but wrong.

The dynamic nature of SLA means that distinguishing between different stages of interlanguage development is difficult – the stages overlap, and there is variation within stages – and so the simplistic view of a “Natural Order”, where a learner starts from Structure 1 and eventually reaches, let’s say, Structure 549, is absurd. Imagine trying to organise stages such as those identified by Pienemann into ordered sets! As Gregg (1984) points out:

If the structures of English are divided into varying numbers of ordered sets, the number of sets varying according to the individual, then it makes little sense to talk about a ‘natural order’. If the number of sets varies from individual to individual, then the membership of any given set will also vary, which makes it very difficult to compare individuals, especially since the content of these sets is virtually completely unknown.

So the evidence of interlanguage development doesn’t mean that we can design a syllabus which coincides with any “natural order”, but it does suggest that we should respect the learners’ internal syllabuses and their developmental sequences, which most coursebooks fail to do. Doughty and Long (2003) argue that the only way to respect the learner’s internal syllabus is

by employing an analytic, not synthetic, syllabus, thereby avoiding futile attempts to impose an external linguistic syllabus on learners (e.g., the third conditional because it is the third Wednesday in November), and instead, providing input that is at least roughly tuned to learners’ current processing capacity by virtue of having been negotiated by them during collaborative work on pedagogic tasks.

Long has since (Long, 2015) given a full account of his own version of task-based language teaching, and whether or not we are in a position to implement a similar methodology in our own teaching situations, at least we can agree that we’d be well-advised to concentrate more on facilitating implicit learning than on explicit teaching, to give more carefully-tuned input, and to abandon the type of synthetic syllabus used in coursebooks in favour of an analytic one.



Sorry, I can’t give all the references here. These are a few “key” texts. Tarone (2006), free to download (see below), is a good place to start.

Adjemian, C. (1976) On the nature of interlanguage systems. Language Learning 26, 297-320.

Bailey, N., Madden, C. and Krashen, S. (1974) Is there a “natural sequence” in adult second language learning? Language Learning 24, 235-243.

Corder, S. P. (1967) The significance of learners’ errors. International Review of Applied Linguistics (IRAL) 5, 161-169.

Corder, S. P. (1981) Error Analysis and Interlanguage. Oxford: Oxford University Press.

Doughty, C. and Long, M. H. (2003) Optimal psycholinguistic environments for distance foreign language learning. Downloadable here:

Dulay, H. and Burt, M. (1974a) Errors and strategies in child second language acquisition. TESOL Quarterly 8, 12-36.

Dulay, H. and Burt, M. (1974b) Natural sequences in child second language acquisition. Language Learning 24, 37-53.

Gregg, K. R. (1984) Krashen’s Monitor and Occam’s razor. Applied Linguistics 5, 79-100.

Hong, Z. and Tarone, E. (eds.) (2016) Interlanguage: Forty Years Later. Amsterdam: Benjamins.

Krashen, S. (1981) Second Language Acquisition and Second Language Learning. Oxford: Pergamon Press.

Long, M. H. (2015) Second Language Acquisition and Task-Based Language Teaching. Oxford: Wiley-Blackwell.

Nemser, W. (1971) Approximative systems of foreign language learners. IRAL 9, 115-123.

Schachter, J. (1974) An error in error analysis. Language Learning 24, 3-17.

Selinker, L. (1972) Interlanguage. IRAL 10, 209-231.

Selinker, L. (1992) Rediscovering Interlanguage. London: Longman.

Tarone, E. (1988) Variation in Interlanguage. London: Edward Arnold.

Tarone, E. (2006) Interlanguage. Downloadable here:



What good is relativism?


Scott Thornbury (2008) asks “What good is SLA Theory?”. This is a question beloved of populists, all of whom agree that it’s of no use to anyone, except the rarefied crackpots who dream it up. Thornbury sets the tone of his own populist piece by saying that most teachers display a general ignorance of, and indifference to, SLA theory, due to “the visceral distrust that most practitioners feel towards ivory-tower theorising”. If he’d said that most English language teachers have an ingrained distrust of academic research into language learning, we might have asked him for some evidence to support the assertion, but who can question that ivory-tower theorists are not to be trusted? Note how Thornbury, who teaches a post-graduate course on theories of SLA at a New York university, and who has published many articles in serious, peer-reviewed journals, smears academics with the “ivory tower” brush, while himself sidling up to the hard-working, down-to-earth sceptics who read English Teaching Professional magazine.

Thornbury gives a brief sketch of four types of SLA theory and then four reasons why “knowledge of theory” is a good thing for teachers. But you can tell that his heart’s not in it. He knows perfectly well that “knowledge” of the theories of SLA he mentions is of absolutely no use to anybody unless those theories are properly scrutinised and evaluated, but, rather than attempt any such evaluation, Thornbury prefers to devote the article to reassuring everybody that there’s no need to take SLA theories too seriously.

To help him drive home this anti-intellectual message, Thornbury turns to “SLA heavyweight” John H Schumann. Most SLA scholars regard the extreme relativist position Schumann adopts in his 1983 article as almost comically preposterous, while his acculturation theory is about as “heavyweight” as Dan Brown’s theory of the Holy Grail.  But anyway, judge for yourself.  Schumann (1983) suggests that theory construction in SLA should be regarded not as a scientific task, but as a creative endeavour, like painting. Rather than submitting rival theories of SLA to careful scrutiny, looking for coherence, logical consistency and empirical adequacy, for example, Schumann suggests that competing theories of SLA should be evaluated in the same way that one might evaluate different paintings.

“When SLA is regarded as art not science, Krashen’s and McLaughlin’s views can coexist as two different paintings of the language learning experience… Viewers can choose between the two on an aesthetic basis favouring the painting which they find phenomenologically true to their experience.”

Thornbury seems to admire this suggestion. He comments:

“This is why metaphors have such power. We tend to be well disposed to a theory if its dominant imagery chimes with our own values and beliefs. If we are inclined to think of learning as the meeting of minds, for example, an image such as the Zone of Proximal Development is more likely to attract us than the image of a black box.”

Schumann’s paper was an early salvo in what, ten years later, turned into a spirited war between academics who adopted a relativist epistemology and those who held to a rationalist epistemology. The war is still raging, and, typically enough, Thornbury stays well clear of the front line, while maintaining friendly relations with both camps. But let’s be clear: relativism, even if not often taken to the extreme Schumann takes it to, is taken seriously by many academics, including Larsen-Freeman and sometimes (depending on how the wind’s blowing) by Thornbury himself. Rational criteria for the evaluation of rival theories of SLA, including logical consistency and the weighing of empirical evidence, are abandoned in favour of the “thick description” of different “stories” or “narratives”, all of them deemed to have as much merit as each other. Relativists suggest that trying to explain SLA in the way that rationalists (or “positivists”, as relativists like to call them) do is no more than “science envy”, and basically a waste of time. Which is actually the gist of Thornbury’s argument in the 2008 article discussed here.

In response to this relativist position, let me quote Larry Laudan, who says:

“The displacement of the idea that facts and evidence matter by the idea that everything boils down to subjective interests and perspectives is—second only to American political campaigns—the most prominent and pernicious manifestation of anti-intellectualism in our time.”


Thornbury asks “What good is SLA theory?” without making any attempt to critically evaluate the rival theories he outlines. But then, why should he? After all, if you adopt a relativist stance, then no theory is right, none is of much importance, so why bother to sort them out? Instead of going to all that unnecessary trouble, all you have to do is take a quick look at Thornbury’s little summary in Table 1 and choose the theory that grabs you, or rather, choose the “dominant metaphor” which best chimes with your own values and beliefs. And if you can’t be bothered to check out which theory goes best with your values and beliefs, then why not use some other, equally arbitrary subjective criterion? You could toss a coin, or stare intently at a piece of toast, or ask Jeremy Harmer.

“What good is SLA theory?” is actually a very stupid question. It’s as if “SLA theory” were some sort of uncountable noun, like toothpaste. What good is toothpaste? It doesn’t actually make much difference to brushing your teeth. But “SLA theory” is not uncountable; some SLA theories are very bad, and some are very good, and consequently we need to agree on criteria for evaluating them, so as to concentrate on what we can learn from the best ones. Instead of pandering to the misinformed view that SLA theories are equally unscientific, equally based on metaphors, equally relative in their appeal, Thornbury could have used the space he had in the journal to examine – however “lightly” – the relative merits of the theories he discusses, and the usefulness to teachers of the best theories. He could have mentioned some of the findings of psycholinguistic research into the influence of the L1; age differences and sensitive periods; error correction; incomplete trajectories; explicit and implicit learning; and much besides. He could have mentioned one or two of the most influential current hypotheses about SLA, for example that instruction can influence the rate but not the route of interlanguage development.

He could have also pointed out that those adopting a relativist epistemology have achieved very little; that Larsen-Freeman’s exploration of complexity theory has achieved precisely nothing; that his own attempts to use emergentism to conjure up “grammar for free” have been equally woeful; and that the relativists he supports are more responsible than anyone else for the popular view that academics sit in an ivory tower writing unintelligible articles packed with obscurantist jargon for publication in journals that only they bother to read.


Laudan, L. (1990) Science and Relativism: Dialogues on the Philosophy of Science. Chicago: University of Chicago Press.

Schumann, J. H. (1983) Art and science in SLA research. Language Learning 33, 409-475.

Why PPP makes no sense at all. A reply to Anderson




I made a comment on Jason Anderson’s blog in reply to his post The PPP Saga Ends. It hasn’t appeared, so here’s an amended version.

Hi Jason,

An interesting journey, and it makes good reading. You make an impressive attempt to defend the indefensible, and there are lots of good references, even if you play fast and loose with what your sources actually say.

To the issues, then.

First, let’s establish what we know about the SLA process after 50 years of SLA research. Students do not learn target forms and structures when and how a teacher decrees that they should, but only when they are developmentally ready to do so. Studies in interlanguage development have shown conclusively that L2 learners exhibit common patterns and features across differences in learners’ age and L1, acquisition context, and instructional approach. Independent of those and other factors, learners pass through well-attested developmental sequences on their way to mastery of target-language structures, or, as is often the case, to an end-state short of mastery.

Acquisition of grammatical structures (and also of pronunciation features and some lexical features such as collocation), is typically gradual, incremental and slow, sometimes taking years to accomplish. Development of the L2 exhibits plateaus, occasional movement away from, not toward, the L2, and U-shaped or zigzag trajectories rather than smooth, linear contours. No matter what the order or manner in which target-language structures are presented to them by teachers, learners analyze the input and come up with their own interim grammars, the product broadly conforming to developmental sequences observed in naturalistic settings. They master the structures in roughly the same manner and order whether learning in classrooms, on the street, or both.

That’s what we know. As a result this statement is plain wrong:

while research studies conducted between the 1970s and the 1990s cast significant doubt on the validity of more explicit, Focus on Forms-type instruction such as PPP, more recent evidence paints a significantly different picture.

It does not. No study conducted in the last 20 years has come up with evidence to challenge the established claim that explicit focus on forms such as PPP can do nothing to alter the route of interlanguage development. As Ortega (2009) states in her summary of SLA findings:

Instruction cannot affect the route of interlanguage development in any significant way.

Teaching is constrained by the learners’ own powerful cognitive contribution, and to assume that learners will learn what they’re taught when they’re taught it using a PPP paradigm is false.

These statements are also false:

  • we have no evidence that PPP is less effective than other approaches
  • writers in academia have neither evidence nor theoretical justification for criticising coursebook writers 
  • The research on which writers such as Michael Long have based their promotion of focus on form is scant

But let’s get to the heart of the matter, which is really quite simple. You base your arguments on a non-sequitur  that appears throughout your paper. It’s this:

There is evidence to support explicit (grammar) instruction, therefore there is evidence to support the “PPP paradigm”.

It’s generally accepted, and a non-controversial opinion, that explicit instruction has an important role to play in classroom-based SLA, but it doesn’t follow that PPP is a good approach to classroom-based ELT. PPP runs counter to a mass of SLA research findings, and that’s that. There is nothing, I repeat nothing, in “recent evidence from research studies” that supports PPP as an approach to classroom teaching. You appeal to evidence for the effectiveness of explicit grammar teaching to support the argument that students will learn what they’re taught in class by a teacher implementing a synthetic syllabus based on the presentation, practice and production of a sequence of chopped-up bits of the language, thus making a schoolboy error in logic.

The rest of your paper says absolutely nothing to rescue a PPP approach from the fundamental criticism that students don’t learn an L2 in the way it assumes they do. The paper consists of a series of non-sequiturs and unsupported assertions which attempt to argue that the way the majority of institutions go about ELT is necessarily the best way.

To say that the PPP approach is popular with students, that coursebooks are consumer-driven, that PPP is attractive to low-income countries, and that this is evidence to support a “PPP paradigm”, is patently ridiculous. The remarks about low-income countries are also patronising and arrogant. You make a naive appeal to an “apples and pears” collection of factors that need to be carefully examined and distinguished. I won’t go into any proper analysis now, but, just for example, the multi-billion-dollar ELT coursebook industry is not so much driven by the opinions of the end users as by the language teaching institutions, both public and private, that deliver foreign language courses to them. For these institutions, the coursebook is convenient: it packages the otherwise “messy” thing that is language learning. Which is not to say that it wouldn’t be cheaper, better, more efficient, and more rewarding for everybody if the coursebook were abandoned in favour of locally-produced materials used in a more learner-centred approach.

Likewise, to say in reply to Neill that

the notion of ‘linear progress’ is a reflection of a much wider tendency in curricula and syllabi design. Given that the vast majority of English language teaching in the world today is happening in state-sponsored primary and secondary education, where national curricula perform precisely this role, we can predict to a large extent that top down approaches to language instruction are going to dominate for the foreseeable future

is to give absolutely no justification for such top down approaches to language instruction. Yes, as a matter of fact, they dominate ELT today, but that’s no argument in their favour, now is it?

You fail to address the arguments for a learner-centred approach, or any version of the process syllabus suggested by Breen. Those of us who oppose PPP do so not only because it contradicts what we know about SLA, but also because it adopts a pedagogy where students are given no say in the decisions that affect their learning, where the commodification of education goes unchallenged, and where Freire’s “banking” view of education rules. To oppose the way ELT is currently organised is not unrealistic, any more than opposing the privatisation of education in the UK is; but it is difficult. Whatever one’s views, the kind of faux-academic baloney present in your paper really doesn’t help.

Finally, your long quote from Ur in reply to Neill is just one more example of argument by assertion. She’s good at this kind of stuff, and I’m not surprised that you like it, but it’s pure rhetoric. She says “such features as students’ socio-cultural background, relationships, personalities; motivation;” etc., etc., “often actually have more influence on how grammar is taught, and whether it is successfully learnt, than any of those dealt with in research”. This ignores all the research that has been done into those features, and it provides no evidence or arguments to challenge SLA research findings with regard to the development of interlanguages.


Ortega (2009) “Sequences and processes in language learning”. In Long and Doughty (2009) Handbook of Language Teaching. Wiley

Two Plenaries at the Chile IATEFL Conference, July 2016


I’ve just been watching YouTube videos of the IATEFL Chile Conference that took place in July. I recommend that you watch them, because they demonstrate just how much we need new organisations to represent teachers. The conference plenaries show the same old faces trotting out the same old stuff, and there’s absolutely nothing here to make your heart sing, or, more mundanely, to make you think that the raft of real teachers’ concerns is being addressed. Did anybody mention the miserable pay that millions of teachers get, or zero-hours contracts without pension rights, or appalling conditions of work? Did anybody question how teacher qualifications are decided on, or how professional development is organised? Was there any mention of teachers’ feelings of worth? Did anybody question the IATEFL statutes? Well, of course not, because that’s not what the carefully chosen plenary speakers are there to do.

What we see in these videos is a show, a promotion of the stars of ELT who are supposed to enrich the lives of teachers in much the same way as going to see any other celebrity “live” is supposed to do. It’s a travesty of what a conference of working teachers should be. It’s proof, as if proof were needed, of the commercialisation of ELT.


Scott Thornbury gave two talks that he’d done before. His review of the history of ELT was a repeat of the plenary he gave just a few months previously at the IATEFL international conference, and his talk on his attempts to improve his Spanish was a version of what he’d already said years before. Like so many of the army of professional speakers who tour the world, Scott is almost expected to trot out the same old stuff time and time again. Like Elton John singing Candle in the Wind, or Tony Blair chanting I’d do it all again, the audience doesn’t even expect to hear anything new; they just want to be there while the celebrity entertains them. How long before Scott has to autograph the IATEFL programme pushed towards him by admiring fans as he leaves the stage?


Now guess who else gave a plenary in Chile. Guess who the organisers thought was worth flying 9,000 kilometres to address their teachers. Why, who else but that rightly revered, roundly respected, super scholar Jeremy Harmer! And once again Harmer demonstrated his uncanny ability to insult his audience’s intelligence without being booed off the stage. This time, Harmer chose to defend the coursebook, in a plenary titled Back between the covers: should coursebooks exist in a modern age? Please, before you do anything else, watch it by clicking on this link.

What did you make of that hour-long talk? Maybe you can use it in some teacher training programme. Get everybody comfortably seated, play the video, and use this worksheet.


  • How many times does he lose the thread?
  • As a sub-set, how many times does he confess that he can’t remember what he’s talking about?
  • How many times does he contradict himself?
  • How many times (to the nearest 100) does he not bother (sic) to finish a sentence?
  •  How many times does he not answer his own questions?
  • Is he bothered?


  • Give 5 examples of where he resorts to what he really, really sincerely believes rather than to what might pass for a reasoned argument.
  • Give 5 examples of how he misrepresents the arguments he doesn’t like.
  • Give 5 examples of where he shows an ignorance of emergentism and interlanguage research.


  • How does Harmer come across?
  • How does he treat his audience?


  • Give 1 example of something he said that you didn’t already know.
  • Summarise his argument for why coursebooks are useful.
  • Suggest what a plenary talk about the place of ELT coursebooks should discuss.

Now let me give my own view of the plenary. Harmer doesn’t inform or debate about the important issues involved; he blusters. From a discourse point of view, he looks to me like a confused, ill-prepared clown hired to appear at a two-year-old kid’s birthday party. Talk about impoverished input! Nevertheless, observe his general stage manner. It’s a display of authority: he knows he’s a powerful figure in ELT and he acts like it.

As to content, what did he say? Take away the endless pile of platitudes, ignore the sporadic Oh and by the way remarks, leave out the cascade of careless clichés and the endless homilies; in short, do away with the “noise” that always surrounds Harmer’s discourse as he stumbles around the stage like someone who can’t quite remember what he’s so urgently looking for, and what have you got? What do we get from all this pumped-up but ultimately lifeless torrent of confident, disorganised clatter and chatter? What does it all mean? What does Harmer’s defence of coursebooks amount to? Predictably, it amounts to almost nothing. He gave an absurd summary of the arguments against coursebooks and then took the audience through some exercises to show that talking about music can be fun. From this he concluded with a trite re-hash of the old chestnut that it’s not the coursebook, it’s what you do with it.

“Two plenaries do not a conference make”, you may say. Quite right, and for all I know, great things might have gone on at the conference. But the plenaries do, I suggest, say a lot about IATEFL conferences.

As an alternative to the way IATEFL organises its conferences, I recommend that you look at the way ELTjam and Oxford TEFL organised their two Innovate ELT annual conferences in Barcelona. No plenaries; no good rooms and bad rooms; no grace-and-favour crap; nobody gets paid for presenting. There’s a focus on issues that affect teachers’ lives; a genuine attempt to involve every single person who attends the conference, with no special attention to well-known names; an innovative mix of presentation formats; a marvellous range of social activities. I can honestly say that I’ve never attended any conferences with better content, and nothing, but nothing, compares to the wonderful cooperative, friendly, uplifting atmosphere that they managed to create. Of course there are ways that this great initiative can be improved, but the Innovate ELT conference shows the way forward, and it shows that there’s hope for those of us who want change.

Materials Evaluation


Here’s a vocabulary exercise I found while browsing through material that Gerry Sweeny, a one-time colleague at ESADE Idiomas, gave me.

Vocabulary in Context

The following sentences contain nonsense words. Can you make sense of them?

  1. The sentence was written on a piece of drurb.
  2. Most drurb, like snow, is osgrave.
  3. Cats are domestic ningles.
  4. Polar bears, which are osgrave ningles, live where there is cridlington.
  5. If you set fire to drurb, it firtles.
  6. If you pour narg on firtling drurb, the flames go out.
  7. If you put cridlington into hot narg, it frumes.
  8. Cridlington frumes at a bazoota over 0º C.
  9. Narg boobles at a bazoota of 100º C .
  10. We frize bazootas with a nast.

What do you think the nonsense words mean in the above sentences?

  1. drurb
  2. osgrave
  3. ningles
  4. cridlington
  5. firtles
  6. narg
  7. frumes
  8. bazoota
  9. boobles
  10. frize
  11. nast



I’m currently looking through material available to members of the Cooperativa de Serveis Linguistics de Barcelona, with the idea of getting a materials bank together which would help members to avoid using coursebooks. While there’s an abundance of ELT materials available online, it’s difficult to quickly find material that satisfies a few basic criteria, such as relevance, quality, usability and legality. Neill McMillan and I met recently, and we reckon that we need to assemble a lot of material which satisfies these criteria, or rather, well-considered criteria that we can all agree on, and then classify it according to fields such as, off the top of my head, level, topic, media, grammar point, and skill. The idea is to give members access to a database of materials where they can find written and spoken texts, with accompanying worksheets, at a certain level, on a certain topic, etc., so that they can easily confect everything from an ESP course with appropriate tasks, to lesson plans, to fillers. Maybe you’re only looking for a text; maybe you’re looking for a text plus worksheet; maybe you’re looking for a fresh approach to practising a function; maybe you need a good clear explanation of some grammar point; maybe you’re trying to get together a proposal for a 50-hour course aimed at auditors, and so on. I should add that I have a particular interest in developing a process syllabus, which I’ve discussed in a previous post and which relies on a materials bank.


So we see the challenges of this project as being to decide on the criteria for each bit of material, to decide on how the individual bits of material are organised in the database, and to indicate links among them.
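To make the classification idea a bit more concrete, here’s a minimal sketch of what a record in such a materials bank, and a simple lookup over it, might look like. The field names follow the off-the-top-of-my-head list above (level, topic, media, grammar point, skill); everything else (the class, the sample entries, the `find` function) is hypothetical illustration, not a description of any actual Cooperativa database.

```python
# Hypothetical sketch of a materials-bank record and a lookup over it.
# Field names follow the post's list; the entries are invented examples.
from dataclasses import dataclass, field

@dataclass
class Material:
    title: str
    level: str                  # e.g. "B2"
    topic: str                  # e.g. "auditing"
    media: str                  # e.g. "written text", "audio", "video"
    grammar_point: str = ""     # optional, e.g. "past perfect"
    skill: str = ""             # optional, e.g. "reading"
    worksheets: list = field(default_factory=list)  # titles of linked materials

def find(bank, **criteria):
    """Return every material whose fields match all the given criteria."""
    return [m for m in bank
            if all(getattr(m, k) == v for k, v in criteria.items())]

bank = [
    Material("Audit report extract", "B2", "auditing", "written text",
             skill="reading", worksheets=["Audit vocabulary gap-fill"]),
    Material("Vocabulary in Context", "B1", "general", "written text",
             skill="vocabulary"),
]

# e.g. gathering texts for a 50-hour course aimed at auditors:
auditing_texts = find(bank, topic="auditing", level="B2")
print([m.title for m in auditing_texts])   # ['Audit report extract']
```

In practice, of course, the criteria and field list would have to be agreed collectively before anything is catalogued, and the links between a text and its worksheets are what would let members confect courses, lesson plans and fillers quickly.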

Looking at the worksheet above, what to do? Supposing that it were well presented, and that there were no copyright issues, does it warrant inclusion? Is its openness a good thing (allowing teachers to exploit it in their own way), or does it need some lead-in and some further work? Is it useful, anyway? More generally, how do we judge its worth? If you look at most of the literature on materials evaluation, you’ll be hard put to apply the frameworks to this, because most frameworks are, either explicitly or implicitly, geared to coursebooks. Rather than indulge in a rant, I invite you to give your opinion. If you were getting a materials bank together, would you include this?

IATEFL 2016 Plenary. Scott Thornbury: The Entertainer


So, without more ado, ladies and gentlemen, please put your hands most forcefully together and give it up for the one, the only, the inimitable, the ever-so wonderful ……………… Scott Thornbury!!

And on he walks.

He looks good; he looks fit, well turned out, up for it. Rather than hide behind the lectern and read from a script, he roams the whole expanse of the colossal stage with practised ease, expertly addressing different sections of the huge auditorium, bringing everybody into the warm glow. He starts brilliantly. He puts the years of important signposts of his life on the screen:

  • 1950
  • 1975
  • 1997
  • 2004

and asks for suggestions as to what happened to him in those years.

“Uh oh! There’s ‘an element’ in here today,” he says in response to a group on the right of the hall that’s having fun calling out the wrong answers to his elicitations.

His voice is warm, fruity, well-modulated, and it comes across perfectly, helped by a good PA system and by the fact that the enormous hall is packed with people. Of the IATEFL conference talks I saw online, there was something near gender equality as far as quality of presentation is concerned, but nobody else reached Scott’s standard. John Fanselow used to be able to put him in the shade, and Michael Hoey on a good day came close, but these days Scott’s unrivalled: he’s The Entertainer.

And it’s not just the way he performs, of course – the best stand-up artist depends on his or her material, right? Scott’s plenary had some very good material, and, what’s more, the content was both coherent and cohesive. Scott led us through 50 years of ELT history, pointing out that really there’s nothing new under the sun; that we made lots of mistakes; that some “methods” look really weird today, while others that we think of as new were already there in the 60s; and so on.

Having arrived in his history of ELT at 1975, Scott highlighted the publication of the Strategies series of coursebooks, which he describes as “revolutionary”, since they were the first pedagogical materials to be based not on grammatical structures but on functions; the first to be based not on what the language is, but on what you do with it. At this point in the history, Scott came to the main part of his argument.

Two Kinds of Discourse

He suggests that two “intertwining but not interconnecting” discourses can be detected. On the one hand, there’s the “old view” that informs the various methodologies associated with grammar-based teaching. On the other, there’s the “new discourse”, which comes from a functional approach to language and a more sociolinguistic view of language learning.

In the figure below, the “old” view is on the left, and the “new” view is on the right. From the top, the categories are:

  • the nature of language
  • units of language acquisition
  • the nature of learning
  • learning path
  • goals.


Scott suggests that the “Strategies” series of coursebooks resolves the argument between these two views in favour of the view on the right. Obviously, Scott likes the “new” view, so he was excited when the Strategies series was published – he felt he was at the dawn of a new age of ELT. But, Scott goes on to say, the matter wasn’t in fact resolved: current ELT practice has reverted to reflect the old view. Today, a grammar-based syllabus is used extensively in the global ELT industry.

So, what happened? Why didn’t things change? Why did the old discourse win out? A particularly important question is: Why does the grammar-based syllabus still reign despite clear findings from SLA research? Scott pointed out that SLA research suggests that teachers can’t affect the route of L2 development in any significant way: the inbuilt syllabus triumphs. Grammatical syllabuses fly in the face of the results of SLA research.

Scott showed results from a survey he did of more than 1,000 teachers, which showed that most teachers say they use a grammar-based syllabus because students want it. In a way, they blame the students for an approach they say they’re not entirely happy with.

Despairing of finding a solution inside the ELT world, Scott thought maybe he should look at general education. But, when he took a look, he discovered that things in general education are “terrible”. Everywhere knowledge is being broken down into tiny little bits which can then be tested. He comments: “There’s something really unhealthy in mainstream education and it’s exacerbated by a discourse that’s all about McNuggets again.”

Scott then quoted Lin (2013)

“Language teaching is increasingly prepackaged and delivered as if it were a standardised, marketable product…”

“This commodifying ideology of language teaching and learning has gradually penetrated into school practices, turning teachers into ‘service providers’.”

So what’s the solution, then? Determined not to end on such a pessimistic note, Scott suggested three endings:

  1. The pragmatic route
  2. The dogmatic route
  3. The dialectic route

The Pragmatic Route says: Accept things the way they are and get on with it.

The Dogmatic (or Dogmetic!) Route says: Get rid of the coursebook, use communicative activities, and shape the language which emerges from genuine attempts at communication. Unfortunately, Scott said, this will never be really popular; at most it will be a footnote in Richards and Rodgers. A more extreme route says get rid of the teacher. This isn’t an entirely silly suggestion, but again, it’s unlikely to be widely adopted.

The dialectic route tries, as in the Hegelian model, to overcome the limitations of the thesis and its antithesis by meshing the best from both. Here Scott gave two examples:

  • Language in the Wild. Used in Scandinavia: students do classes, but they’re also sent out into the real world to do things like shopping.
  • The Hands Up Project.  Children who can’t get out of the classroom, such as children trapped in Gaza, are taught English by using technology to drive a communicative language learning approach.

The video of Nick in the UK interacting with some lovely kids in Gaza made a very uplifting ending to the talk.


I have two criticisms of Scott’s argument, one minor, one more important:

  1. The presentation of the two “intertwining but not interconnecting discourses” doesn’t do a good job of summarising differences between grammar-based ELT and a version of communicative language teaching that emphasises interaction, student-centred learning, task-based activities, locally-produced materials, and communication for meaningful purposes.
  2. Scott’s framing of and solution to the problem of the grammar-based syllabus is a cop out.

As to the first problem, Scott’s summary of the old and new, intertwined but not interconnected, discourses has its limitations. The first three categories are not well-labelled, in my opinion. Language is not either “cognitive” or “social”: the differences between grammatical and functional descriptions of language, or between cognitive and sociolinguistic approaches to SLA, are hardly well captured in this diagram.

Then, what are “units of acquisition”? How does the contrast between grammar McNuggets and communicative routines explain different conceptualisations of these “units”? What does “the nature of learning” refer to? What do “atomistic” and “holistic” mean here? And while the fourth and fifth labels are clear enough, they’re false dichotomies: grammar-based teaching was and is concerned with promoting fluency and communicative competence.

I think it would have been better to have used a framework like Breen’s (1984) to compare and contrast the syllabus types under scrutiny, asking of each one

  1. What knowledge does it focus on and prioritise?
  2. What capabilities does it focus on and prioritise?
  3. On what basis does it divide and sub-divide what is to be learned?
  4. How does it sequence what is to be learned?
  5. What is its rationale?

That way Scott could have looked at a grammar-based, or structural, syllabus, a functional syllabus like the one realised in Strategies, and a CLT syllabus as enacted in Dogme. He could then have dealt with the serious limitations of the Strategies approach, and he could have dealt properly with his own approach. Which brings me to the more important criticism.

Face The Problem

The problem ELT faces is not “How do we resolve the tensions between two different discourses?”; rather it’s the problem which Scott clearly stated and then adroitly side-stepped on his way to a typically more anodyne, less controversial, resolution. The real problem is:

How can we combat the commodifying ideology of language teaching and learning which has turned teachers into ‘service providers’ who use coursebooks to deliver language instruction as if it were a standardised, marketable product?  

And the solution, of course, is radical change.

Decentralise. Organise teaching locally. Get rid of the coursebook. Reform the big testing authorities. Reform CELTA. Etc., etc.

Why did Scott side-step all these issues? Why, having clearly endorsed the findings of SLA research which show up the futility of a grammar-based syllabus, and having shown how “really unhealthy” current ELT practice is, did Scott not argue the case for Dogme, or for Long’s version of TBLT, or for a learner-centred approach? Why did he not argue for reform of the current tests that dominate ELT, or of CELTA? Why did Scott dismiss his own approach, Dogme, as deserving no more than a footnote in Richards and Rodgers, instead of promoting it as a viable alternative to the syllabus type that he so roundly, and rightly, criticised?

Maybe, as he said, it was the end of the conference and he didn’t want to be gloomy. Or maybe it’s because he’s The Entertainer and that part of him got the better of the critical thinker and the reformer in him. If so, it’s a darn shame, however much fun it was to watch the performance.


Breen, M.P. (1984) Process syllabuses for the language classroom. In C.J.Brumfit (Ed.).  General English Syllabus Design. ELT Documents No. 118. London: Pergamon Press & The British Council. 47-60.

Lin, A. 2013. Toward paradigmatic change in TESOL methodologies: building plurilingual pedagogies from the ground up, TESOL Quarterly, 47/3.

Can we get a pineapple?


Lost and Unfounded

Leo Selivan’s and Hugh Dellar’s recent contributions to EFL Magazine give further evidence that their strident, confidently expressed ideas lack any proper theoretical foundations.

We can compare the cumulative attempts of Selivan and Dellar to articulate their versions of the lexical approach with the more successful attempts made by Richards and Long to articulate their approaches to ELT.  Richards (2006) describes what he calls “the current phase” of communicative language teaching as

a set of principles about the goals of language teaching, how learners learn a language, the kinds of classroom activities that best facilitate learning, and the roles of teachers and learners in the classroom (Richards, 2006: 2)

Note that Richards says this on page 2 of his book: he rightly starts out with the assumption that “a set of principles” is required.

Long (2015) offers his own version of task based language teaching and he goes to great lengths to explain the underpinnings of his approach. His book is, in my opinion, the best example in the literature of a well-founded, well-explained approach to ELT. It’s based on a splendidly lucid account of a cognitive-interactionist theory of instructed SLA, on careful definitions of task and needs analysis, and on 10 crystal clear methodological principles. Long’s book is to be recommended for its scholarship, its thoroughness, and, not least, for its commitment to a progressive approach to ELT.

So what do Selivan and Dellar offer?

In his “Beginners’ Guide To Teaching Lexically”, Selivan makes a number of exaggerated generalisations about English and then outlines “the main principles of the lexical approach”. These turn out to be

  1. Ban Single Words
  2. English word ≠ L1 word
  3. Explain less – explore more
  4. Pay attention to what students (think they) know.

To explain how such “principles” adequately capture the essence of the lexical approach, Selivan offers “A bit of theory” for each one. For example, Selivan says “A new theory of language, known as Lexical Priming, lends further support to the Lexical Approach. … By drawing students’ attention to collocations and common word patterns we can accelerate their priming”. Says he. But what reasons does he have for such confident assertions? Selivan fails to give his reasons, and fails to give any proper rationale for the claims he makes about language and teaching.

In his podcast, Dellar agrees that collocation is the driving force of English. He claims that the best way to conduct ELT is to concentrate on presenting and practising the lexical chunks needed for different communicative events. Teachers should get students to do things with these chunks such as “fill in gaps, discuss them, order them, say them, write them out themselves, etc.”, with the goal of getting students to memorise them. Again, Dellar doesn’t explain why we should concentrate on these chunks, or why teachers should get students to memorise them. Maybe he thinks “It stands to reason, yeah?”

At one point in his podcast Dellar says that, while those just starting to learn English will go into a shop and say “I want, um, coffee, um sandwich”,

… as your language becomes more sophisticated, more developed, you learn to kind of grammar the basic content words that you’re adding there. So you learn “Hi. Can I get a cup of coffee and a sandwich, please.” So you add the grammar to the words that drive the communication, yeah? Or you just learn that as a whole chunk. You just learn “Hi. Can I get a cup of coffee? Can I get a sandwich, please?” Or you learn “Can I get…” and you drop in a variety of different things.

This is classic “Dellarspeak”: a badly-expressed misrepresentation of someone else’s erroneous theory. Dellar doesn’t tell us how we teach learners “to grammar” content words, or when it’s better to teach “the whole chunk” – or what informs his use of nouns as verbs, for that matter. As for the “Can I get…?” example, what’s wrong with just politely naming what we want: “Good morning. A coffee and a sandwich, please.”? What is gained by teaching learners to use the redundant “Can I get…” phrase?

But enough of Dellar’s hapless attempts to express other people’s ideas, let’s cut to the chase, if you get my drift. The question I want to briefly discuss is this:

Are Selivan’s and Dellar’s claims based on coherent theories of language and language learning, or are they mere opinions?


Models of English 

Crystal (2003) says: “an essential step in the study of a language is to model it”. Here are two models:

  1. A classic grammar model of the English language attempts to capture its structure, described in terms of grammar, the lexicon and phonology (see Quirk et al., 1985, and Swan, 2001, for examples of descriptive and pedagogical grammars). This grammar model, widely used in ELT today, is rejected by Hoey.
  2. Hoey (2005) says that the best model of language structure is the word, along with its collocational and colligational properties. Collocation and “nesting” (words join with other primed words to form sequences) are linked to contexts and co-texts. So grammar is replaced by a network of chunks of words. There are no rules of grammar; there’s no English outside a description of the patterns we observe among those who use it. There is no right or wrong in language. It makes little sense to talk of something being ungrammatical (Hoey, 2005).

Selivan and Dellar uncritically accept Hoey’s radical new theory of language, but is it really better than the model suggested by grammarians?

Surely we need to describe language not just in terms of the performed but also in terms of the possible. Hoey’s argument that we should look only at attested behaviour and abandon descriptions of syntax strikes most of us as a step too far. And I think Selivan and Dellar agree, since they both routinely refer to the grammatical aspects of language. The problem is that Selivan and Dellar fail to give their own model of language; they fail to clearly indicate the limits of their adherence to Hoey’s model; they fail to say what place syntax has in their view of language. In brief, they have no coherent theory of language.

Hoey’s Lexical Priming Theory

Hoey (2005) claims that we learn languages by subconsciously noticing everything (sic) that we have ever heard or read about words, and storing it all in a massively repetitious way.

The process of subconsciously noticing is referred to as lexical priming. … Without realizing what we are doing, we all reproduce in our own speech and writing the language we have heard or read before. We use the words and phrases in the contexts in which we have heard them used, with the meanings we have subconsciously identified as belonging to them and employing the same grammar. The things we say are subconsciously influenced by what everyone has previously said to us.

This theory hinges on the construct of “subconscious noticing”, but instead of explaining it, Hoey simply asserts that language learning is the result of repeated exposure to patterns of text (the more the repetition, the better the knowledge), thus adopting a crude version of behaviourism. Actually, several ongoing quasi-behaviourist theories of SLA try to explain the SLA process (see, for example, MacWhinney, 2002; O’Grady, 2005; Ellis, 2006; Larsen-Freeman and Cameron, 2008), but Hoey pays them little heed, and neither do Selivan and Dellar, who swallow Hoey’s fishy tale hook, line and sinker, take the problematic construct of priming at face value, and happily use “L1 primings” to explain L1 transfer as if L1 primings were as real as the nose on Hoey’s face.

Hoey rejects cognitive theories of SLA which see second language learning as a process of interlanguage development, involving the successive restructuring of learners’ mental representation of the L2, because syntax plays an important role in them. He also rejects them because, contrary to his own theory, they assume that there are limitations in our ability to store and process information. In cognitive theories of SLA, a lot of research is dedicated to understanding how relatively scarce resources are used. Basically, linguistic skills are posited to slowly become automatic through participation in meaningful communication. While initial learning involves controlled processes requiring a lot of attention and time, with practice the linguistic skill requires less attention and less time, thus freeing up the controlled processes for application to new linguistic skills. To explain this process, the theory uses constructs such as comprehensible input, working and long-term memory, implicit and explicit learning, noticing, intake and output.

In contrast, Hoey’s theory concentrates almost exclusively on input, passing quickly over the rest of the issues, and simply asserts that we remember the stuff that we’ve most frequently encountered. So we must ask Selivan and Dellar: What theory of SLA informs your claims? As an example, we may note that Long (2015) explains how his particular task-based approach to ELT is based on a cognitive theory of SLA and on the results of more than 100 studies.

Hoey’s theory doesn’t explain how L2 learners process and retrieve their knowledge of L2 words, or how paying attention to lexical chunks or “L1 primings” affects the SLA process. So what makes Selivan and Dellar think that getting students to consciously notice both lexical chunks and “L1 primings” will speed up primings in the L2? Priming, after all, is a subconscious affair. And what makes Dellar think that memorising lexical chunks is a good way to learn a second language? Common sense? A surface reading of cherry-picked bits of contradictory theories of SLA? Personal experience? Anecdotal evidence? What? There’s no proper theoretical base for any of Dellar’s claims; there’s scarce evidence to support them; and there’s a powerful theory supported by lots of evidence which suggests that they’re mistaken.


 All Chunks and no Pineapple 

Skehan (1998) says:

Phrasebook-type learning without the acquisition of syntax is ultimately impoverished: all chunks but no pineapple. It makes sense, then, for learners to keep their options open and to move between the two systems and not to develop one at the expense of the other. The need is to create a balance between rule-based performance and memory-based performance, in such a way that the latter does not predominate over the former and cause fossilization.

If Selivan and Dellar agree that there’s a need for a balance between rule-based performance and memory-based performance, then they have to accept that Hoey is wrong, and confront the contradictions that plague their present position on the lexical approach, especially their reliance on Hoey’s description of language and on the construct of priming. Until Selivan and Dellar sort themselves out, until they tackle basic questions about a model of English and a theory of second language learning so as to offer some principled foundation for their lexical approach, it amounts to little more than an opinion; more precisely, the unappetising opinion that ELT should give priority to helping learners memorise pre-selected lists of lexical chunks.


Crystal, D. (2003) The English Language. Cambridge: Cambridge University Press.

Ellis, N. C. (2006) Language acquisition and rational contingency learning. Applied Linguistics, 27 (1), 1-24.

Hoey, M. (2005) Lexical Priming: A New Theory of Words and Language. Psychology Press.

Krashen, S. (1985) The Input Hypothesis: Issues and Implications. Longman.

Larsen-Freeman, D. and Cameron, L. (2008) Complex Systems and Applied Linguistics. Oxford: Oxford University Press.

Lewis, M. (1993) The Lexical Approach. Language Teaching Publications.

Lewis, M. (1996) Implications of a lexical view of language. In Willis, J. and Willis, D. (eds.) Challenge and Change in Language Teaching, pp. 4-9. Heinemann.

Lewis, M. (1997) Implementing the Lexical Approach. Language Teaching Publications.

Long, M. (2015) Second Language Acquisition and Task-Based Language Teaching. Wiley.

MacWhinney, B. (2002) The Competition Model: the Input, the Context, and the Brain. Carnegie Mellon University.

O’Grady, W. (2005) How Children Learn Language. Cambridge: Cambridge University Press.

Quirk, R., Greenbaum, S., Leech, G. and Svartvik, J. (1985) A Comprehensive Grammar of the English Language. London: Longman.

Richards, J. (2006) Communicative Language Teaching Today. Cambridge University Press.

Skehan, P. (1998) A Cognitive Approach to Language Learning. Oxford: Oxford University Press.

Swan, M. (2001) Practical English Usage. Oxford: Oxford University Press.

Do It Like Dellar, Or Use Your Brain?


In his talk Teaching Grammar Lexically, Dellar tells us about the life-changing effects that reading Michael Lewis’ The Lexical Approach had on him. What it did was to jolt him out of his comfortable life of grammar-based PPP teaching, and make him realise that language was not lexicalised grammar, but rather grammaticalised lexis. This “profound shift in perspective” took its toll; Dellar confesses that struggling with the challenging implications of Lewis’ text threw his teaching into a state of chaos for two years, and that he and his co-author Andrew Walkley have spent more than twenty years “unpicking” its “dense” content. About ten years later, Dellar had sufficiently recovered from his intellectual odyssey to read another book, Hoey’s Lexical Priming, and this led to an even deeper understanding of, and commitment to, the lexical approach. The extent of Hoey’s influence on Dellar can be appreciated by noting that, after 2005, Dellar’s stock of constructs doubled – from one to two, so that it now consists of “lexical chunks” and “priming”. Undaunted by widespread criticism of Lewis’ and Hoey’s arguments (neither offers a developed theory of SLA or a principled methodology for ELT), and unencumbered by complicated theorising or thinking critically, Dellar sees ELT with uncluttered clarity. Alas, he also sees the need to share his vision with others, and so he travels around the world exhorting teachers everywhere to profoundly shift their perspective, throw off the “tyranny of PPP” and embrace the promise of properly primed lexical chunks.

In contrast to this simplistic, evangelical proselytising, real educators take the view that the primary goal of education is to encourage people to question everything, to think critically for themselves. This view emphasises the dance of ideas, the delight in thinking about things in such a way that one’s intellect is engaged and one’s appreciation of the complexity of things is improved, while the accumulation of information is down-played. In primary and secondary schools, the good teachers are those who encourage students to question conventional wisdom, and at university the same ethos of critical thinking informs the best academic staff; they’re less concerned with facts than with what their students make of them. My concern is that in the world of ELT training, this type of approach is not much in evidence.

What is critical thinking?

The ability to think critically involves three things:

  1. An attitude: don’t believe what you’re told.
  2. Knowledge of the methods of logical inquiry and reasoning.
  3. Skill in applying the methods referred to in 2 above.

Critical thinking demands a persistent effort to examine anything you’re told in the light of the evidence that supports it and the logic of its conclusions. It demands that you’re not impressed by who said it, that you remain open-minded, and, above all, that you think rationally for yourself. It refers to your ability to interpret data, to appraise evidence and evaluate arguments. It refers, that is, to your ability to critically examine the so-called facts, to assess the existence (or non-existence) of logical relationships between propositions and to assess whether conclusions are warranted.

Critical thinking is needed when you do your own work and when you assess the work of others. When you do your own work in an MA paper, critical thinking demands that you

  • articulate the problems addressed
  • look for means for solving those problems
  • gather and marshal pertinent information
  • evaluate the information gathered.

When you evaluate your own work and the work of others you must identify defects such as

  • poor articulation of the problem
  • appeals to authority
  • unstated assumptions and values
  • partial data
  • unwarranted conclusions

Thinking critically is, in my opinion, most of all an attitude.  It’s the attitude of  a sceptic, of one who can sniff a rat. In many areas of life you’ll be well-advised to deliberately ignore the scent, but when it comes to matters academic, sniffing a rat, sensing that there’s something wrong in the argumentation and evidence given in a text, is a skill that needs nurturing and honing. Once you adopt the right attitude, then you have to improve your ability to not just feel there’s something wrong, but to identify exactly what that something is.


Back to the Baloney: Dellar’s Lexical Approach  

From the evidence available on his websites, blogs, and recorded interviews, webinars and presentations, Dellar’s approach to ELT training is severely at odds with the critical approach I’ve sketched above. Despite his confusion about both theories of language (UG = Structuralism??) and theories of SLA, Dellar presumes to tell teachers what’s wrong and what’s right. PPP grammar teaching is wrong, teaching lexical chunks is right. Rather than attempt to evaluate different approaches to ELT and tentatively recommend this or that alternative, Dellar banishes doubt and gives the strong impression that he’s cracked it: he’s worked out the definitive blueprint of ELT, and all he has to do is to overcome the entrenched resistance of those still chained to Headway, or English File, or whatever coursebook it is that keeps them languishing under the oppression of PPP grammar teaching.

Any initial excitement you might feel about someone proposing a move away from coursebook-led teaching soon evaporates when you realise that Dellar is against grammar-based coursebooks, but not coursebooks per se; indeed the definitive blueprint of ELT turns out to be nothing other than his own coursebook series Outcomes! Delivery from the tyranny of grammar teaching and emergence into the brave new world of the lexical approach is a simple affair: you just throw away Headway and pick up Outcomes, and then lead your students, unit by unit, through its mind-numbingly boring pages, just as with any other coursebook. The only difference between the tyrannical past and the liberated future is that grammar boxes are replaced with long lists of leaden lexical chunks, repeated exposure to which is somehow supposed to lead to communicative competence.

If we examine Dellar’s published work – his blogs, his video presentations, his webinars, his conference presentations, and his coursebooks – we find very little evidence of critical thinking about language, language learning or language teaching.

  • Does his work invite teachers to critically consider different views of language? Does it consider the arguments for a generative grammar as argued by Chomsky, versus the arguments for a structuralist approach as argued by Bloomfield, or a functional approach, as argued by Halliday, or a functional-notional approach as argued by Wilkins, or a lexical approach as argued by Pawley, or Nattinger, or Biber? Or does it tell them that English is best seen as lexically-driven, and that’s that?
  • Does his work invite teachers to consider the pros and cons of different accounts of SLA, of different weights given to input and output, of explicit and implicit learning, of different accounts of interlanguage development? Or does it tell them that priming is the key to SLA?  Does Dellar’s oeuvre encourage teachers to critically assess the construct of priming? Or is priming taken as a given, and is it simply asserted that lexical chunks are the secret of language learning?


ELT training is too often characterised by an “I know best” assumption, and by its general rejection of a critical approach to education. Instead of approaching teacher training sessions with prepared answers already in hand, teacher trainers should adopt a critical thinking approach to their job. They should ask teachers open questions, toss some provisional answers out for discussion, invite teachers to critically evaluate them, and work with teachers to help them come up with their own tentative solutions. I need hardly add that these tentative solutions should then be critically discussed.

Criticising Harmer


My criticisms of Jeremy Harmer’s latest published work have caused some dismay, which was only to be expected. Equally predictable was that so few of those who objected to what I said, or how I said it, voiced their concerns; silence, as usual, was the preferred response. To those who did speak up, either in emails to me, or in other forums, here’s my reply.

First, a summary of my criticisms:

  • Harmer’s latest edition of The Practice of English Language Teaching is badly written, badly informed, and displays a lack of critical acumen.
  • Harmer’s pronouncements on testing in 2015 were appalling.
  • Harmer is an obstacle to progress in ELT.

Well, that’s my view, and I’ve given some evidence to support it in various posts. Further evidence can be got by simply reading his book and watching his presentations. I’ll be glad to talk to Harmer face to face in any forum that he or anyone else wants to organise. Anytime, anywhere.

I take criticism here to be the act of analysing and evaluating the quality of a given text. This involves deconstructing it. I use “deconstruct” as Gill in the quote that heads this blog uses it (not in the special sense that Derrida uses it), to refer to a process that’s been used down through the ages: to deconstruct a text is to critically take it apart. What we examine is the coherence and cohesion of the text, its expression and its content.

At the most superficial (I mean “surface”, not unimportant) level, the quality of a text can be judged by its coherence and cohesion. Coherence refers to clarity, while cohesion refers to organisation and flow. Harmer’s texts lack both. Pick up Harmer’s magnum opus, the truly appalling Practice of English Language Teaching, start reading, and ask yourself: Is this clear? Is this well-expressed? Does the text flow?

  • How many sentences are ungrammatical?
  • How often could things have been more succinctly expressed?
  • How often do you struggle to get to the end of a sentence?
  • How often does the text meander?
  • How often do you feel that the writing is tedious?
  • How often are you referred elsewhere?

The coherence of the text is severely weakened by its author’s inability to stick to the point and to express himself clearly: so often a simple point is dragged out for pages. As for cohesion, the text looks well-organised, but it fails to properly sequence its arguments. It’s full of references to other places in the text where what’s being dealt with is dealt with differently, so you never quite get a handle on anything. And, crucially for cohesion, there’s no over-arching argument running through the text: it’s a motley collection of bits and pieces.

At a deeper level of criticism, we should ask questions about content.

  • Does the text show a good command of things discussed?
  • Does it present an up-to-date summary of ELT?
  • Does it give a fair and accurate description of current views of the English language, of L2 language learning, of teaching, and of assessment?
  • Is there the slightest hint of originality?
  • Does it give a good critical evaluation of matters discussed?
  • Is it enjoyable to read?

A critical view of the text demands that we don’t take anything for granted. No assertions should be taken at face value; we should carefully scrutinise any opinions, and we should give some attention to the kind of critical discourse analysis (CDA) proposed by Fairclough and others where political issues are weighed. If we critically examine Harmer’s Practice of English Language Teaching in this way, I suggest that we’ll conclude that the answer to the six questions above is a resounding “No!”, and that a CDA of the book reveals a deeply conservative commitment to the status quo.

Thoughts of The Master


Those who are about to embark on the discourse analysis bit of their MA course might like to examine the quotes below. They’re all quotes from the published work of Jeremy Harmer. As you know, the purpose of discourse analysis is to examine texts “beyond the sentence boundary”. Various frameworks can be used, but I recommend a literary approach here, where you concentrate on the verbosity, bathos, and general pumped-up, faux academic prose of the writer, blissfully unaware of his limitless limitations. Note the cascade of clichés, the resort to tired truisms, the bumbling use of brackets, and the general tedium of the text, not alleviated by random bits of bullshit. The final example in the list below refers you to a video recording on Harmer’s blog where you’ll find the master examining the finer points of testing in his own unique manner.

So take a look below. As they say in McDonald’s when they bring you your tasteless, lack-lustre, nutrition-free meal: Enjoy!

The constant interplay of applied linguistic theory and observed classroom practice attempts to draw us ever closer to a real understanding of exactly how languages are learnt and acquired, so that the work of writers such as Ellis (1994) and Thornbury (1999)—to mix levels of theory and practice—are written to influence the methodology we bring to language learning. We ignore their challenges and suggestions at our peril, even if due consideration leads us to reject some of what they tell us.

Teaching may be a visceral art, but unless it is informed by ideas it is considerably less than it might be.

Without beliefs and enthusiasms, teachers become client-satisfiers only—and that is a model which comes out of a different tradition from that of education, and one that we follow at our peril.

A problem with the idea that methodology should be put back into second place (at the very most) is that it threatens to damage an essential element of a teacher’s make-up—namely what they believe in, and what they think they are doing as teachers.

A belief in the essentially humanistic and communicative nature of  language  may well  pre-dispose certain teachers towards a belief in group participation and learner input rather than relying only on the straightforward transmission of knowledge from instructor to passive instructee.

One school of thought which is widely accepted by many language teachers is that the development of our conceptual understanding and cognitive skills is a main objective of all education. Indeed, this is more important than the acquisition of factual information (Williams and Burden 1997:165).

Any teacher with experience knows that it is one thing to put educational temptation in a child’s way (or an adult’s); quite another for that student to actually be tempted.

There is nothing wrong (and everything right) with discovery-based experiential learning. It just doesn’t work some of the time.

What precisely is the role of a cloud granny and how can she (or perhaps he) make the whole experience more productive?

Yet without our accumulated knowledge and memories what are we? Our knowledge is, on the contrary, the seat of our intuition and our creativity. Furthermore, the gathering of that knowledge from our peers and, crucially, our elders and more experienced mentors is part of the process of socialization. Humanity has thought this to be self-evident for at least 2000 years.

On testing

As you watch the master deliver his polished address:

  • Note the setting: the well-appointed sitting room, the unused, high quality microphone, the classical music in the background.
  • Note the speaker: the pose, the homely mug of tea, the air of quiet confidence, the carefully-practiced delivery.
  • Note, too, the complete lack of content in what he says, the utter disregard for any serious engagement with an important issue, the assumption that this indulgent, look-at-me-farting-around-saying-absolutely-nothing display will be well met.
  • Such, you might think, is the arrogance of power.


Harmer: The Practice of English Language Teaching 5th Edition


The new edition of Harmer’s Practice of English Language Teaching is over 500 pages and includes chapters on:

  • English as a world language
  • Theories of language and language learning
  • Learner characteristics which influence teacher decisions
  • Guidance on managing learning
  • Teaching language systems (grammar, vocabulary and pronunciation)
  • Teaching language skills (speaking, writing, listening and reading)
  • Practical teaching ideas
  • The role of technology (old and new) in the classroom
  • Assessment for language learning in the digital age

If you’re doing a course in ELT, then reading the new edition of Harmer’s massive tome might well have the salutary effect of making you re-consider your career choice. Nobody could blame you if, having read this mind-numbingly tedious book, you decided to quit ELT and apply for a job in the Damascus Tourist Agency. In the unlikely event that you reach the end of its 550 pages, you’ll probably have lost the will to live, let alone teach. Each page is weighed down by badly crafted, appallingly dull writing; each chapter says nothing new or succinct about its subject; each section says nothing that you can’t find much better treatments of in other, well-focused books.

The section on English as a world language is absurdly long, badly considered and leans heavily on Crystal, who does a much better job of it, far more concisely and completely, in his book The English Language. The section on theories of language learning is disgraceful; not one of the theories mentioned is properly stated or discussed. I really can’t bring myself to go through the rest of the book; it’s consistently badly informed, badly considered, wordy and unhelpful.

It’s the style that offends me most in this horrendously long doorstop of a book; despite the efforts of all his editors, the suffocating effect of Harmer’s faux academic, charmlessly chummy, verbose and ineffectual prose is to turn everything to sludge. The reader wades endlessly through the sludge, unaided even by decent signposts, towards another badly defined horizon, there to meet more of the same: another, different hill to climb.

Even if you can get over the soporific effects of Harmer’s writing, the content is not likely to satisfy you, whatever TESOL qualification you’re aiming at. The audacious sweep of the book is almost ironic: here’s a book where everything is mentioned and nothing is adequately dealt with. Magpies skillfully take what they need from other nests; Harmer haplessly crashes into the work of scholars, conveying almost nothing of their contribution. Anything, but anything, mentioned here needs further reading. Needless to say, the bibliography is hopeless.

Just to round it off, the seemingly endless trudge through Harmer’s wasteland gets the reader precisely nowhere.  No final vision awaits; all you get at the end of this pathetic pilgrimage are poorly considered, unoriginal platitudes.

This dreadful book serves as a mirror for everybody involved in ELT. How can we in the ELT world be taken seriously by other areas of education when such a book is recommended reading in so many teacher training courses, and even in post-graduate courses?

Harmer, J. (2015) The Practice of English Language Teaching. 5th Edition. London: Pearson.

A Final Tilt at the Windmill of Thornbury’s A to Z


They keep coming, like burps after a poorly-digested Christmas lunch: comments on Thornbury’s A to Z blog. I’ve read three in the last few days, so let me add my own final swipe at the edifice before 2015 concludes.

Thornbury’s Sunday posts on his A to Z blog only lasted a few months, but during that short season they became part of my Sunday morning routine: late breakfast, read Thornbury, join in the discussions that always followed. The final Sunday post was The Poverty of the Stimulus, and as usual it had enough good stuff in it to spark off an interesting discussion. On this particular Sunday I made a few contributions and the exchange went something like this:

Initial statement from Scott (I use his first name to emphasise the cosy Sunday morning feel of the discussion, and also as a way of reminding myself to be nice.)

  1. The quantity and quality of language input that children get is so great as to question Chomsky’s poverty of the stimulus argument.
  2. An alternative to Chomsky’s view of language and language learning, is that “language is acquired, stored and used as meaningful constructions (or ‘syntax-semantics mappings’).”
  3. Everett (he of “There is no such thing as UG”) is right to point out that, since no one has proved that the poverty of the stimulus argument is correct, “talk of a universal grammar or language instinct is no more than speculation”.


My first reply is short:

“Everett’s claim is nonsense since it’s logically impossible to prove that a theory is true.”

Scott ignores this comment and prefers to pay attention to a certain Svetlana (I imagine her sitting in a wifi-equipped tent, huddled over an Apple app projecting a 3-D crystal ball) who tells him that he’s right to question the POS claim because tiny babies, only recently emerged (sic) from the womb, form huge numbers, like, well millions, of neural connections per second and what’s more, they rapidly develop dendritic spines containing “lifelong memories”. A few unsupported pseudo-scientific, quasi-philosophical assertions which sound as if they’ve been picked up from a hazy weekend seminar at the Sorbonne are thrown in for good measure.

Imagine my surprise when Scott thanks the mystic Svetlana for bringing “new evidence to bear”, and says that this evidence serves to confirm his “initial hunch.”

“WHAT??” I typed furiously. “Are you really going to be hoodwinked by such postmodernist, obscurantist mumbo jumbo?” (There’s not much known for sure about the role dendritic spines play in learning and memory; I suspect she thinks that mentioning them here is evidence of deep knowledge of the scientific study of the nervous system; and suggesting that they disprove the POS argument is fanciful nonsense.)

“Give us an example of a lifelong memory stored in a dendritic spine that’s relevant to this discussion then!” I shout uselessly at the monitor.

Well, Scott’s not just hoodwinked, he actually becomes emboldened. Spurred on by the compelling “new evidence”, he’s now ready to dismiss the POS argument completely.

“Actually”, he says, the stimulus is quite enough to explain everything children know about language. Corpus studies “suggest that everything a child needs is in place”.

Asked how these corpus studies explain what children know about language, Scott (apparently still intoxicated by Svetlana’s absurd revelations) says “the child’s brain is mightily disposed to mine the input”, adding, as if this were the clincher, “a little stimulus goes a long way, especially when the child is so feverishly in need of both communicating and becoming socialized.”

“Cripes! His brain’s gone soft!” I thought. “He’s barking mad!”

“Platitudes and unsupported assertions have now completely replaced any attempt at reasoned argument”, I wrote.

“Anyone who claims that children’s knowledge about an aspect of syntax could not have been acquired from language input has to prove that it couldn’t. Otherwise it remains another empirically-empty assertion” says Scott.

Dear oh dear, here we are back at the start. As with the Everett quote, for purely formal reasons, it’s not possible to prove such a thing, and to demand such “proof” demonstrates an ignorance of logic and of how rational argument, science, and theory construction work. Failing to meet the impossible demand of proof doesn’t make the POS argument an empirically-empty assertion.

Then Russ Mayne joins in to have his typically badly-informed little say. Chomsky, he tells us, is “utterly scornful of data.”

“No he’s not”, says I, “Chomsky’s theory of UG has a long and thorough history of empirical research.”

And blow me down if Thornbury doesn’t chime in:

“‘Chomsky’s theory of UG has a long and thorough history of empirical research’. What!!? Where? When? Who?”

So now he’s not just showing a predilection for explanations involving the lifelong memories stored in dendritic spines, he’s showing even worse signs of ignorance.


That the discussion of the POS argument didn’t get satisfactorily resolved is hardly surprising, but I was more than a bit surprised to hear Scott telling us that language learning can be satisfactorily explained by the general learning processes going on inside feverish young brains that are “mightily disposed to mine the input”. (Just in passing, all these references to the child’s brain seem to contradict the part of the current Thornbury canon which deals with “the language body”.) Asked to say a bit more about how language learning can be done through general learning processes and input alone, Thornbury says

“If we generalize the findings beyond the single word level to constructions…” and then “… generalize from constructions to grammar…”,  “hey presto, the grammar emerges on the back of the frequent constructions.”

Hey presto? What grammar? What “findings beyond the single word level”? How do you generalise these findings to “constructions”? And how do you generalise from constructions to “grammar”?

This unwarranted dismissal of the POS argument, coupled with its incoherent account of language learning, is, you might think, excusable in a Sunday morning chat, but we find more evidence of both the ignorance and the incoherence displayed here in more carefully-prepared public pronouncements on the same subjects. Thornbury’s very poor attempts to challenge Chomsky and psychological approaches to SLA by offering a particularly lame and simplistic version of emergentism, mostly based on Larsen-Freeman’s recent work, have already been commented on in this blog (see, for example, Thornbury and the Learning Body and Emergentism 2), but let me say just a bit more.

Thornbury and Emergentism

Thornbury keeps telling people about Larsen-Freeman’s latest project. The best criticism I’ve read of it is the 2010 article by Kevin Gregg in SLR entitled “Shallow draughts: Larsen-Freeman and Cameron on complexity.” There’s no way I can do justice to the article by quickly summarising it, and I urge readers of this post to read Gregg’s article for themselves. As always with Gregg, the argument is not just devastating, but delightfully written. Gregg dismantles the pretences of the Larsen-Freeman and Cameron book and shows that all their appeals to complexity theory are so much hogwash; nothing of substance sustains the fanciful opinions of the authors. And likewise, Thornbury.

Thornbury has said nothing to persuade any intelligent reader that his version of emergentism provides a good explanation of SLA. Just a few points:

  • Emergentism rests on empiricism, and empiricism pure and simple is a bankrupt epistemology.
  • Emergentism doesn’t get the support Thornbury claims it gets from the study of corpora – how could it? Thornbury’s claims show an ignorance of both theory construction and scientific method.
  • As Gregg (2010) points out, the claim that language is a complex dynamical system makes no sense. “Simply put, there is no such entity as language such that it could be a system, dynamical or otherwise… Terms like ‘language’ and ‘English’ are abstractions; abstract terms, like metaphors, are essential for normal communication and expression of ideas, but that does not mean they refer to actual entities. English speakers exist, and (I think) English grammars come to exist in the minds/brains of those speakers, so it remains within the realm of possibility that a set of speakers is a dynamical system, or that the acquisition process is; but not language, and not a language.”
  • Thornbury’s assertion that language learning can be explained as the detection and memorisation of “frequently-occurring sequences in the sensory data we are exposed to” is an opinion masquerading as an explanatory theory. How can general conceptual representations acting on stimuli from the environment explain the representational system of language that children demonstrate?  Thornbury’s  suggestion that we have an innate capacity to “unpack the regularities within lexical chunks, and to use these patterns as templates for the later development of a more systematic grammar” begs more questions than it answers and, anyway, contradicts the empiricist epistemology adopted by most emergentists who say that there aren’t, indeed can’t be, any such things as innate capacities.

NOTE: I’ve added two appendices to deal with the two questions asked by Patrick Amon.

Appendix 1: Why can’t you prove that a general causal theory is true?

The problem of induction

Hume (1748) started from the premise that only “experience” (by which Hume meant that which we perceive through our senses) can help us to judge the truth or falsity of factual sentences. Thus, if we want to understand something, we must observe the relevant quantitative, measurable data in a dispassionate way. But if knowledge rests entirely on observation, then there is no basis for our belief in natural laws, because such belief rests on an unwarranted inductive inference. We cannot logically go from the particular to the general: no amount of cumulative instances can justify a generalisation; ergo no general law or generalised causal explanation can be known to be true. No matter how many times the sun rises in the East, or thunder follows lightning, or swans appear white, we will never know that the sun rises in the East, or that thunder follows lightning, or that all swans are white. This is the famous “logical problem of induction”. Why, nevertheless, do all reasonable people expect and believe that instances of which they have no experience will conform to those of which they have experience? Hume’s answer is: “Because of custom or habit” (Popper, 1979: 4). More devastating still was Hume’s answer to Descartes’ original question “How can I know whether my perceptions of the world accurately reflect reality?” Hume’s answer was “You can’t.”

It is a question of fact whether the perceptions of the senses be produced by external objects resembling them: how shall this question be determined? By experience surely; as all questions of a like nature.  But here experience is, and must be, entirely silent.  The mind has never anything present to it but the perceptions, and cannot possibly reach any experience of their connection with objects.  The supposition of such a connection is, therefore, without any foundation in reasoning. (Hume, 1988 [1748]: 253)

Thus, said Hume, Descartes was right to doubt his experiences, but, alas, experiences are all we have.

The asymmetry between truth and falsehood

Popper (1972) offers a way out of Hume’s dilemma. He concedes that Hume is right: there is no logical way of going from the particular to the general, and that is that: however probable a theory might be, it can never be shown to be true.

Popper (1959, 1963, 1972) argued that the root of the problem of induction was the concern with certainty. In Popper’s opinion Descartes’ quest was misguided and had led to three hundred years of skewed debate.  Popper claimed that the debate between the rationalists and the empiricists, with the idealists pitching in on either side, had led everybody on a wild goose chase – the elusive wild goose being “Truth”.  From an interest in the status of human knowledge, philosophers and philosophers of science had asked which, if any, of our beliefs can be justified.  The quest was for certainty, to vanquish doubt, and to impose reason.  Popper suggested that rather than look for certainty, we should look for answers to problems, answers that stand up to rational scrutiny and empirical tests.

Popper insists that in scientific investigation we start with problems, not with empirical observations, and that we then leap to a solution of the problem we have identified – in any way we like. This second anarchic stage is crucial to an understanding of Popper’s epistemology: when we are at the stage of coming up with explanations, with theories or hypotheses, then, in a very real sense, anything goes.  Inspiration can come from lowering yourself into a bath of water, being hit on the head by an apple, or by imbibing narcotics.  It is at the next stage, the stage of the theory-building process, that empirical observation comes in, and, according to Popper, its role is not to provide data that confirm the theory, but rather to find data that test it.

Empirical observations should be carried out in attempts to falsify the theory: we should search high and low for a non-white swan, for an example of the sun rising in the West, etc. The implication is that, at this crucial stage in theory construction, the theory has to be formulated in such a way as to allow for empirical tests to be carried out: there must be, at least in principle, some empirical observation that could clash with the explanations and predictions that the theory offers.  If the theory survives repeated attempts to falsify it, then we can hold on to it tentatively, but we will never know for certain that it is true.  The bolder the theory (i.e. the more it exposes itself to testing, the more wide-ranging its consequences, the riskier it is) the better.  If the theory does not stand up to the tests, if it is falsified, then we need to re-define the problem, come up with an improved solution, a better theory, and then test it again to see if it stands up to empirical tests more successfully.  These successive cycles are an indication of the growth of knowledge.

Popper (1974: 105-106) gives the following diagram to explain his view:

P1 -> TT -> EE -> P2

P = problem
TT = tentative theory
EE = error elimination (empirical experiments to test the theory)

We begin with a problem (P1), which we should articulate as well as possible. We then propose a tentative theory (TT) that tries to explain the problem. We can arrive at this theory in any way we choose, but we must formulate it in such a way that it leaves itself open to empirical tests. The empirical tests and experiments (EE) that we devise for the theory have the aim of trying to falsify it. These experiments usually generate further problems (P2) because they contradict other experimental findings, or they clash with the theory’s predictions, or they cause us to widen our questions. The new problems give rise to a new tentative theory and the need for more empirical testing.

Popper thus gives empirical experiments and observation a completely different role: their job now is to test a theory, not to prove it, and since this is a deductive approach it escapes the problem of induction. Popper takes advantage of the asymmetry between verification and falsification: while no number of empirical observations can ever prove a theory is true, just one such observation can prove that it is false.  All you need is to find one black swan and the theory “All swans are white” is disproved.
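The logic Popper exploits here is easy to demonstrate. The following sketch (a toy illustration only; the predicate and the swan data are of course invented) shows why testing a universal claim can refute it with a single counterexample but can never verify it:

```python
def attempt_falsification(predicate, observations):
    """Try to falsify the universal claim 'for all x, predicate(x)'.

    Returns the first counterexample found, or None if the claim
    survives. Surviving means the claim is corroborated, not proven:
    the next observation could still refute it.
    """
    for x in observations:
        if not predicate(x):
            return x  # one counterexample is enough to refute the claim
    return None


# The universal claim "All swans are white"
all_swans_are_white = lambda swan: swan == "white"

# A thousand white swans corroborate the claim but do not prove it...
print(attempt_falsification(all_swans_are_white, ["white"] * 1000))    # None
# ...while a single black swan refutes it outright.
print(attempt_falsification(all_swans_are_white, ["white", "black"]))  # black
```

However many confirming observations pass through the loop, the return value `None` never hardens into proof; one failing case settles the matter in the other direction.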

Appendix 2: Empiricism and epistemology

Moving to Patrick’s second question, I meant to say that “pure” or “extreme” forms of empiricism are now generally rejected. Those who adopt a relativist epistemology (e.g. most post-modernists) and those who are ignorant of the philosophy of science (e.g. Thornbury) wrongly label their opponents (rationalists who base their arguments on logic and empirical observation) as “positivists”. In fact, nobody in the scientific community is a positivist these days. The last wave of positivists belonged to the famous Vienna Circle, whose members aimed to continue the work of their predecessors (most importantly Comte and Mach) by giving empiricism a more rigorous formulation through the use of recent developments in mathematics and logic. The Vienna Circle, which comprised Schlick, Carnap, Gödel, and others, and had Russell, Whitehead and Wittgenstein as interested parties (see Hacking, 1983: 42-44), developed a programme labelled Logical Positivism, which consisted first of cleaning up language so as to get rid of paradoxes, and then of limiting science to strictly empirical statements: in the grand tradition of positivism, they pledged to get rid of all speculation on “pseudo-problems” and concentrate exclusively on empirical data. Ideas were to be seen as “designations”: terms or concepts formulated in words that needed to be carefully defined in order to be meaningful. The logical positivists are particularly well known for their attempt to answer Hume’s criticism of induction through probability theory, which, crudely, proposed that while a finite number of confirming instances of a theory could not prove it, the more numerous the confirming instances, the more probable it was that the theory was true. This, like just about all of their work, ended in failure.

Empiricism in Linguistics: Behaviourism  

But empiricism lived on, and in linguistics the division between “empiricist” and “rationalist” camps is noteworthy. The empiricists, who held sway, at least in the USA, until the 1950s, and whose most influential member was Bloomfield, saw their job as field work: equipped with tape recorders and notebooks, researchers recorded thousands of hours of actual speech in a variety of situations and collected samples of written text. The data was then analysed in order to identify the linguistic patterns of a particular speech community. The emphasis was very much on description and classification, and on highlighting the differences between languages. We might call this the botanical approach; its essentially descriptive, static, “naming of parts” methodology depended for its theoretical underpinnings on the “explanation” of how we acquire language provided by the behaviourists.

Behaviourism was first developed in the early twentieth century by the American psychologist John B. Watson, who, influenced by the work of Pavlov and Bekhterev on the conditioning of animals, attempted to make psychological research “scientific” by using only objective procedures, such as laboratory experiments designed to establish statistically significant results. Watson formulated a stimulus-response theory of psychology according to which all complex forms of behaviour are explained in terms of simple muscular and glandular elements that can be observed and measured. No mental “reasoning”, no speculation about the workings of any “mind”, was allowed. Thousands of researchers adopted this methodology, and from the end of the First World War until the 1950s an enormous amount of research on learning in animals and in humans was conducted under this strict empiricist regime. By 1950 behaviourism could justly claim to have achieved paradigm status, and at that moment B.F. Skinner became its new champion. Skinner’s contribution to behaviourism was to challenge the stimulus-response idea at the heart of Watson’s work and replace it with a type of psychological conditioning known as reinforcement. Important as this modification was, what matters here is Skinner’s insistence on a strict empiricist epistemology, and his claim that language is learned in just the same way as any other complex skill: by social interaction.

In sharp contrast to the behaviourists and their rejection of “mentalistic” formulations is the rationalist approach to linguistics championed by Chomsky. Chomsky (in 1959 and subsequently) argued that it is the similarities among languages, what they have in common, that is important, not their differences. In order to study these similarities we must allow the existence of unobservable mental structures and propose a theory of the acquisition of a certain type of knowledge.

Well, you know the story: Chomsky’s theory was widely adopted and became the new paradigm. Currently, badly-informed people like Larsen-Freeman and Thornbury (as opposed to serious scholars like O’Grady, MacWhinney and others) are claiming that no appeals to innate, unobservable mental processes or to modules of mind are necessary to explain language learning. What they don’t appreciate is that, unless, like William O’Grady or Brian MacWhinney, they deal properly with epistemological questions about the status of psychological processes, mental states, mind versus brain, and so on, they are either trying to have their cake and eat it or adopting an untenable empiricist epistemology.



Gregg, K.R. (2010) Shallow draughts: Larsen-Freeman and Cameron on complexity. Second Language Research, 26(4), 549 – 560.

Hacking, I. (1983) Representing and Intervening. Cambridge: Cambridge University Press.

Hume, D. (1988) [1748] An Enquiry Concerning Human Understanding. Amherst, NY: Prometheus.

Popper, K. R. (1959) The Logic of Scientific Discovery. London: Hutchinson.

Popper, K. R. (1963) Conjectures and Refutations. London: Hutchinson.

Popper, K. R. (1972) Objective Knowledge. Oxford: Oxford University Press.

Popper, K. (1974) Replies to critics. In P. A. Schilpp (ed.), The Philosophy of Karl Popper. La Salle, IL: Open Court.

Thornbury, S. (2013) ‘The learning body,’ in Arnold, J. & Murphey, T. (eds.) Meaningful Action: Earl Stevick’s influence on language teaching. Cambridge: Cambridge University Press.

Thornbury, S. (2012?) Language as an emergent system. British Council, Portugal: In English. Available here:



Scheffler on The Lexical Approach

In January 2015, ELTJ published a commentary with the title Lexical priming and explicit grammar in foreign language instruction, which provoked a reply and counter-reply in subsequent issues. Scheffler (2015) argues that, pace Hoey’s theory of lexical priming, “lexis should be subordinated to grammar in FL teaching.”  Scheffler reminds us that Hoey sees lexical priming as “the mechanism that drives language acquisition”; that the successful language learner recognises, understands and produces lexical phrases as ready-made chunks; and that, consequently, teachers should concentrate on vocabulary in context and particularly on fixed expressions in speech. Scheffler’s reply is that mastery of lexical associations takes too long to be a viable objective for classroom-based foreign language learning and that grammar-based teaching is more efficacious.

According to Scheffler, in order to reach proficiency through learning lexical chunks, EFL learners have two options: either they use the same subconscious mechanism that operates in L1 acquisition, or they consciously apply themselves to the study of appropriate language material. As to the first option, studies of first language acquisition cited by Scheffler show that by the time they’re five years old children have encountered more than twelve million meaningful utterances in communicative contexts. Scheffler comments: “No classroom input can come close to this amount of linguistic data”. Regarding the second option, mastering lexical associations through the conscious study of chunks, collocations, etc., Scheffler cites Pawley and Syder’s (1983) claim that native speakers know ‘hundreds of thousands’ of memorized sequences, and argues that “in the classroom setting, committing to memory even a small subset of these would be a daunting task.”
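Scheffler’s point about input volume is easy to make concrete with some back-of-the-envelope arithmetic. The twelve million figure comes from the studies he cites; the classroom figures below are my own illustrative assumptions, not data:

```python
# L1 input: roughly twelve million meaningful utterances by age five
l1_utterances = 12_000_000
days_to_age_five = 5 * 365
utterances_per_day = l1_utterances / days_to_age_five  # roughly 6,575 a day

# Classroom input: both the course length and the hourly rate are
# assumed values, chosen generously, purely for illustration.
classroom_hours = 600
utterances_per_hour = 200
classroom_total = classroom_hours * utterances_per_hour

print(round(utterances_per_day))  # 6575
print(classroom_total)            # 120000 -- about 1% of the L1 total
```

Even on these generous assumptions, the classroom learner receives about one per cent of the input a five-year-old has had.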

So, says Scheffler, given that the subconscious or conscious learning of lexical chunks is not a viable option, classroom time should be spent focusing more on grammatical systems than on lists of lexical phrases. He cites Spada and Tomita’s (2010) meta-analysis as convincing evidence that explicit grammar instruction is more effective than implicit instruction for both simple and complex English grammar structures. Scheffler concludes by suggesting that classroom EFL teaching should provide “a combination of grammatically oriented presentation and lexically oriented practice” involving explicit grammar explanations followed by practice to encourage lexical priming. Scheffler offers this example: the teacher presents and explains the present perfect as a grammatical category and then provides practice in the form of drills. The drills allow links between the most frequent lexical instantiations of the present perfect to be established, and these links are further strengthened in communicative activities and in exposure outside the classroom. Such procedures offer learners both explicit and implicit instruction. Explicit instruction aims at linguistic awareness, proceduralization of explicit knowledge, and lexical priming, while implicit instruction reinforces primings established in class and gradually creates new ones.

I’d say that Scheffler’s suggestion is a bit of a dog’s dinner, mutton dressed as lamb, old wine in new bottles, a botched attempt to have your cake and eat it. Rather than effortlessly trot out more examples of my store of food-and-drink-related-put-downs, let me be more specific:

First, no attempt is made to evaluate Hoey’s theory of language or of SLA. What are the strengths and weaknesses of a theory which makes collocational priming the key construct in both a description of language and an explanation of how it is learned?

Next, Spada and Tomita’s (2010) meta-analysis in no way supports the view that the presentation and practice of grammatical categories is the best, or even a good, way to organise classroom-based ELT. The authors limit themselves to the claim that adult learners sometimes benefit from explicit attention to form and from opportunities to practise explicit knowledge.  Nothing in Spada and Tomita’s review suggests that organising a syllabus around the presentation and practice of pre-selected bits of grammar like the present perfect is recommendable; nothing in the review challenges the findings of studies which support the view that most explicit grammar teaching falls on deaf ears most of the time, and that teaching can affect the rate but not the route of interlanguage development.

Finally, Scheffler begins by saying that basing ELT on the implicit or explicit learning of lexical chunks is unrealistic because there’s too much to learn. That’s a good point, but why then does he end up trying to somehow cram all this lexical chunk learning into the last part of the lesson? The claim is that in a grammar-based PPP approach where communicative activities are included, the explicit grammar teaching will help lexical priming, while the communicative activities will reinforce primings and create new ones. I can see no reason for thinking that this would work. And, of course, it contradicts Hoey’s theory. Grammar views lexical items as isolated elements organised by syntax; and this is exactly the view that Hoey wants to challenge. What sense does it make to expect the explicit teaching of the forms of the present perfect to facilitate fabulous amounts of lexical priming?

Despite its shortcomings, Scheffler’s article does draw attention to the fact that no proper account has ever been given by proponents of the lexical approach of how exposure to massive amounts of what Lewis (1993) calls “suitable input” should be organised into a syllabus. Walkley and Dellar have published coursebooks which they claim exemplify the lexical approach, but I’ve never managed to get review copies and I can’t bring myself to buy them. As far as I can gather from Dellar’s public pronouncements, he thinks teaching should concentrate on giving learners repeated exposure to the most frequent words in English in context, but important questions remain unanswered.

  1. How should the repeated exposure to massive numbers of lexical chunks be organised? Is frequency of occurrence in the biggest corpora the only criterion for the selection and presentation of lexical items, or are there others?
  2. How do teachers make classroom sessions dedicated to “repeated exposure to the most frequent words in English” interesting and motivating?
  3. How much input do learners need? How do Walkley and Dellar respond to the research findings which suggest that it’s unreasonable to expect FL classroom learners to remember even a small subset of what native speakers know? Sinclair (2004: 282) warns of “the risk of a combinatorial explosion, leading to an unmanageable number of lexical items” and Harwood (2002: 142) warns against “learner overload”, insisting that “implementing a lexical approach requires a delicate balancing act” between exploiting the richness of fine-grained corpus-derived descriptions and keeping the learning load at a manageable level.
  4. How do teachers help learners notice and store the thousands of lexical chunks which are required for a minimum level of proficiency? Put another way, how do teachers help learners turn massive loads of input into an ability to use the language for effective communication?

The fact is that, pace Hoey and Lewis, L1 acquisition is not the same as the acquisition which takes place in FL classrooms. As Granger (2011) points out, Lewis claims that “phrases acquired as wholes are the primary resource by which the syntactic system is mastered” (Lewis, 1993: 95). This assertion, frequently found in the Lexical Approach literature, is based on L1 acquisition studies which demonstrate that children first acquire chunks and then progressively analyse the underlying patterns and generalise them into regular syntactic rules (Wray, 2002). But (as Wray points out in her overview of findings on formulaicity in SLA), in classroom-based L2 acquisition, learners don’t get enough exposure for the ‘unpacking’ process to take place, and as a result formulaic sequences don’t contribute, as they do in L1 acquisition, to the mastery of grammatical forms. Granger concludes that “while lexical phrases are likely to have some generative role in L2 learning, it would be a foolhardy gamble to rely primarily on the generative power of lexical phrases.” She goes on to cite Pulverness (2007: 182-183), who points to the risk of the ‘phrasebook effect’, whereby lexical items accumulate in an arbitrary way as learners are presented with an ever-expanding lexicon without being given a structural framework within which to make use of all the lexis.

Scheffler’s main objective is to defend the grammar-based syllabus against Hoey’s suggestion that grammar should be subordinated to lexis in FL teaching. I think he does a poor job of it, but at least he gives voice once again to some unanswered questions. We’ve been waiting for answers for a while now, and I doubt that we’ll get them any time soon. Meanwhile, perhaps we should turn our attention to the newer but somehow more interesting question of what we are to make of Hoey’s collocational priming: can this construct lead to a satisfactory explanation of both language and SLA?


Granger, S. (2011) From phraseology to pedagogy: challenges and prospects. In Uhrig, P. (ed.), Chunks in the Description of Language: A Tribute to John Sinclair. Berlin and New York: Mouton de Gruyter.

Harwood, N. (2002) Taking a lexical approach to teaching: principles and problems. International Journal of Applied Linguistics 12/2: 139-155.

Lewis, M. (1993). The Lexical Approach. The State of ELT and a Way Forward. Hove: Language Teaching Publications.

Pawley, A. and Syder, F. (1983) Two puzzles for linguistic theory: nativelike selection and nativelike fluency. In J. C. Richards and R. Schmidt (eds.), Language and Communication. London: Longman.

Pulverness, A. (2007) Review of McCarthy, M. & F. O’Dell, English Collocations in Use, ELT Journal 61: 182-185.

Scheffler, P. (2015) Lexical priming and explicit grammar in foreign language instruction. ELT Journal 69/1: 93-96.

Sinclair, J. (ed.) (2004) How to Use Corpora in Language Teaching. Amsterdam: John Benjamins.

Spada, N. and Tomita, Y. (2010) Interactions between type of instruction and type of language feature: a meta-analysis. Language Learning 60/2: 263–308.

Wray, A. (2002) Formulaic Language and the Lexicon. Cambridge: Cambridge University Press.

A Sketch of a Process Syllabus



G. K. Chesterton made the good point that “if a job’s worth doing, it’s worth doing badly.” Rather than work patiently and diligently on a carefully-crafted Process syllabus which perfectly captures the progressive educational principles that underlie it, I offer this very elementary sketch.


In Freire’s (2000) view of adult education, personal freedom and the development of individuals can only occur mutually with others: Every human being, no matter how ‘ignorant’ he or she may be, is capable of looking critically at the world in a ‘dialogical encounter’ with others. In this process, the old, paternalistic teacher-student relationship is overcome.

To paraphrase Breen (1987), the Process syllabus prioritises classroom decision-making on the assumption that participation by learners in decision-making will be conducive to learning. Decision-making can be seen as an authentic communicative activity in itself. The objective of the Process syllabus is to serve the development of a learner’s communicative competence in a new language by calling upon the communicative potential which exists in any classroom group. It is based on the principle that authentic communication between learners will involve the genuine need to share meaning and to negotiate about things that actually matter and require action on a learner’s part. The Process syllabus proposes that metacommunication and shared decision-making are necessary conditions of language learning in any classroom.

What Does the Process Syllabus Provide?

Two things: a plan relating to the major decisions which teacher and learners need to make during classroom language learning, and a bank of classroom activities which consist of sets of tasks.

The plan consists of answers to questions which the teacher and learners discuss and agree on together. Questions refer to the purposes of language learning; the content or subject matter which learners will work upon; the ways of working in the classroom; and the means of evaluating the efficiency and quality of the work and its outcomes. Clearly, decisions made about these areas will relate to one another, and they will generate the particular process syllabus of the classroom group. They will also lead to agreed working procedures within the class; a ‘working contract’ to be followed for an agreed time, evaluated in terms of its helpfulness and appropriateness, and subsequently refined or adapted for a further agreed period of time. This joint decision-making will lead to a particular selection of activities and tasks.

As for the materials bank, this needs to be built by each local centre, taking into consideration the known general and specific needs of its students. While today a great deal of material can be found on the internet, there’s no denying that implementing a process syllabus requires an initial investment in both materials and teacher time in order to assemble and organise a good materials bank. In 1987, Breen had this to say:

A classroom group adopting a Process syllabus would deduce and implement its own content syllabus; a syllabus of subject-matter in the conventional sense would be designed, implemented, and evaluated within the Process syllabus. In circumstances where an external pre-planned syllabus already existed and had to be undertaken by the teacher with his or her learners, the decisions for classroom language learning would be related directly to such a pre-planned syllabus. As a result, the external syllabus may be incorporated within the group’s process — with or without modifications as decided upon by the group — and used as a continual reference point – or source of helpful criteria — during decision-making and evaluation. It is more than likely that any external syllabus will be modified as the group works with it. In sum, the Process syllabus is a context within which any syllabus of subject-matter is made workable. 

I think it’s very important to do without any external syllabus, so my proposal is based on the assumption that the teacher can call on a rich diversity of well-organised materials, where the diversity and the organisation are both crucial factors. Currently, few ELT centres provide such a materials bank, and this fact has led Rose Bard and me to the conclusion that we need to build a materials bank composed entirely of materials which can be downloaded from a dedicated web site. The materials bank will grow, and teachers will, of course, supplement the central hub with their own locally produced materials. The “central” materials will be organised in a database using the following provisional fields:

  1. Access number
  2. Source
  3. Medium
  4. Activity type
  5. Level
  6. Topic
  7. Sub-topic
  8. Grammar area
  9. Function

The Access Number is a simple code which defines the level of difficulty (1-6), the ‘medium’ (‘A’ for audio; ‘V’ for video; ‘T’ for text; ‘I’ for Internet; etc.), and a sequential number with no significance other than to allow us to keep materials in order. Thus, for example, “4.V.19” is the nineteenth video segment for the fourth level of English. The creation of a database allows those creating the materials bank to produce indexes for as many fields or combinations as they want, and this allows teachers to quickly see what video material is available at their level, or, for example, to find a reading text on tourism at that level. But teachers can also decide that they will concentrate on a certain topic, tourism, for example, and then confect a multi-activity task on that topic. The indexes tell them exactly what is available on this subject in each medium at each level. They can define the task for themselves or choose a ready-made task from the ‘Activities’ index. If they design their own task, they could start with a reading text, go on to an information-gap activity, then use video, then do an Internet exercise, then move to discussions, presentations, and reports, staying all the time on the topic of tourism and at their chosen level.
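To make the scheme concrete, here is a minimal sketch of how one record and the access-number code might be represented. The field names follow the provisional list above; the class and function names, and any sample values, are my own invention, not part of the actual database design:

```python
from dataclasses import dataclass

# Mapping from the medium letter in an access number to its name.
MEDIA = {"A": "audio", "V": "video", "T": "text", "I": "internet"}


@dataclass
class Material:
    """One record in the materials bank, using the provisional fields."""
    access_number: str   # e.g. "4.V.19"
    source: str
    medium: str
    activity_type: str
    level: int           # 1-6
    topic: str
    sub_topic: str
    grammar_area: str
    function: str


def parse_access_number(code):
    """Split an access number like '4.V.19' into (level, medium, sequence)."""
    level, medium, seq = code.split(".")
    return int(level), MEDIA[medium], int(seq)


print(parse_access_number("4.V.19"))  # (4, 'video', 19)
```

With records in this shape, the indexes described above fall out of simple filters, e.g. all video material at level 4, or all reading texts on tourism at a given level.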

So let’s look now at the example.

An Example Process Syllabus

In the following, I’ll refer to the teacher as “he”, supposing him to be a younger version of me.

Type of Student: Adult

Number of Students: 12

Level: Mid-Intermediate (CEFR: B2). The students should have done a proficiency test, ideally including an interview, prior to enrolling in the course.

Course Duration: 100 hours; 6 hours a week.

Objectives: The main objective of the course is to improve the students’ ability to use English for professional purposes. Priority is given to oral communication.

Step 1    

  1. The teacher greets everybody, introduces himself, and then asks students to get into four groups of 3 (A, B & C) and to use questions displayed on the whiteboard in order to find out personal and professional information about each other. He then asks A to introduce B, B to introduce C, and C to introduce A.
  2. After that he explains that the course will use a process syllabus and gives a quick description of what that involves.
  3. (This bit of the class aims to reaffirm the feeling that the main thrust of the course is communicative oral practice.) “What’s in the news?”. The teacher brainstorms things in the news – local, national, international, sport, scandal, business, etc., – in order to generate a list of 6 or 7 items on the whiteboard. Students are put into groups of 3 and each group has to choose 1 of the items on the list. They then discuss the item and prepare a report. When everybody’s ready, the class gets back together and then each group gives its report (A gives the background, B gives the news, C gives the group’s opinion). The topic’s then open for others in the class to have their say.
  4. After a break, the teacher gives out a needs analysis worksheet: see here for a possible format. (To get back to this page, hit the arrow at the top left of the screen.) When everybody has filled in the form, the class divides into 3 groups and discusses their answers. The teacher asks one member of each group to report on the answers to each of the 7 questions. This is a long activity, and will take the rest of the class time. The teacher collects the worksheets before everybody leaves.

Step 2

Before the next class, the teacher looks through the needs analysis forms and designs the next 10 hours of the course, confecting tasks from activities and materials in the materials bank. In fact he’s already done most of this work in previous courses, but just needs to fine-tune the tasks in line with the data collected.

Step 3

In the next classes the teacher leads students through a series of tasks involving activities such as problem-solving, information-gap, data gathering, case studies, role plays, presentations, debates, discussions, etc., involving group work, pair work, and whole-class work, and using a variety of media. Various types of focus on form, vocabulary building, feedback and correction are used, and homework includes written work and participation in an on-line discussion forum set up for this course. Apart from their overt usefulness, these classes aim to give students a taste of a wide variety of activities and formats.

During the classes the teacher tries to put into practice the Methodological Principles (MPs) re-stated in Long (2015), which I’ve summarised in a previous post. In fact, there are very important differences between Long’s TBLT syllabus proposal and this one, which will need discussing at some point, but I think the important points of agreement are:

  • MP2: Promote Learning by Doing. Practical hands-on experience with real-world tasks brings abstract concepts and theories to life and makes them more understandable.
  • MP4: Provide Rich Input. Adult foreign language learners require not just linguistically complex input, but rich input (i.e., realistic samples of discourse use surrounding accomplishment of tasks).
  • MP5: Encourage Inductive (“Chunk”) Learning. Learners need exposure to realistic samples of target language use, and then need help to incorporate, store and retrieve whole chunks of that input.
  • MP6: Focus on Form. A focus on meaning can be improved upon by periodic attention to formal aspects of the language. This is best achieved during an otherwise meaning-focused lesson, and using a variety of pedagogic procedures where learners’ attention is briefly shifted to linguistic code features, in context, to induce “noticing”, when students experience problems as they work on communicative tasks. The most difficult practical aspect of focus on form is that, to be psycholinguistically relevant, it should be employed only when a learner need arises, thus presenting a difficulty for the novice teacher, who may not have relevant materials to provide. Where face-to-face interaction is the norm, as in L2 classrooms, recasting is the obvious pedagogical procedure. Once an L2 problem has been diagnosed for a learner, then pedagogical procedures may be decided upon and materials developed for use when the need next arises.
  • MP7: Provide Negative Feedback. Recasts are proposed as an ideal (but not the only) form of negative feedback.
  • MP8: Respect Developmental Processes and “Learner Syllabuses”. I’ve already said enough about this in previous posts dealing with interlanguage development.

Step 4

After approx. 12 hours of classroom time, the teacher holds “Feedback and Planning Session 1”, where everybody reflects on what has happened and plans the next part of the course. See here for a possible format of the worksheet. This session is obviously a pivotal part of the process syllabus and requires careful handling by the teacher. I think training in how to conduct these sessions is required for new teachers, and I’ll devote a separate post to discussing them. Let me just say here that the teacher must avoid defending himself against criticisms, and must also resist the temptation to let students simply say “Everything’s fine: carry on!” With a bit of practice, these sessions become very dynamic and rewarding encounters, and succeed in giving the teacher a good idea of how to proceed. I think it’s a good idea for the teacher to video-record the session so as to be freed of the task of taking notes.

Step 5   

Before the next class, the teacher assimilates the data from the planning session and designs the next 20 hours of the course, again confecting tasks from activities and materials in the materials bank, but this time based on what the students have indicated they want to do.

Step 6

The teacher presents his plan at the next class, and proceeds with its implementation. At around Hour 30 there’s “Planning Session 2”, where feedback is again sought and the next part of the course is planned. The whole course thus comprises about 5 cycles.

As for assessment, I refer you to the final section of my post on Test Validity, where I briefly summarise Fulcher’s important distinction between large-scale testing and classroom assessment. To the extent that students need an external assessment of their current language proficiency when they finish the course, they have various alternatives, such as those offered by TOEFL or the Cambridge Examination Board.


This is a rough sketch of a possible process syllabus and I’m aware that it raises lots of important questions. Most importantly, in my opinion, it dispenses with any proper needs analysis and relies on the use of what Long (2015) would call a “hit and miss” approach to materials and task design. But it’s a start: it flies a very fragile kite.

To the objection that the syllabus relies on the existence of a materials bank, I can only reply that there is an abundance of cheap or free material available for ELT these days, but I agree that it needs organising for a particular school’s or institute’s needs. Similarly, to the objection that the teacher is expected to do much work preparing the tasks, my reply is that it doesn’t actually involve that much work, although an initial effort is certainly required. As the saying goes: Where there’s a will there’s a way.

The most important element in the proposal is the negotiation between teacher and students, and it’s this element which needs to be tested by being put into practice by teachers in their local environments. I’ve done it myself and I know lots of other teachers who’ve done it, but more serious study is required to properly test the assertions made here. In my experience, students soon get used to the new roles, and the inevitable initial scepticism is soon overcome. The students’ contribution to decision-making, and everybody’s appreciation of the new approach, grow as the course develops; it is, indeed, a virtuous circle. I’m aware of the need to take into account local cultural issues, but, on the other hand, we should not bow to stereotypes. There are schools in England where students are punished for speaking out of turn or for challenging the authority of the teacher, and there are schools in South Korea where students are encouraged to contribute to decisions about what and how they learn. The principles underlying a process syllabus reflect a libertarian educational philosophy which in recent times has perhaps been best articulated by Freire, but which has echoes in all the major cultures of the world.


Breen, M.P. (1987) Contemporary Paradigms in Syllabus Design Part II. Language Teaching, 20, pp 157-174.

Freire, P. (2000) Pedagogy of the Oppressed: 30th Anniversary Edition. Bloomsbury Academic.

Long, M. H. (2015) Second Language Acquisition and Task-Based Language Teaching. Wiley-Blackwell.

Dellar and Lexical Priming


In a recent webinar (which I read about in a post by Leo Selivan) Hugh Dellar talked about colligation. I missed the webinar and found Selivan’s report of it confusing, so I took a look at the slides Dellar used. Early on in his presentation, Dellar quotes Hoey (2005, p. 43):

The basic idea of colligation is that just as a lexical item may be primed to co-occur with another lexical item, so also it may be primed to occur in or with a particular grammatical function. Alternatively, it may be primed to avoid appearance in or co-occurrence with a particular grammatical function. 

I don’t know how Dellar explained Hoey’s use of the term “primed” in his webinar, but I understand priming to be based on the idea that each word we learn becomes associated with the contexts in which we repeatedly encounter it, so much so that we subconsciously expect and replicate those contexts when we hear and speak the words. The different types of information that a word is associated with are called its primings.

What does Hoey himself say? Hoey says that we get all our knowledge about words (their collocations, colligations, and so on) by subconsciously noticing everything that we have ever heard or read, and storing it in memory.

The process of subconsciously noticing is referred to as lexical priming. … Without realizing what we are doing, we all reproduce in our own speech and writing the language we have heard or read before. We use the words and phrases in the contexts in which we have heard them used, with the meanings we have subconsciously identified as belonging to them and employing the same grammar. The things we say are subconsciously influenced by what everyone has previously said to us (Hoey, 2009 – Lexical Priming)  

Hoey rejects Chomsky’s view of L1 acquisition and claims that children learn language by starting from a blank slate and then building knowledge from subconsciously noticed connections between lexical items. All language learning (child L1 and adult SLA alike) is the result of repeated exposure to patterns of text: the more the repetition, the more chance for subconscious noticing, and the better our knowledge of the language.

The weaknesses of this theory include the following:

  • Hoey does not explain the key construct of subconscious noticing;
  • he does not explain how the hundreds of thousands of patterns of words acquired through repeatedly encountering and using them are stored and retrieved;
  • he does not acknowledge any limitations in our ability to remember, process or retrieve this massive amount of linguistic information;
  • he does not reply to the argument that we can and do say things that we haven’t been trained to say and that we have never heard anybody else say, which contradicts the claim that what we say is determined by our history of priming;
  • while Hoey endorses Krashen’s explanation of SLA (it’s an unconscious process dependent on comprehensible input), Krashen’s Natural Order Hypothesis contradicts Hoey’s lexical priming theory, since, while the first claims that SLA involves the acquisition of grammatical structures in a predictable sequence, the second claims that grammatical structures are lexical patterns and that there is no order of acquisition.

These limitations in Hoey’s theory get no mention from Dellar, who, having previously modelled his lexical approach on Michael Lewis, now seems to have fully embraced Hoey’s lexical priming theory. Let’s look at how this theory compares to a rival explanation. (I’m here making use of material from previous posts about Dellar and Hoey.)

Interlanguage Grammar versus Lexical Priming

In the last 40 years, great progress has been made in developing a theory of SLA based on a cognitive view of learning. It started in 1972 with the publication of Selinker’s paper, in which he argued that L2 learners have their own autonomous mental grammar, which came to be known as interlanguage grammar: a grammatical system with its own internal organising principles, which may or may not be related to the L1 and the L2.

One of the first stages of this interlanguage to be identified was that for ESL questions. In a study of six Spanish students over a 10-month period, Cazden, Cancino, Rosansky and Schumann (1975) found that the subjects produced interrogative forms in a predictable sequence:

  1. Rising intonation (e.g., He works today?),
  2. Uninverted WH (e.g., What he (is) saying?),
  3. “Overinversion” (e.g., Do you know where is it?),
  4. Differentiation (e.g., Does she like where she lives?).

A later example is in Larsen-Freeman and Long (1991: 94). They pointed to research which suggested that learners from a variety of different L1 backgrounds go through the same four stages in acquiring English negation:

  1. External (e.g., No this one./No you playing here),
  2. Internal, pre-verbal (e.g., Juana no/don’t have job),
  3. Auxiliary + negative (e.g., I can’t play the guitar),
  4. Analysed don’t (e.g., She doesn’t drink alcohol.)

In developing a cognitive theory of SLA, the construct of interlanguage became central to the view of L2 learning as a process by which linguistic skills become automatic. Initial learning requires controlled processes, which require attention and time; with practice the linguistic skill requires less attention and becomes routinized, thus freeing up the controlled processes for application to new linguistic skills. SLA is thus seen as a process by which attention-demanding controlled processes become more automatic through practice, a process that results in the restructuring of the existing mental representation, the interlanguage.

So there are two rival theories of SLA on offer here: Hoey’s theory of lexical priming (supported by Dellar, Selivan and others) and Selinker’s theory of interlanguage (developed by Long, Robinson, Schmidt, Skehan, Pienemann and others). Dellar should resist giving the impression that Hoey’s theory is the definitive and unchallenged explanation of how we learn languages.

Errors and L1 priming

In his presentation Dellar says “All our students bring L1 primings” and gives these examples from Polish:

On chce zebym studiowal prawo.

Zimno mi.

Jak ona wyglada?

These L1 primings “colour L2”:

He wants that I study Law.

It is cold to me.

How does she look?

Dellar says that these are not grammar errors, but rather “micro-grammatical problems” caused by a lack of awareness of how the words attach themselves to grammar. The solution Dellar offers to these problems is to provide learners with lots of examples of “correct colligation and co-text”:

He wants me to study Law.

My dad’s quite pushy. He wants me to study Business, but I’m not really sure that I want to.

It’s really cold today.  It’s freezing!  I’m freezing!

What does she look like?  Oh, she’s quite tall . . . long hair . . . quite good-looking, actually. Well, I think so anyway.

This kind of correction is, says Dellar, “hard work, but necessary work”. It ensures that “students are made aware of how the way they think the language works differs from how it really works.” Dellar concludes that

 Hoey has shown the real route to proficiency is sufficient exposure. Teachers can shortcut the priming process by providing high-reward input that condenses experience and saves time.

We may note how Hoey, not Krashen, gets the credit for showing that the real route to proficiency is sufficient exposure; how priming now explains learning; and how teaching must now concentrate on providing shortcuts to the priming process.

To return to Dellar’s “micro-grammatical problems”, we are surely entitled to ask whether what SLA researchers have for 50 years referred to as the phenomenon of L1 transfer is better understood as the phenomenon of L1 primings. Recall that Pit Corder argued in 1967 that learner errors were neither random nor best explained in terms of the learner’s L1; errors were indications of learners’ attempts to figure out an underlying rule-governed system. Corder distinguished between errors and mistakes: mistakes are slips of the tongue and not systematic, whereas errors are indications of an as yet non-native-like, but nevertheless systematic, rule-based grammar. Dulay and Burt (1975) then claimed that fewer than 5% of errors were due to native language interference, and that errors were, as Corder suggested, in some sense systematic. Brown’s (1973) morpheme studies in L1 acquisition led to L2 studies which suggested that there was a natural order in the acquisition of English morphemes, regardless of L1. This became known as the L1 = L2 Hypothesis, and further studies all pointed to systematic, staged development in SLA. The emerging cognitive paradigm of language learning perhaps received its fullest expression in Selinker’s (1972) paper, which argues that L2 learners have their own autonomous mental grammar (which came to be known, pace Selinker, as interlanguage (IL) grammar), a grammatical system with its own internal organising principles, which may or may not be related to the L1 and the L2.

All of this is contradicted by Dellar, who insists that L1 priming explains learner errors. 

Language development through L2 priming  versus processing models of SLA

Explaining L2 development as a matter of strengthening L2 primings between words contradicts the work of those using a processing model of SLA, and I’ll give just one example. McLaughlin (1990) uses the twin concepts of “Automaticity” and “Restructuring” to describe the cognitive processes involved in SLA. Automaticity occurs when an associative connection is established between a certain kind of input and some output pattern. Many typical greetings exchanges illustrate this:

Speaker 1: Morning.

Speaker 2: Morning. How are you?

Speaker 1: Fine, and you?

Speaker 2: Fine.

Since humans have a limited capacity for processing information, automatic routines free up more time for such processing. To process information one has to attend to, deal with, and organise new information.  The more information that can be handled routinely, automatically, the more attentional resources are freed up for new information.  Learning takes place by the transfer of information to long-term memory and is regulated by controlled processes which lay down the stepping stones for automatic processing.

The second concept, restructuring, refers to qualitative changes in the learner’s interlanguage as they move from stage to stage, not to the simple addition of new structural elements. These restructuring changes are, according to McLaughlin, often reflected in “U-shaped behaviour”, which refers to three stages of linguistic use:

  • Stage 1: correct utterance,
  • Stage 2: deviant utterance,
  • Stage 3: correct target-like usage.

In a study of French L1 speakers learning English, Lightbown (1983) found that, when acquiring the English “ing” form, her subjects passed through the three stages of U-shaped behaviour.  Lightbown argued that as the learners, who initially were only presented with the present progressive, took on new information – the present simple – they had to adjust their ideas about the “ing” form.  For a while they were confused and the use of “ing” became less frequent and less correct.

According to Dellar (following Hoey), this “restructuring” explanation is wrong: what’s actually happening is that the L2 primings are not getting enough support from “high-reward input”.


There are serious weaknesses in the lexical priming theory as a theory of SLA, and few reasons to think that it offers a better explanation of the phenomena studied by SLA scholars, including the phenomenon of L1 transfer, than processing theories which use the construct of interlanguage grammar. Even if there were, Dellar seems not to have grasped that his newly-adopted explanation of language learning and his long-established teaching methods contradict each other. If lexical priming is a subconscious process which explains language learning, then the sufficient condition for learning is exposure to language and opportunities to strengthen and extend lexical primings. All the corrective work that Dellar recommends, all that “hard but necessary work” to ensure that “students are made aware of how the way they think the language works differs from how it really works”, is useless interference in a natural process involving the unconscious acquisition of lexical knowledge.



Cazden, C., Cancino, E., Rosansky, E. and Schumann, J. (1975) Second language acquisition sequences in children, adolescents and adults. Final report submitted to the National Institute of Education, Washington, D.C.

Corder, S. P. (1967) The significance of learners’ errors. International Review of Applied Linguistics 5, 161-9.

Dulay, H. and Burt, M. (1975) Creative construction in second language learning and teaching. In Burt, M and Dulay, H. (eds.), New directions in second language learning, teaching, and bilingual education. Washington, DC: TESOL, 21-32.

Hoey, M. (2005) Lexical Priming: A New Theory of Words and Language. London: Routledge.

Krashen, S. (1981) Second language acquisition and second language learning. Oxford: Pergamon.

Larsen-Freeman, D. and Long, M. H. (1991) An introduction to second language acquisition research. Harlow: Longman.

McLaughlin, B. (1990) “Conscious” versus “unconscious” learning. TESOL Quarterly 24, 617-634.

Selinker, L. (1972) Interlanguage.  International Review of Applied Linguistics 10, 209-231.

British Council Cultural Claptrap


In an article for the British Council website, Ian Clifford asks two questions:

Do learner-centred approaches work in every culture?

Is it time to challenge Western assumptions about education, especially when it comes to promoting ‘good teaching approaches’ in the developing world?

I bet you won’t fall off your chair when I tell you that Clifford thinks the answers to these questions are “No” and “Yes”.

Clifford starts by saying that most Western educators think learner-centred education represents “everything that’s good and wholesome in education.” Just in case that sounds a bit blasé, Clifford gets more scholarly and says that learner-centred educational practice can be traced back to ‘child-centred’ education which

“draws on the work of 18th century philosophers such as Rousseau and Locke, who suggested that teachers should intervene as little as possible in the natural development of children.”

Unfortunately, this attempt at scholarship fails, since in fact Rousseau and Locke suggested the opposite. They were both pioneers in promoting child education where the teacher held absolute authority, and, in Locke’s case, children were expected to do exactly as they were told by teachers under threat of dire corporal punishment.

Clifford then asks “What exactly do these (learner-centred) approaches amount to in the classroom?” The answer to this important question is that some educators associate learner-centred approaches with group work, some think it means teachers letting learners find things out for themselves, and some can’t identify any method at all. You might think that this is not a very “exact” answer, but never mind, because the main point is to establish the different perceptions of learner-centred and teacher-centred approaches. While a learner-centred approach represents everything good, a teacher-centred approach is generally regarded as

“authoritarian and hierarchical, encouraging rote learning and memorisation, without any real understanding.”

Proceeding with his absurd parody of the two approaches (in the West we blithely abandon learners to their own devices, while in the East learners bang away at drills and memorise things without achieving real understanding), Clifford cites Kirschner’s work, which shows that

“leaving learners to solve problems for themselves leads to brain overload.”

Pretty persuasive evidence, don’t you think? And as if this scary brain overload weren’t reason enough to bury learner-centred approaches once and for all, Clifford goes on to give a skewed summary of two more studies. First, Clifford claims that a 2014 meta-study

“favoured ‘direct teaching’ over approaches that involved little teacher instruction such as ‘discovery learning’.”

Quite apart from the fact that “discovery learning” is not a method associated with ELT, if you click on the link and read the summary at the top for yourself, you’ll see that that’s not an accurate summary of the findings. Then Clifford cites Schweisfurth (2011), who reviewed 72 articles about projects promoting “student-centred” approaches and concluded that they record ‘a history of failures great and small’. Clifford says that the “most important” reason for failure is “cultural mismatch.” He explains:

“Approaches to teaching based on a Western idea of the individual don’t fit well in cultures which emphasise group goals over individual needs. In such cultures, teachers are expected to be authoritative and learners obedient.”

This is a lazy, inaccurate, and misleading report of the findings. Nowhere in the entire article does Schweisfurth use the term “cultural mismatch”, nor does she say that cultural divergence is the most important reason for failure in any of the 72 cases studied. And of course she says absolutely nothing to warrant Clifford’s crude claim about the assumed roles of teachers and learners. On the contrary, she calls for analyses which “help to take us beyond the crude binary codes of Teacher-Centred Education versus Learner-Centred Education, or implementation success versus failure.”

Finally, Clifford gives “The case of Burma”, where, he says, various attempts to implement a ‘child-centred approach’ have failed. Who do you think have been called in to sort out the accumulated mess caused by the well-intentioned but misguided advocates of a learner-centred approach? Yes! The British Council – that hallowed institution famed for subordinating promotion of its own national culture to the greater mission of fostering global cultural diversity! Clifford proudly tells us that the British Council’s “English for Education College Trainers” project in Burma is going to

“support local teachers to do whole-class teaching more effectively and interactively, and in the second half of the year teach techniques to get learners learning from each other.”

Isn’t that just peachy, as we say in Henley-on-Thames.

Coming through, loud and clear, beneath the poor scholarship, the cherry-picked evidence, and the reliance on absurd straw-man versions of learner-centred and teacher-centred approaches, is a clear message. The West has been duped by lefty-liberals into accepting a dangerous, learner-centred approach to education as its paradigm, and it’s now trying to foist this approach onto countries whose cultures make its implementation doomed to failure. We need to reject learner-centred approaches and go back to traditional “whole-class teaching”. The message is what you’d expect from a spokesman of the British Council (conservative, cautious, resistant to change) and it typically gets everything wrong. Pace Clifford, the West is not in the grip of a learner-centred paradigm in education, and there must be very few professionals working outside the cosy confines of the British Council who nurture such a paranoid illusion. More specifically, a learner-centred approach to ELT is not widespread in the West; rather, as I’ve argued elsewhere, ELT practice is mostly teacher-led and coursebook-driven. And while there is undeniably cultural resistance in many countries to the full implementation of a communicative approach to classroom language learning, surely we should be looking for ways to overcome this resistance rather than using opportunistic interpretations of multiculturalism to perpetuate the problem.

Three Cheers


A quick post to give my support to three good ventures currently doing their best to improve the ELT world.

1. SLB Cooperative

The SLB Cooperative, founded by Neil McMillan, provides language services to companies, organisations and individuals in Barcelona and beyond. Their objective is to provide a whole range of resources and professional development opportunities to members and associates, which in turn enables them to deliver the best possible service to clients. They do English classes, translation, editing, proofreading, and other language services, and they also offer low-cost or free language classes and translation services to those individuals in the community who are otherwise unable to afford them.

As they say on their website, the SLB is “a forward-thinking organisation, not a traditional school or language academy.” Being a cooperative, they cut out the middle-man, and offer clients a personalised service, at great value. What makes them different is that they work for themselves and for each other, and not under a manager who does not understand their jobs. At SLB, they value quality over quantity and service over sales. To become a member, you contact them, and they invite you to an interview. If everybody is happy, you join. By paying your membership fee, you are a full member of the cooperative. Each associate has the same responsibilities, and the same right to vote. Each associate has the same share in any potential profits, if the cooperative vote to award dividends at the end of the financial year. They have very nice premises in Gracia, where members meet, pool resources, hold training sessions, and can use a well-equipped classroom. There’s a great atmosphere in their centre, numbers are growing, and I’m going to join next week!

2. TEFL Equity Advocates

TEFL Equity Advocates opposes discrimination against non-native English speaker teachers (NNESTs) and has quickly established itself as a powerful voice for change. Its blog has over 2,500 followers, and there’s no doubt that in the last 2 years they’ve made a huge contribution to the fight. Their aims are to:

  1. Acknowledge and expose the discrimination against NNESTs in TEFL.
  2. Sensitise the public to the problem.
  3. Debunk the most common and damaging myths and stereotypes about NNESTs.
  4. Reduce the number of job ads which are open only to NESTs.
  5. Give self-confidence to NNESTs.

3. Decentralising Teaching and Learning  

Decentralising Teaching and Learning is Paul Walsh’s blog. He created the concept of “Decentralised ELT”, believing that the teaching of languages is over-centralised. When asked to describe Decentralised Teaching in one sentence, he says: “The central tenet of Decentralised Teaching would be: devolving power, resources and responsibility down to the learner in order to optimise learning.” He has a very good ‘dummies guide’ to his teaching methodology on the About page, and there are some great blog posts, free lesson plans and other resources on offer.

Thornbury and The Learning Body


Scott Thornbury has been talking about “The Learning Body” for a while now. You can see one version on YouTube and another at the ELTABB website. You can also read a fuller treatment in Thornbury’s chapter in Meaningful Action, the tribute to Earl Stevick. (Just BTW, it’s not a great collection.) I base this critique on the YouTube talk.

Summary of the Talk

Thornbury starts by asserting that “Descartes got it wrong”. There is no mind/body dualism, rather “Brains are in bodies, bodies are in the world and meaningful action in these worlds is socially constructed and conducted” (Churchill et al, 2010). This devastating rebuttal of Descartes, which Thornbury (ignoring works by Locke, Hume, Derrida and others) reckons was “finally revealed” in 1994, has been ignored by those responsible for the prevailing orthodoxy in SLA, who insist that “language and language learning are a purely cognitive phenomenon.” Thornbury tells us that this orthodoxy claims that we need look no further than cognition for an explanation of SLA – other factors are not important.

Thornbury then goes on to explain that the modern view sees the brain as part of a larger set, involving the body and the world, leading to a new concept of “embodied cognition”. Without bothering with considerations of how “the mind” as a construct relates to the brain as a physical part of the body, Thornbury proceeds to look at the mind as embodied, embedded and extended.

Embodied Mind

The construct of the “embodied mind” is defined as “rooted in physical experience”. Our mind (see how hard it is, even for Thornbury, to stay away from Cartesian dualism) deals with ideas that are all related to our “physicality” as Thornbury puts it, and this applies to language and language learning. Key points here are:

  • “Language is rooted in human experience of the physical world”(Lee, 2010)
  • We adapt our language to different circumstances and different people.
  • Learning is enhanced by physical involvement.
  • Larsen-Freeman’s latest work argues that language is a dynamic emergent system.
  • Language is noted, applied and adapted in context.
  • Mirror neurons and body language are evidence for the embodied mind construct.

Embedded Mind

No definition of this construct is offered. Thornbury only says that language is embedded in context, which should come as a surprise to nobody. Thornbury refers to “ecolinguistics”, likens the learning of language to the learning of soccer by children, and reminds us that we adapt our language to different circumstances and different people.

Extended Mind

The “Extended mind” construct is nowhere even casually defined, but Scott uses the film Memento (a great film which I recommend, though it has little to do with the use Scott makes of it) to make the point that our bodies help us to remember. This is followed by a discussion of gestures, which have a big role in communication.


Not much here. Thornbury refers to the importance of our physical relationship to our students and says “Learning is discovering alignment”. This means group work, gesture, eye contact, “acting out”.


Thornbury gives this summary:

  • I think therefore I am: Wrong. Better:
  • I move therefore I am.
  • I speak therefore I move.
  • I move, therefore I learn.


Thornbury’s talk is interesting, and very well-delivered: he’s the best stand-up act in the business (sic) and his use of video clips is particularly good. But when you tot it all up, there’s almost nothing of substance, and the argument is hollow. Thornbury makes a straw man argument against research in SLA, and says nothing of much interest as to how all this “embodied” stuff might further our understanding of SLA. As to teaching, there’s absolutely no need to even mention “embodied cognition” in order to agree with all the good things he says about gestures and the rest of it. Earl Stevick was indeed concerned with holistic learning and a teaching methodology which reflected it, but I doubt he’d be impressed by this attempt to use fashionable speculations about cognitive science to back it up.

Specific Points

  1. The use of Descartes to promote an argument against current SLA research is simplistic and boringly trite. In the “Discourse on Method”, Descartes was concerned with epistemology, with reliable knowledge. His famous conclusion “Cogito ergo sum” has never been falsified – how could it be! – and it’s plain silly to say that “he got it wrong”.
  2. Thornbury says that SLA orthodoxy sees language learning as a purely cognitive phenomenon taking place in the mind. He’s wrong. The most productive research in SLA concentrates on cognitive aspects of SLA, but those involved in such research are quite aware that they’re focusing on just one aspect of the problem. They do so for the very good reason that scientific research gets the best results. The job of those who look at other aspects, such as those covered by sociolinguistics, is to show how their work has academic respectability, and misrepresenting the work of those who adopt a scientific methodology does nothing to enhance that job.
  3. The question of the distinction between the brain and the mind is a fundamental one. Thornbury doesn’t even mention it.


Thornbury, following the muddled and generally incoherent arguments of Larsen-Freeman, wants to say that SLA is best seen as an emerging process where, well, things emerge. And given that it all kind of emerges, ELT should help all these things, well, emerge. This is absolutely hopeless, isn’t it? Any theory of SLA must be sharper than this; any teaching methodology needs a firmer basis. There is, of course, a very interesting strand of SLA research that takes an emergentist approach, but it has little in common with Thornbury’s musings. And there are, of course, teaching methodologies based on helping learners “emerge”, although they don’t put it quite like that. Thornbury has done very little to critique SLA research, or to explain how all his “emerging” bits and pieces might help future research move in a better direction. Furthermore, nothing in his suggestions for teaching practice is new, and none of it depends on his “theoretical basis”.

Finally, let’s just have another look at this:

  • I think therefore I am: Wrong. Better:
  • I move therefore I am.
  • I speak therefore I move.
  • I move therefore I learn.

Not exactly a syllogism, now is it? I speak therefore I move? Really?

And quite apart from being incoherent, how will it affect your understanding of SLA?  Still, at least the last sorry line might inspire you to get off your butt and revisit Asher – he of Total Physical Response.

The Negotiated Syllabus


I suggested in my last post that a real paradigm shift in ELT would involve throwing out the coursebook and standardised tests and replacing them with a process-driven approach which concentrates on the “how” more than the “what” of teaching. So far, so good. But I went further, and in fact, I went rather too far, and I now have to make amends. I suggested that the alternative paradigm was fundamentally defined by a process of negotiation between teacher and learners, and that’s not so. I didn’t actually spell out what this negotiation amounted to, and neither did I make it clear that there are some very good alternatives to the present coursebook-driven paradigm which don’t involve any such “fundamental” negotiation.

For example, Mike Long’s detailed proposal for task-based language teaching, while it’s certainly learner-centred, and while it rejects the “product” or “synthetic” or “Type A” syllabus and hence the use of coursebooks and standardised tests, doesn’t include negotiation with learners about what tasks will form the content of the course, since these are determined by an external needs analysis, and then converted by teaching experts into pedagogical tasks. Various forms of task-based syllabuses, including many designed for business, or academic, or nursing, or other special purposes, while they are neither synthetic nor coursebook-driven (relying on their own materials), do not actually fit the “negotiated syllabus” brief. Even Dogme expects the teacher to be responsible for most of the important decisions about course content and methodology. So I need to explain the negotiated syllabus here, before finally presenting my own suggestion for an alternative to the present coursebook-driven paradigm in ELT.

The negotiated syllabus is the most extreme alternative approach to ELT, the one which most radically challenges assumptions held by most teachers today, the one which really turns everything on its head. Not only does the negotiated syllabus throw out the coursebook, it throws out the traditional roles of teacher and learner too. What follows is a brief summary, which relies heavily on an article by Sofia Valavani from the Second Chance School of Alexandroupolis. 

Second Chance Schools

Valavani explains:

Facilitating the fight against illiteracy of adults, the Adult Education General Secretariat implements programmes through which adults who have dropped out of schools have the opportunity to improve their academic and professional qualifications, so that they can get more easily integrated in the labour market or have a second chance for the continuation of their studies. This action addresses adults who were not able to complete their initial compulsory education and aims at offering them a second chance for the acquisition of a study certificate of the compulsory education.

Second Chance Schools is, therefore, a flexible and innovative programme, based on learners’ needs and interests, which aims at combatting the social exclusion of the individuals who lack the qualifications and skills necessary for them to meet the contemporary needs in social life and labour market.

Theoretical considerations

Valavani cites Freire’s view of adult education (Freire 2000: 32) that personal freedom and the development of individuals can only occur mutually with others: Every human being, no matter how ‘ignorant’ he or she may be, is capable of looking critically at the world in a ‘dialogical encounter’ with others. In this process, the old, paternalistic teacher-student relationship is overcome. She adopts Freire’s pedagogy and his “education for freedom”, and proposes a Second Chance Schools syllabus which provides “grounding for a learner-centred syllabus, reintroducing the SCS learners as the key participants in the learning process.”


Following Breen and Littlejohn (2000), teachers and students negotiate together so as to reach agreement in four areas:

  1. Why? The purposes of language learning.
  2. What? The content which learners will work on.
  3. How? The ways of working in the classroom.
  4. How well? Evaluating the efficiency & quality of the work and outcomes.

These four areas of decision-making are expressed in terms of questions. The questions are listed in a questionnaire (the best format is probably multiple-choice or Likert-scale) and the answers are negotiated by the teacher and the learners together.

The Negotiation Cycle

The negotiation cycle is illustrated below (Breen and Littlejohn, 2000: 284).


The syllabus identifies different reference points for the negotiation cycle in terms of levels in a curriculum pyramid. The figure below (Breen and Littlejohn, 2000: 286) illustrates these levels, on which the cycle may focus at appropriate times.



Decisions range from the immediate, moment-by-moment decisions made while learners are engaged in a task, to the more long-term planning of a language course (and in the Breen & Littlejohn model, through to the planning of the wider educational curriculum). Together, the negotiation cycle and the curriculum pyramid represent a negotiated syllabus as negotiation at specific levels of syllabus and curriculum planning. Breen’s figure (ibid: 287) illustrates this, with the negotiation cycle potentially being applied to a particular decision area at each of the different levels in the pyramid.


In order to implement this design, a number of tools are needed, among them the four below.

  1. Tools for establishing purposes: Initial questionnaires to learners, learning contracts, planning templates.
  2. Tools for making decisions concerning contents: A learning plan developed jointly by a teacher and learners; learner-designed activities; a materials bank including a wide variety of tasks, texts, worksheets, grammar work, etc.
  3. Tools for making decisions about ways of working.
  4. Tools to evaluate outcomes: Daily/ Weekly/ Monthly retrospective accounts, reflection charts, assessment (can-do) cards, work diaries, reflective learning journals, peer interviews, portfolios, one-to-one consultations, etc.


This very brief outline gives a good idea of the principles involved, but doesn’t give us a good picture of what actually happens in the implementation of such a negotiated syllabus. I think that what’s most important is to see it as an extension of the task-based syllabus: tasks are what drive it, but the tasks are decided on by teacher and learners together. The negotiation part looms large, like a bogey man, but in fact 90% of the course would be dedicated to carrying out the tasks. What we need to explore more is how the negotiation affects the selection, sequencing and evaluation of the tasks.

So, for example, at the start of the course, the teacher, having explained what’s going to happen, works through the first questionnaire, which aims to make a plan for the first phase of the course, maybe the first 6 to 10 hours. The questionnaire is obviously vital here, as is the teacher’s ability to help members of the new group to articulate their views and find consensus. Objectives may be quite broad (improve ability to communicate with people) or more specific (give a presentation in a business meeting), but they have to provide a good idea of priorities in terms of “can-dos”, the 4 skills, etc. The content at this stage is also broadly specified, but again, a general feeling for areas of interest is teased out, and, similarly, preferred ways of working are voiced and discussed. What would the questionnaire look like? How long would be spent on discussing it and arriving at a plan? What happens next?

My own version of this entails the teacher doing the first phase without any negotiation about the tasks to be done, in order to help the learners see the range of possibilities and get a feel for more “micro” levels of negotiation. In the next post, I’ll try to tackle some practical issues, flesh out the tools listed above, and suggest a syllabus for a group of lower intermediate students enrolled in a course of General English.


Breen, M. and Littlejohn, A. (2000) The practicalities of negotiation.

Valavani, S. Negotiated syllabus for Second Chance Schools: Theoretical considerations and the practicalities of its implementation.

Are we on the brink of a paradigm shift in ELT?

Kuhn famously used the term “paradigm shift” to challenge the account given by philosophers of science such as Popper of how scientific theories evolved and progressed. Popper said that scientific progress was gradual and cumulative; Kuhn said it was sudden and revolutionary and involved paradigm shifts where one way of thinking was suddenly swept away and replaced by another. A paradigm shift involves a revolution, a transformation, a metamorphosis in the way we see something, and it has profound practical implications. Change begins with a change in awareness and perception. Our perception is heavily influenced by our past and by social conditioning, and most of the time we go along with the paradigm view / normal science / the status quo / the theory taught at MIT / the prevalent narrative. But there are revolutionary moments in history when we prove ourselves to be capable of transforming and transcending the prevailing paradigms which so affect our lives, and I wonder whether we are currently approaching a paradigm shift in ELT.

The present ELT paradigm has these characteristics:

  • Standard English is the subject taught.
  • Vocabulary and grammar are the subject matter of EFL / ESL.
  • SLA involves learning the grammar and lexicon of the language and practicing the 4 skills.
  • A product syllabus is used. This focuses on what is to be taught, and, to make the “what” manageable, chops language into discrete linguistic items which are presented and practiced separately and step by step in a cumulative way.
  • A coursebook is used. The coursebook is the most important element determining the course. It’s usually grammar-based and presents the chopped up bits of language progressively. Other material and activities aim at practicing the 4 skills.
  • The teacher implements the syllabus, using the coursebook. The teacher makes all day-to-day decisions affecting its implementation.
  • The students are not consulted about the syllabus and have only a small say in its implementation.
  • Assessment is in terms of achievement or mastery, using external tests and exams.

The rival view of ELT has very different characteristics:

  • Standard English is one variety of English; it is not the subject taught.
  • Texts (discourse) are the subject matter of EFL /ESL.
  • SLA involves the socially-mediated development of interlanguage.
  • A process syllabus is used. This focuses on how the language is to be learned. There’s no pre-selection or arrangement of items; objectives are determined by a process of negotiation between teacher and learners as a course evolves. The syllabus is thus internal to the learner, negotiated between learners and teacher as joint decision makers, and emphasises the process of learning rather than the subject matter.
  • No coursebook is used.
  • The teacher implements the evolving syllabus in consultation with the students.
  • The students participate in decision-making about course objectives, content, activities and assessment.
  • Assessment is in terms of low-stakes formative assessment whose purpose is “to act as a way of providing individual learners with feedback that helps them to improve in an ongoing cycle of teaching and learning” (Rea-Dickens, 2001).

If this rival view were to be widely adopted in ELT it would certainly constitute a revolution, a complete paradigm shift. But will it happen? When one looks at the arguments for and against the 2 views of ELT sketched above, it’s difficult to escape the feeling that the current paradigm is becoming less and less defensible, in the light of increasing knowledge of the SLA process; poor results of classroom-based ELT courses; poor morale among teachers (apart from suffering from bad working conditions and pay, most teachers are denied the freedom to teach as they’d like to); and the increasing viability of alternatives.

Doesn’t the alternative seem so much more appealing? What’s better, that course content grows out of the experiences of the learners and is based on topics which reflect their reality, or that it derives from a coursebook made in London or New York? What’s better, that conversational dialogue is the essential component of the course, or that the teacher talks most of the time, gives presentations about English and leads the learners through prefabricated activities? What’s better, that the teacher follows orders and carries out a plan made by somebody in London or New York, or that the teacher is given permission to build the course as it goes along, involving learners in all the important decisions concerning objectives, content, activities and assessment? From both the learners’ and the teachers’ point of view, which approach is likely to lead to higher levels of interest, motivation, energy, engagement and satisfaction? Which approach is likely to lead to better results?

And don’t the replies to criticism of those who promote the current paradigm add further weight to the alternative argument? I’ve discussed elsewhere how some of the leading lights in ELT respond to criticisms of the current paradigm, and I think it’s fair to say that none of them has offered any proper defence of it. The gist of the argument is that alternatives are “unrealistic” and that ELT practice under the present paradigm is slowly but surely improving. As Harmer puts it, unafraid as always of using a handy cliché, “tests are getting better all the time”.

Another supporter of the present paradigm, Jim Scrivener, shows how little importance he gives to any real examination of alternatives. Scrivener simply assumes that teachers must run the show and that “Made in the UK (or USA)” coursebooks and test materials should determine course objectives and content. Rather than question these two fundamental assumptions, Scrivener takes them as given and thinks exclusively in terms of doing the same thing in a more carefully-considered way. In Scrivener’s scheme of things, everything in the ELT world stays the same, but the cobwebs of complacency are swept away and everybody “demands high” (whatever that means). So teachers are exhorted to up their game: to use coursebooks more cleverly, to check comprehension more comprehensively, to practice grammar more perspicaciously, to re-cycle vocabulary more robustly, and so on, but never to think outside the long-established framework of a teacher-led, coursebook-driven course of English instruction. Recently Scrivener commented that a good coursebook is “a brilliant exploitable all-bound-up-in-one-package resource.” No attempt is made to argue the place of coursebooks in ELT, but Scrivener does take the opportunity to caution on the need for teachers to be trained in how to use coursebooks. Some teachers find reading pages of coursebooks (in the sense of appreciating the links between different parts of the page and pages) “baffling” and so they need to be shown how to “swim” in the coursebook, how to take advantage of all that it has to offer. Apart from giving the impression that he thinks he’s very smart and that most teachers are very dumb, Scrivener gives more evidence of the limits of his vision: nowhere does he discuss training teachers how to do without a coursebook, for example. After all, why on earth would anybody want to do that?

In the same discussion of coursebooks on Steve Brown’s blog, Scott Thornbury eloquently summarized the case against them. I cut and pasted his summary on this blog, leading Hugh Dellar to tweet “Shocking disdain for the craft of writers & editors, as well as the vast majority of teachers from @thornburyscott.” This is typical of Dellar’s response to criticism of coursebooks in two respects. First, it is badly written; second, it takes offence rather than offering any evidence or arguments to the contrary. Dellar has made a number of comments on my criticisms of the dominant role of coursebooks in current ELT, but none of them offers any argument to refute the claim that coursebooks are based on false assumptions and that a process syllabus better respects research findings in SLA, and represents a better model of education. In all the recent discussions of teaching methodology, the use of coursebooks, the design and use of tests, teacher training, and so on, both in the big conferences and in blogs, nobody who defends the current paradigm of ELT has properly addressed the arguments above or the arguments for an alternative offered by Michael Breen, Chris Candlin, John Fanselow, Mike Long, Rose Bard, Graham Crookes, Scott Thornbury, Luke Meddings, and many others. These are met with a barrage of fallacious arguments and very little else.

While I believe that those who fight against the current paradigm have the more persuasive arguments, not to mention the more exciting agenda, I unfortunately don’t believe that we’re on the brink of a paradigm shift in ELT. The status quo is too strong and the business interests that support and sustain this status quo and its institutions are too powerful. The alternative view of ELT described here is essentially a left-wing view which is just too democratic to stand a chance in today’s world. I suppose the best that those of us who believe in an alternative can do is to argue our case and make our voice heard. Whether or not to compromise is another important issue. I was interested to see Luke Meddings propose a 50-50 deal recently: “OK”, he suggested, “just put the book and the tests away for 50% of the time!” I don’t feel comfortable with that, but he might well be on the right track.