Friday, 28 March 2014

What IS somebody's IQ? Only an approximate measure - the example of Robert M Pirsig

*

IQ is not a precise measurement - especially not at the individual level, and especially not at the highest levels of intelligence, where the whole concept of general intelligence breaks down and there are increasing divergences between specific types of cognitive ability.

http://iqpersonalitygenius.blogspot.co.uk/2012/08/problems-with-measuring-very-high-iq.html

There is a tendency to focus upon a person's highest-ever IQ measure - for example in the (excellent!) philosophical novel Zen and the Art of Motorcycle Maintenance the author Robert Pirsig notes the startling fact (and it is a fact) that his (Stanford-Binet) IQ was measured at 170 at the age of nine - which is a level supposedly attained by one in fifty thousand (although such ratios are a result of extrapolation, not measurement).

*

But an IQ measure in childhood - even on a comprehensive test such as the Stanford-Binet - is not a measure of adult IQ, except approximately (presumably due to inter-individual differences in the rate of maturation towards mature adulthood).

A document on Pirsig's Wikipedia pages (Talk section) purports to be an official testimonial of Pirsig's IQ measurements from 1961 (when he was about 33 years old) and it reads:

*

UNIVERSITY OF MINNESOTA
COLLEGE OF EDUCATION
MINNEAPOLIS 14
 
 INSTITUTE OF CHILD DEVELOPMENT AND WELFARE
  
   June 14,1961
  
   To Whom it May Concern:
  
   Subject: Indices of the Intellectual Capacity of Robert M. Pirsig
 
Mr. Pirsig was a subject in one of the institute’s longitudinal research projects and was extensively evaluated as a preschool, elementary, secondary, college and adult on various measures of intellectual ability. A summary of these measures is presented below.
 
Childhood tests: Mr. Pirsig was administered seven individual intelligence tests between the ages of two and ten. He performed consistently at the 99 plus percentile during this period.
 
His IQ on the Stanford Binet Form M administered in 1938 when he was nine and a half years old was 170, a level reached by about 2 children in 100,000 at that age level.

In 1949 he took the Miller's Analogy at the Univer. of Minn.. His raw score was 83 and his percentile standing for entering graduate students at the University of Minnesota was 96%tile.
 
In 1961 he was administered a series of adult tests as part of a follow-up study of intelligence. The General Aptitude Test Battery of the United States Employment Service was administered with the following results:
  
   General Intelligence .......99 % ile
  
   Verbal Ability .............98 % ile
  
   Numerical Ability ..........96 % ile
  
   Spacial Ability ............99 % ile
  
  
   John G. Hurst, PhD   Assistant Professor

*

So, as well as the stratospheric IQ 170, there are other measures at more modest levels - the 96th to 99th percentiles, that is roughly IQ 130 plus a bit (around the top 1-2 percent).

Of course there may be ceiling effects - some IQ measures don't try to go higher than the top centile.

But still, lacking that age-nine test - and most nine-year-olds don't have a detailed personal IQ evaluation - Pirsig's measured IQ would be quoted at around one in fifty or one in a hundred - rather than 1:50,000.

Ultra-high IQ measures must be taken with a pinch of salt; because 1. at the individual level IQ measures are not terribly reliable; 2. high levels of IQ do not reflect general intelligence, but more specialized cognitive ability; and 3. even when honest, the number we hear about may be a one-off, and the highest ever recorded from perhaps multiple attempts at many lengths and types of IQ test.
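
To make the rarity arithmetic concrete, here is a rough sketch (in Python) assuming the usual modern deviation-IQ scaling of mean 100 and SD 15 - an assumption, not a fact about any particular test; note that the 1938 Stanford-Binet was a ratio IQ, so its quoted rarities (such as 1 in 50,000 for a score of 170) cannot simply be read off this curve.

from statistics import NormalDist

iq = NormalDist(mu=100, sigma=15)   # assumed modern deviation-IQ scaling

def rarity(score):
    """Roughly one person in how many reaches this score or above?"""
    return 1.0 / (1.0 - iq.cdf(score))

def iq_at_percentile(pct):
    """Deviation IQ corresponding to a given population percentile."""
    return iq.inv_cdf(pct / 100.0)

print(round(rarity(130)))           # about 1 in 44 - roughly the top 2 percent
print(round(rarity(170)))           # several hundred thousand under this model,
                                    # far rarer than the 1 in 50,000 quoted for
                                    # the ratio-IQ Stanford-Binet of 1938
print(round(iq_at_percentile(96)))  # about 126
print(round(iq_at_percentile(99)))  # about 135

The extreme tail figures are very sensitive to the assumed distribution - one more reason for treating ultra-high scores as approximate.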

*

Thursday, 27 March 2014

What is Working Memory? How does it relate to general intelligence? Speculations...

*

NOTE ADDED 15 MAY - I HAVE COME TO THINK THE BELOW CONCEPT OF WORKING MEMORY IS WRONG, OR AT LEAST USELESS AND MISLEADING. INDEED I WONDER WHETHER WM MAY BE A FUNDAMENTALLY FLAWED CONCEPT?... I SHALL PROBABLY DELETE THIS POST SOMETIME, BUT FOR THE PRESENT SHALL LEAVE IT - WITH THIS WARNING.

Some time ago I was very interested by 'Working Memory' - trying to understand how to conceptualize it, what its structural brain basis might be - and also to measure it in this study:

http://www.hedweb.com/bgcharlton/tina-fry.html

The concept of Working Memory played a very large role in my book Psychiatry and the Human Condition - published in 2000.

http://www.hedweb.com/bgcharlton/psychhuman.html

Here is an excerpt from Psychiatry and the Human Condition describing how I visualized WM c 1999 - other discussions can be found by word-searching the phrase 'Working Memory'.

*

Working memory (WM) is the site of awareness, located in the prefrontal lobe of the cerebral cortex. WM functions as an integration zone of the brain, where representations from different systems converge and where several items of thought to which we are attending can simultaneously be sustained and manipulated. When we deliberately grapple with a problem and try to think it through, this process is happening in working memory; when we are aware of something, it is in working memory; when we wish to attend to a specific stimulus, we represent it in WM...


Awareness comprises attention and working memory (WM). To be aware of a perception it must be selectively attended to, and the representation of that entity must be kept active and held in the brain for a length of time adequate to allow other cognitive representations to interact with it, and in a place where other cognitive representations can be projected. Working memory is such a place, a place where information converges and is kept active for longer than usual periods. Hence working memory is the anatomical site of awareness.

The nature of working memory can be understood using concepts derived from cognitive neuroscience. Working memory is a three-dimensional space filled with neurons that can activate in patterns. Cognition is conceptualized as the processing of information in the form of topographically-organized (3-dimensional) patterns of neural activity called representations - because each specific pattern ‘represents’ a perceptual input. So that seeing a particular shape produces a pattern of cell activation on the retina, and this shape is reproduced, summarized, transformed, combined etc in patterns of cell activation in the visual system of the brain - and each pattern of brain cell activation in each visual region retains a formal relationship to the original retinal activation.

Representations are the units of thinking. In the visual system there may be representations of the colour, movement and shading of an object, each of these having been constructed from information extracted from the original pattern of cell activation in the retina (using many built-in and learned assumptions about the nature of the visual world). The propagation and combination of representations is the process of cognition.

Cognitive representations in most parts of the brain typically stay active and persist for a time scale of the order of several tens of milliseconds. But in working memory cognitive representations may be maintained over a much longer time scale - perhaps hundreds or thousands of milliseconds - and probably by the action of specialized ‘delay’ neurons which maintain firing over longer periods. So WM is a 3-D space which contains patterns of nerve firing that are sustained long enough that they can interact with other 'incoming' patterns. This sustaining of cognitive representations means that working memory is also a ‘convergence’ region which brings together and integrates highly processed data from several separate information streams.
Any animal that is able selectively to attend-to and sustain cognitive representations could be said to possess a WM and to be 'aware' - although the content of that awareness and the length of time it can be sustained may be simple and short. The capacity of WM will certainly vary between species, and the structures that perform the function of WM will vary substantially according to the design of the central nervous system. In other words, working memory is a function which is performed by structures that have arisen by convergent evolution; WM is not homologous between all animals that possess it - presumably the large and effective WM of an octopus is performed by quite different brain structures from the WM of a sheep dog, structures that have no common ancestor and evolved down quite a different path. The mechanism and connectivity of the human WM allows cognitive representations from different perceptual modalities or from different attended parts of the environment to be kept active simultaneously, to interact, and to undergo integration in order that appropriate whole-organism behavioural responses may be produced.

Working memory is reciprocally-linked to long term memory (LTM), such that representations formed in WM can be stored in LTM as patterns of enhanced or impaired transmission between nerve cells (the mechanism by which this occurs is uncertain but probably involves a structure called the hippocampus). So temporary patterns of active nerves are converted to much more lasting patterns of easier or harder transmission between nerves. The patterns in LTM may be later recalled and re-evoked in WM for further cycles of processing and elaboration.

This is how complex thinking gets done - a certain maximum number of representations can interact in WM in the time available (maybe a couple of seconds). So there is a limit to what can be done in WM during the span of activation of its representations. To do more requires storing the intermediate steps in reasoning. The products of an interaction in WM can be summarized ('chunked') and 'posted' to LTM where they wait until they are needed again. When recalled and reactivated, these complex packaged representations from LTM can undergo further cycles of interaction and modification, each building up the complexity of representations and of conceptual thought.

WM is therefore conceptualized as a site for integration of attended perceptual information deriving from a range of sensory inputs. Awareness seems to be used to select and integrate relevant inputs from a complex environment to enable animals to choose between a large repertoire of behavioural responses. There is a selective pressure to evolve WM in any animal capable of complex behavioural responses to a complexly variable environment. So the cognitive representations in WM in non-conscious animals are derived from external sensory inputs (e.g. vision, hearing, smell, taste and touch).

The critical point for this current argument is that non-conscious animals may be aware of their surroundings, but they lack the capacity to be aware of their own body states. Awareness of the outer environment is common, but awareness of inner body states is unique to conscious animals.
**
Working Memory Revisited in the light of IQ

But I now need to go back and revisit my old understanding of Working Memory in the light of my more recent understanding of general intelligence - because when I wrote Psychiatry and the Human Condition I knew essentially nothing about IQ. That such a thing can happen has at least two causes - the first is my own obtuse ignorance; the second is that when I did try to tackle intelligence I was put off by the fact (and it is a fact) that nearly-all psychometricians (including many of the best and most famous) are non-biological and non-evolutionary in their basic mode of thinking.

And this is true even when psychologically-trained psychometricians are writing about biology and evolution - it was (and is) obvious that they fundamentally hadn't a clue! This is a matter of training, especially early training - and the fact that traditionally psychology was taught in isolation from biology (hence from evolutionary theory) and from medicine - in a weird No Man's Land of proliferating ad hoc theories and slavish devotion to arbitrary methods and statistics.
*

Working Memory versus Intelligence

My understanding is that Working Memory and Intelligence are conceptually different, serve somewhat different functions, and are dissociable - such that a person may have higher than average intelligence and lower than average WM, or vice versa.

Intelligence is, roughly, a measure of the speed of processing (which may be roughly equivalent to efficient connectivity); while WM is, roughly, the size of the active 'workspace' - presumably constrained by the anatomical size of the effective working memory zone and the fact that the relevant nerve cells can only be activated for a timescale of a few seconds.

So, Working Memory might be visualized as the 3D size of the space in which processing occurs - that which is processed may be visualized as the interaction of complex 3D shapes which represent the content of thought (ideas, perceptions, emotions etc) - and intelligence is the speed with which all this happens.

High intelligence means that more interactions can occur within a given size (and duration) of Working Memory; while a larger WM means that, for a given level of intelligence, more things can be thought-about simultaneously.
*
(Something to flag up: In a computer analogy, intelligence might be equivalent to the speed of a microprocessor, in terms of the efficiency and complexity of its circuitry; while working memory might be equivalent to a microprocessor's cache memory. So intelligence is how quickly the microprocessor can do operations; while WM represents the amount of information which can be included in active processing at a given time. But I will leave critique and development of that analogy to those who know more about computers than I do - which is nearly-everybody.)

*

So, the highest level of thinking - such as creative genius - would seem to need both high intelligence and a large-capacity Working Memory; since the WM would allow a person to hold many things simultaneously in mind, including emotional evaluations - while high intelligence would allow these things to interact complexly within the short time-frame of WM.
*

The combination of WM and Intelligence can be regarded as processing power, and it can be seen that various combinations can lead to the same overall power. 

For example, to use some simple numbers to give an idea of ratios - 

An IQ of 3 and a WM of 2 - compared with an IQ of 2 and a WM of 3:

The WM number represents the complexity of content, while the IQ number represents the number of iterations of processing. 

*

So, when the WM is half as much again (3 compared with 2), that means more possible combinations between the items being processed; and when the IQ is increased by half, there are one and a half times as many iterations of processing.

A WM of 3 and IQ of 2 might be 2 iterations of 3 --

2 X 3 = 6 as a number representing power.


By contrast, a lower WM of 2 and IQ of 3 has one and a half times the number of iterations - therefore three iterations instead of two, but with a lower WM number to represent less complex content --

2 X 2 X 2 = 8 as a number representing power
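
To pin down these toy numbers, here is a rough sketch in Python (an illustration only): WM is treated as the number of items held at once and IQ as the number of processing iterations, and each function simply reproduces the combining rule used in the corresponding example above.

def power_per_pass(wm_items, iq_iterations):
    # Rule of the first example: each iteration works over the WM content once.
    return wm_items * iq_iterations

def power_compounding(wm_items, iq_iterations):
    # Rule of the second example: the WM content recombines on every iteration.
    return wm_items ** iq_iterations

print(power_per_pass(3, 2))     # WM 3, IQ 2 -> 2 iterations of 3 items = 6
print(power_compounding(2, 3))  # WM 2, IQ 3 -> 2 x 2 x 2 = 8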

*

So, in percentage terms, IQ (general intelligence) seemingly has a greater influence on processing power, because it allows more iterations of processing.

However, the quality of thinking of a WM 3/IQ 2 person will be different from that of a WM 2/IQ 3 person - I would guess that relatively higher intelligence would lead to a more linear, narrow and logically-extrapolative style of thinking; while relatively higher WM would, I think, give a more associative style - better at multifaceted judgment.

All speculative stuff! But let's see where it leads...

*

Monday, 24 March 2014

Researching the decline of intelligence measured by reaction times. Where next?

*

Now that the approximate magnitude of the previously estimated slowing of simple reaction times over the past century or so has been confirmed

http://iqpersonalitygenius.blogspot.co.uk/2014/03/further-evidence-of-significant-slowing.html

the evidence of a significant decline in intelligence since Victorian times can no longer be dismissed or ignored.

So, the question arises what next?

1. Researchers other than myself and Michael A Woodley's group need to get involved. So long as all the results come from one group, uncertainty remains; independent testing and replication should be attempted.

2. While there is probably no more historical reaction time data to be had, the LPCSO method of comparing longitudinal with cross-sectional data could be applied to further samples of simple reaction times - and to other possible objective measures which are correlated with general intelligence and have plausible biological links to intelligence.

3. Further historical data on the effects or outcomes of intelligence may be discovered, to supplement Woodley's reanalysis of innovation rates and the incidence of creative geniuses. Any quantifiable human activity or achievement which depends strongly upon intelligence ought to show evidence of decline in line with slowing simple reaction times.

Woodley MA, Figueredo AJ. Historical variability in heritable general intelligence: its evolutionary origins and socio-cultural consequences. University of Buckingham Press, 2013.

4. The effects of normal ageing on simple reaction times need to be known with more precision - age of onset, shape of curve, sex differences and so on.

5. The quantitative relationship between simple reaction time and currently-measured IQ needs to be known - so as to make a valid conversion formula. The sRT-IQ correlation coefficient is too low to make such a formula useful for individuals, but in terms of group averages it could be valuable. Such a conversion formula might turn out to be non-linear - and it opens the possibility of measuring (group) intelligence on an interval or even a ratio scale. (A toy linear version is sketched after this list.)

http://iqpersonalitygenius.blogspot.co.uk/2013/02/the-ordinal-scale-of-iq-could-be.html

6. At a deeper level, an understanding of the relationship between general intelligence and reaction times needs development - in particular, can 'g' be coherently defined in terms of the objectively measurable speed of processing? What is the minimum possible sRT? What is the effect of slowing sRT on intelligence, in terms of interactions with other cognitive constraints?

7. Assuming it is agreed that intelligence has declined very substantially over the past 150 years or so, then the mechanism of this decline needs elucidation - since the rate of decline seems to be faster (maybe even twice as fast?) than the rate predicted by the differential reproductive success of people with different IQ. My preferred explanation is that intelligence is being damaged by the generation-upon-generation accumulation of novel deleterious mutations (mutations which would, through most of history, have led to a high probability of early death during childhood - and thereby to the filtering of such mutations from the gene pool).
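
As a purely hypothetical illustration of point 5: a linear group-level conversion could be anchored on the two averages quoted elsewhere in these posts - a Victorian mean sRT of about 180 ms at a modern-normed IQ of roughly 115, and a modern mean of about 250 ms at IQ 100. The rough sketch below (Python) assumes exactly those two anchor points, although the real relationship may well be non-linear.

VICTORIAN = (180.0, 115.0)  # (mean sRT in ms, modern-normed IQ) - assumed anchor point
MODERN = (250.0, 100.0)     # assumed anchor point

def group_iq_from_srt(srt_ms):
    """Group-average IQ implied by a group-average simple reaction time (linear model)."""
    (x1, y1), (x2, y2) = VICTORIAN, MODERN
    slope = (y2 - y1) / (x2 - x1)   # about -0.21 IQ points per millisecond of slowing
    return y1 + slope * (srt_ms - x1)

print(group_iq_from_srt(215))  # midway sample: about 107.5
print(group_iq_from_srt(300))  # a slower modern sample: about 89

Applied to individuals this would be worthless, for the reason given above; but for group averages it shows the sort of calibration that might be possible.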

*

Tuesday, 18 March 2014

Further evidence of significant slowing of reaction times, and decline of intelligence, over recent decades in the UK: a method comparing longitudinal prediction with cross-sectional observation (LPCSO)

*

This is a re-analysis of the data from: Deary IJ, Der G. Reaction time, age and cognitive ability: longitudinal findings from age 16 to 63 years in representative population samples. Aging, Neuropsychology and Cognition. 2005; 12: 187-215.

The principle is that the slowing of simple Reaction Times (sRTs), measured in longitudinal follow-up studies of segments of the human lifespan, may be interpolated to predict the expected slowing in sRT between ages 16 and 63; this prediction is then compared with the sRT actually measured in cross-sectional studies at ages 16 and 63.

In other words, the longitudinal data is used to construct an 'ageing curve' which describes the expected slowing of reaction times through the lifetime of an average person. But it was found that the measured reaction times of elderly people were considerably faster than would be expected from the effect of ageing of the youngest cohort - consistent with a generation by generation slowing in sRT. 

The difference between the predicted and observed sRT of elderly people is a measure of the slowing of sRT (ie. 'secular' change, or presumed dysgenic change) over the span of 47 years - and this can be extrapolated to estimate the slowing of sRT expected over longer periods (assuming that the rate of sRT slowing is constant).

This Longitudinal Prediction Cross-Sectional Observation (LPCSO) method predicts a slowing of sRT of about 80 ms in a century, which is very similar to the measured difference of 70ms slowing between Victorian and modern sRT. 

This confirms the very substantial slowing of sRT since the 1800s, from about 180 ms to about 250 ms (a slowing of one standard deviation or more of modern sRT), which must surely correspond to a significant decline in general intelligence.

The LPCSO method could be used on other sRT data sets to check this result - and also applied to other possible measures of dysgenic or secular trends.

I use the longitudinal data (16-24, 36-44, 56-63) from women only (see note below) to generate three measures of sRT slowing, expressed in ms/year:

16-24: slows from 295 to 306 ms = 1.375 ms/year over 8 years
36-44: slows from 315 to 332 ms = 2.125 ms/year over 8 years
56-63: slows from 345 to 375 ms = 4.286 ms/year over 7 years

To interpolate the slowing over the gaps from age 24-36 and from 44-56, I simply averaged the rates on either side of each gap - so:

24-36 slows at a rate of 1.750 ms/year
44-56 slows at a rate of 3.206 ms/year

Each of these gaps is 12 years (longer than the 7 or 8 years of the longitudinal studies), so the rate per year is multiplied by 12.

So to make the graph we have six points (ages 16, 24, 36, 44, 56 and 63), with the following amounts of slowing over the five intervals - in ms:

16-24 - 11ms
24-36 - 21ms
36-44 - 17ms
44-56 - 39ms
56-63 - 30 ms

Total predicted slowing of sRT from 16 to 63 = 118 ms - starting from 295 ms at age sixteen, this would lead to an expected sRT of 413 ms at age 63.

Predicted sRT at each age:

Age    Predicted sRT (ms)
16     295
24     306
36     327
44     344
56     383
63     413


But the actual measured sRT at age 63 was 375 ms.

Difference between expected and observed simple RT: 413 ms - 375 ms = 38 ms in 47 years

- which is an extrapolated slowing of sRT of about 80 ms in a century.
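
For anyone who wants to check or re-use the arithmetic, here is a rough sketch in Python (an illustration only, not part of the paper in preparation) reproducing the calculation from the three observed cohort slowings:

observed = {            # age range: (start sRT in ms, end sRT in ms, span in years)
    (16, 24): (295, 306, 8),
    (36, 44): (315, 332, 8),
    (56, 63): (345, 375, 7),
}

# observed slowing rates in ms/year
rates = {k: (end - start) / years for k, (start, end, years) in observed.items()}

# interpolated intervals: average of the neighbouring rates, applied over 12 years
rates[(24, 36)] = (rates[(16, 24)] + rates[(36, 44)]) / 2
rates[(44, 56)] = (rates[(36, 44)] + rates[(56, 63)]) / 2

predicted = {16: 295.0}
for (a, b) in sorted(rates):
    predicted[b] = predicted[a] + rates[(a, b)] * (b - a)

gap = predicted[63] - 375.0            # predicted minus observed sRT at age 63
per_century = gap / (63 - 16) * 100    # extrapolated slowing per 100 years

print(round(predicted[63]))   # about 412-413 ms (rounding the interval slowings gives 413)
print(round(gap))             # about 37-38 ms over 47 years
print(round(per_century))     # about 80 ms per century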

[Figure: predicted versus measured sRT plotted against age]

Circles and dotted line = measured sRT in the three different age cohorts.
Crosses and solid lines = predicted sRT with increasing age.



This is much the same as the amount of decline detected in the previous study, which measured approx 70 ms of slowing between the Victorian and modern sRTs.

http://iqpersonalitygenius.blogspot.co.uk/2012/08/objective-and-direct-evidence-of.html

The next step is to locate other data suitable for this LPCSO method of comparing the longitudinal prediction of change with cross-sectional observations between different generations, to test this estimate and to look at other potential variables.

[Note: the above is a partial and preliminary version of a paper currently in preparation with several other authors.]

**


Note on the decision to analyze only the female data, and to exclude the male data.


This is to clarify that the exclusion of the male data from the above Deary & Der re-analysis, and the decision to focus only on the female subjects, was made before embarking upon the analysis; it was therefore not a post hoc decision made after seeing the results.

The youngest male age cohort is 16-24; and I have been convinced by the work of Richard Lynn that men mature in terms of IQ significantly later than women, and that average native British men are probably not cognitively mature until at least age 18.

I therefore decided to omit the 16-24 age group from analysis, on the assumption that it would contain a significant proportion of cognitively-immature men whose IQ had not yet reached its maximum level (thus the 16-24 group would potentially contain people whose IQs were still rising, and others who had already begun to decline with ageing - thereby obscuring the effect of ageing).

By contrast, my understanding was that a large majority of women would have reached cognitive maturity (and maximum IQ) by age 16 – so there was no problem with including women from the 16-24 age cohort.

Having decided to delete the age 16-24 cohort for the men, I was left with only four male data points for reaction times – and therefore just a single internal comparison for the analysis of longitudinal versus cross-sectional change – i.e. that comparison bounded by the 46-54 and 56-63 age cohorts. A graph made of just 4 points spread over just 27 years seemed clearly inadequate for the analysis I envisaged; and there was no internal replicate for the predicted versus actual change in reaction times between cohorts.

Therefore I discarded the male data and analyzed only the females.


*

Friday, 14 March 2014

Creativity is invisible, deniable, inevitably misunderstood - yet vast in impact

*

Creativity is, in practice, culturally invisible - although its impact may be seismic.

This is best seen in technologies - where the effects are most apparent and where the archaeological and historical record is of most value. 

*

The great mass of truly creative breakthroughs in history are unattributed - the men who made them are forgotten, their names were not attached to their creative acts.  This enables credit to be reassigned to 'the folk' or 'culture' - but all actually known-about breakthroughs seem to be attributable to one, or at most two, men.

*

Creative breakthroughs are extremely difficult and rare - as is shown by the centuries, perhaps even millennia, of stasis which are then suddenly broken by simple breakthroughs - bow and arrow, arch, stirrup, new shapes of plough.

As soon as the breakthrough has been made into an artifact, then it is obvious - many people can understand it, many people can make it, and almost everybody can use it.

Why give special credit to someone just for discovering something obvious?  

So, once the creative breakthrough has been made, by one unattributed man perhaps, its effects can rapidly spread, even across the whole world - human life may be transformed by a single anonymous breakthrough.

*

Anonymous creative breakthroughs are a sufficient basis for mass cultural change. The mis-match between the obscurity of the individual creator and the vast consequences of that breakthrough really cannot be exaggerated. 

Yet many or most cultures show no evidence of any creative breakthroughs at all - presumably because they utterly lacked creative people. These cultures had sufficient ability to manufacture, train and use technologies of a certain type - and to pass on that knowledge between generations in a stereotypical fashion - but no more.

That is the norm for human history. That is the situation for most people who have ever lived.

*

So, creative breakthroughs are almost always deniable. As soon as the breakthrough has been made, within minutes perhaps, the extraordinarily rare and special nature of its occurrence is deniable.

Indeed, creativity is deniable largely because it is so rare - few can appreciate that which they cannot do. Alternative explanations are almost-always preferred - creativity is almost always explained-away - especially by the perennial and utterly false cry: 'but it was obvious!'

*  


Sunday, 9 March 2014

What is the potential evidence AGAINST a one standard deviation decline of intelligence over the past 150-200 years?

*

Here is a list of some objections to and evidence against the assertion that average Victorian IQ would have been measured at one SD higher than moderns - that is at a modern IQ of 115 or more.

My comments follow [in square brackets]

*

1. The decline of intelligence is too fast to be accounted for by known mechanisms related to differential reproductive success between the most and the least intelligent people.

[I agree, that mechanism only accounts for about half the rate of decline required to produce 1 SD slowing in simple reaction times, hence intelligence. Another mechanism, or more than one extra mechanism, is required. I favour the accumulation of deleterious (intelligence damaging) mutations generation upon generation, due to very low child mortality rates since 1800, compared with all previous times in history.]

*

2. A 1 SD decline in intelligence since Victorian times would lead to a collapse of high level intellectual activity such as the number of creative geniuses and the rate of major innovations...

[I agree - it would lead to collapse...]

but this collapse has not happened - therefore there cannot have been a 1 SD decline.

[But my interpretation is that collapse has happened: the number of creative geniuses has collapsed and so has the rate of major innovations. Unless we are fooled by hype, or the self-interested self-promotion of insiders, I think this collapse is very obvious indeed across the whole of Western culture. I was writing about this collapse for many years before I came across the evidence of reducing intelligence - but I was trying to explain it in other ways such as the decline in scientific motivation, honesty, institutional factors, modern fashions, bureaucratization, Leftism etc. But the data for intellectual collapse are solid: what is in dispute are the best explanations.]

*

3. Intelligence has been rising, not falling, in developed countries - as evidenced by the rising average IQ test scores - a phenomenon usually called The Flynn Effect.

[I agree that average IQ test scores rose through the twentieth century - but this was a matter of rising test scores; meanwhile average intelligence was declining. In other words, test scores were subject to inflation - or more accurately stagflation: as when prices are rising but economic production is declining. IQ test scores were rising, but real intelligence was declining.]

*

4. The evidence of slowing simple reaction times is not valid, because measurements and sampling methods in Victorian times are too different from modern measurement and sampling methods.

[Michael A Woodley and I have argued that these micro-methodological quibbles are inappropriate and invalid - and I think we have refuted them.]

*

5. Simple reaction times are not a sufficiently accurate, or valid, measurement of intelligence. In fact the idea that reaction times measure intelligence is obvious nonsense, because the best fist fighters and athletes have the quickest reactions, so they would have to be the most intelligent people - but they aren't...

[Simple reaction times are nothing to do with what the general public thinks of as 'quick reactions', and nothing to do with athletics, sports, or that kind of thing. Since the mid-1800s it has been known that differences in simple reaction time - such as the time taken to press a button after seeing a light flash - are correlated positively with differences in intelligence. The correlation is not very tight, there is a lot of scatter around the line, but there always is a correlation - and average sRT differences accurately predict measured intelligence differences, both between individuals and between groups such as class, sex and race. Nobody who knew the field disputed the robust correlation between sRT and IQ - and many of the main scholars (such as Jensen) have assumed that the reason for the correlation was causal: that sRT reflects speed of neural processing, which is a fundamental aspect of general intelligence. It is dishonest scientific practice to overturn more than a century of good research just because the sRT results go in a direction that you find surprising.]

*

6. One SD slowing in sRT does not necessarily imply a 15 point reduction in IQ.

[I agree, because IQ is not a 'real' interval scale - which means that the difference in intelligence measured by 1 IQ point is not known, and presumably varies at different points on the scale. Reaction time is, however, an interval scale - measured in milliseconds. I have assumed that sRT should therefore take priority as the more valid scale, and that IQ should be calibrated against sRT. Therefore I argue that if sRT has slowed by about one SD, then this should be understood to mean a one SD decline in real intelligence.]

*

7. An sRT slowing of about 70 milliseconds between the 1880s and nowadays may average at about 1 IQ point per decade, but this does not necessarily imply a linear rate of decline - the rate of change may vary.

[I agree. The actual rate of decline will depend on the main causes of decline. This is not known. Indeed, if I am correct that a generation-upon-generation accumulation of deleterious, intelligence-damaging gene mutations is an important factor - the way that this works is not known. My feeling or hunch is that this kind of effect would not be linear, but that the incremental amount of damage would increase with each generation - perhaps exponentially, or by some other accelerating rate. So that if there were 2 new deleterious mutations per generation, then 4 would be more than twice as harmful as 2; and 8 would be more than twice as harmful as 4 - and so on. So the rate of decline of intelligence (and slowing of sRT) over 150 years need not be linear - but I would guess it is accelerating.]
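
Purely to illustrate the point about non-linearity, here is a toy sketch in Python comparing a constant schedule of slowing with an accelerating one, both totalling about 70 ms over 150 years; the six-generation split (about 25 years each) and the 1.5x growth factor are arbitrary assumptions for illustration only.

GENERATIONS = 6            # assumed: 150 years at roughly 25 years per generation
TOTAL_SLOWING_MS = 70.0    # approximate Victorian-to-modern slowing of sRT

# constant rate: the same increment in every generation
linear = [TOTAL_SLOWING_MS / GENERATIONS] * GENERATIONS

# accelerating rate: each generation's increment is 1.5x the previous one,
# scaled (geometric series) so that the total is still 70 ms
ratio = 1.5
first = TOTAL_SLOWING_MS * (ratio - 1) / (ratio ** GENERATIONS - 1)
accelerating = [first * ratio ** g for g in range(GENERATIONS)]

print([round(x, 1) for x in linear])        # 11.7 ms in every generation
print([round(x, 1) for x in accelerating])  # roughly 3.4, 5.1, 7.6, 11.4, 17.1, 25.6 ms
print(round(sum(accelerating), 1))          # 70.0 - same endpoint, very different path

The same 150-year endpoint is therefore compatible with most of the damage being recent - which would matter for any attempt to extrapolate forwards.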

*

8. There is just not enough evidence. One historical study with not very many data points is not enough to overturn the consensus from the Flynn effect studies that intelligence is rising.

[Fair point - except that the current consensus is not very secure, since confidence that rising IQ test scores really mean rising 'g' (general intelligence) has never been very high. But on the other hand, the sRT historical evidence of declining intelligence is too strong to ignore. The best response is to seek further ways of confirming the decline in intelligence using different data and methods. That is what Michael A Woodley and I are doing, as best we may - but it would be great to have other people also working on the problem.]

*

Saturday, 8 March 2014

What do YOU believe about the reported slowing of average simple Reaction Times and the (?One Standard Deviation) decline in intelligence since Victorian times?


*

1. Do you believe that Victorian simple reaction time (sRT) data are not comparable with modern data? If so, would you be convinced by evidence of rapidly slowing reaction times over recent decades, measured in one laboratory and using only modern RT machines? Because this kind of evidence is in the pipeline.

2. Do you believe that - despite about 140 years consensus that sRT and IQ are significantly correlated, and the general belief that this correlation is because general intelligence is dependent upon processing speed of which sRT is an indirect measure - there is NOT a causal relationship between simple reaction times and intelligence? That, therefore, average sRTs could be getting much, much slower but that this would not necessarily make any difference to average intelligence?

3. Do you believe that the measured slowing of average simple reaction time - from a Victorian average of about 180 milliseconds (in several independent studies) to a modern average of 250 milliseconds or slower, a slowing of 70 milliseconds plus - is not enough to be of interest: that it is too small to reflect any significant or meaningful reduction in intelligence?

4. Do you believe that because the measured slowing of sRT over the past 150 years seems unexpected, is larger than you would have supposed possible, and indeed strikes you as ludicrous - that therefore we should simply ignore it?

5. Do you believe that - because the data on long-term sRTs seem anomalous with your world view - we should assume that somehow there is something wrong somewhere with the Victorian-to-modern comparison; and therefore carry on just as if we knew nothing about longitudinal changes in sRTs?

6. Do you believe that there has been a significant reduction in average general intelligence over the past 150 years, but that it is much less than one standard deviation - probably more like HALF a standard deviation? And the large size of the sRT slowing is just a Red Herring?

7. Do you believe that average intelligence has NOT changed over the past 150 years - that moderns have the same intelligence as Victorians? And the slowing of average sRT is irrelevant?

8. Do you believe that average intelligence has increased over the past 150 years despite slowing of sRTs, because you believe the pen-and-paper IQ tests are more valid, reliable and/or objective than reaction time data?

9. Or something else, or what?

*

Greg Cochran, slowing of simple reaction times and the 1SD decline in intelligence over the past 150-200 years

*

Greg Cochran has been the most significant (intellectually substantial) critic and opponent of the idea (deriving from myself and Michael A Woodley) that historical reaction time data have shown a significant (approx. one standard deviation or 15 modern IQ point) decline in intelligence since Victorian times. 

[http://charltonteaching.blogspot.co.uk/2012/06/taking-on-board-that-victorians-were.html]

In his latest blog posting, Greg takes another side swipe at the idea.

http://westhunt.wordpress.com/2014/03/05/outliers/

Here is my comment in response.

*



@Greg

As you presumably know, I have an extremely high regard for your work (e.g. having provided a back-page blurb for The 10,000 Year Explosion, and having invited you to write for Medical Hypotheses on the germ theory of male homosexuality).

And I am – on the whole! – grateful for your opposition to the finding of an approximately 1 SD (15-plus IQ points by modern measurements) decline in general intelligence in England (and similar places), as measured by simple reaction times, since about 150-200 years ago – grateful because it has stimulated me to organize my thoughts on the subject.

But I continue to think you are wrong! and that the evidence you bring against this decline is inadequate – so I continue to hope to persuade you otherwise.

I have three considerations to offer.

*

1. The decline in question is (roughly) from IQ 115 to IQ 100 over the space of 150 years – about one IQ point per decade (whatever that means!). But I suggest that this would not be expected to have functional consequences analogous to those of a decline from 100 to 85, since IQ is not an interval scale.

(In a nutshell, I think Victorian English IQ was *about* the same or a little more than recent Ashkenazi IQ – but has declined.)

This 150 year decline measure in modern IQ units corresponds to a slowing of simple reaction times from approximately 180 to 250ms for men – about 70 milliseconds.

And the minimum RT in the Victorian studies was about 150 ms – which is probably near the physiological minimum RT (and maximum real underlying IQ) constrained by the rate of nerve transmission, length of nerves, speed of synapse etc.

So average Victorian RT was about 30 ms above minimum RT, while modern RT is about 100 ms above minimum.

By contrast – modern reaction times (in Silverman’s study) for men average approximately 250ms with a standard deviation of 50ms – however there are good recent studies with an average RT of 300ms for men.

I would argue (on theoretical grounds) that as RT slows there 'must' come a point when it comes up against the neural constraints of intelligence, such as short-term/'working' memory (the mental 'workspace', activation of which lasts a few seconds, seemingly) – and therefore there would be a non-linear effect of reducing intelligence – intelligence would cross a line and fall off a cliff.

My assumption is that a reduction in (modern normed) IQ from average 115 to 100 would *not* have such a catastrophic effect on high level intellectual (abstract, systemizing) performance as a reduction from average 100 to 85. (At a modern average IQ of 85, top level intellectual activity is *almost* entirely eliminated.)

When we are dealing with the intellectual elites, the same effect may be more apparent – the initial slowing of RT may still leave open the possibility of complex inner reasoning; while beyond a certain threshold the number of possible operations in the mental workspace would drop below the minimum needed for high-level intellectual work.
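
A toy way of putting numbers on that 'cliff' (an illustration only, in Python): take the figures above, assume the mental workspace stays active for about two seconds - an assumed figure, the text above says only 'a few seconds' - and count how many serial steps of one sRT each would fit inside it.

WORKSPACE_MS = 2000.0   # assumed duration of workspace activation ('a few seconds')
MIN_RT_MS = 150.0       # approximate physiological minimum sRT quoted above

for label, srt in [("Victorian average", 180.0),
                   ("modern average", 250.0),
                   ("slower modern samples", 300.0)]:
    headroom = srt - MIN_RT_MS   # milliseconds above the physiological floor
    steps = WORKSPACE_MS / srt   # crude count of serial operations per workspace
    print(f"{label}: {headroom:.0f} ms above minimum, about {steps:.1f} steps")

# Victorian average: 30 ms above minimum, about 11.1 steps
# modern average: 100 ms above minimum, about 8.0 steps
# slower modern samples: 150 ms above minimum, about 6.7 steps

If some minimum number of steps were needed to sustain a long chain of inner reasoning, then falling below that number would have an abrupt rather than gradual effect - which is the non-linearity being argued for.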

*

2. It may be that your example of maths does not refute the observation of reduced intelligence. It may be that modern mathematical breakthroughs are of a different character from the breakthroughs of the past – and do not require such high intelligence.

I think this may be correct in the sense that I get the impression that modern maths seems to be substantially a cumulative, applied science – somewhat akin to engineering in the sense of bringing to bear already existing techniques to solve difficult problems.

So a top level modern mathematician has (I understand) spent many years of intensive effort learning a toolbox of often-recently-devised methods, and becoming adept at applying them, and learning by experience (and inspiration) where and how to apply them.

This seems more like the Kuhnian idea of Normal Science than the Revolutionary Science of the past – more like an incremental and accumulative social process, than the individualistic, radical re-writings and fresh starts of previous generations. And, relevantly, a method which does not require such great intelligence.

I also note that many other sciences, from biology to physics, have observed the near-disappearance of individual creative genius over the past 150 years – and especially obviously with people born in the past 50 or so years - which would be consistent with reducing intelligence.

*

3. Michael Woodley and I have discovered further independent – but convergent – evidence consistent with about a 1 SD (15 IQ point) decline in intelligence from Victorian times, again using simple reaction time data – but, as I say, using a completely different sample and methods. The paper is currently under submission.

I mention it because the unchallenged consensus post-Galton has been that simple reaction times have some causal – although not direct – relationship to intelligence; and if we have indeed established that RT has substantially slowed over recent generations, then either this would need to be acknowledged as implying a similarly substantial decline in intelligence – or else the post-Galton consensus of IQ depending on RT would need to be overturned.

**

Note added: An e-mail correspondent writes:

Take the following claim of Cochran's:

"In another application – if the average genetic IQ potential had decreased by a standard deviation since Victorian times, the number of individuals with the ability to develop new, difficult, and interesting results in higher mathematics would have crashed, bring [sic] such developments to a screeching halt. Of course that has not happened."

Cochran is completely correct in his reasoning, and in his prediction that higher mathematics would have crashed given a one sigma decline in g. His last sentence is however empirically false, because a crash is precisely what the data indicate happened.

Charles Murray, in his 2003 Human Accomplishment presents graphic data of the rate of eminent mathematicians and major accomplishments in mathematics (p. 313). The trends reveal a precipitous decline in the occurrences of both of these between the years 1825 and 1950. Extrapolating the decline in this period out to the year 2000 would place the rate of eminent mathematicians and their accomplishments below the rate observed in 1400, despite massive population growth in the West during this interval. The peak of mathematical accomplishment clearly occurred during the heyday of eugenic fertility in the West, between 1650 and 1800, and actually occurred earlier than the peaks experienced in other areas of science and technology, perhaps suggesting greater sensitivity to shifting population levels of g (a testable prediction incidentally).

These data completely concur with my sense that modern 'mathematics' has stagnated. There are virtually no valid proofs being offered for the long-standing mathematical problems these days. Six of the seven Millennial prize problems remain unsolved. More worrying still, no one seems to have grasped the enormity of the problem posed to the foundations of mathematics by Georg Cantor's work on transfinite numbers, and we are no closer to understanding how these fit into the foundations of mathematics today than we were in the 1900's.

The two greatest mathematicians alive today are Andrew Wiles, who solved Fermat's Last Theorem, and Grigori Perelman, who amongst other things, solved Poincare's Conjecture (the only Millennial prize problem to have been unambiguously solved thus far). Of the two of these, Perelman is the only one who would compare favorably with the great mathematicians of the past. Wiles, whilst having undoubtedly made a major discovery, is clearly second rate by historical standards, as he had to marshal enormous amounts of time and effort into solving just one problem, which was not completed until he was more than 40 years old - an achievement pattern atypical of great mathematicians who typically reach peak accomplishment at less than 35 years of age.

That leaves Perelman, who has been prodigious and productive from a  relatively early age. He is nothing if not scathing about the state of modern mathematics either, having claimed the following in a 2006 interview on why he turned down various prestigious mathematics prizes:

"Of course, there are many mathematicians who are more or less honest. But almost all of them are conformists. They are more or less honest, but they tolerate those who are not honest."

This could of course equally well apply to every area of scientific inquiry in the modern world. Data such as those presented by Murray and others clearly reveal that what you have today are hordes of 'mathematicians' who are collectively not one iota as accomplished as the relatively less numerous, but vastly more talented, individuals who dominated this field in centuries past.

Just because these over-promoted self-promoters claim something is 'interesting', 'new' or even a 'breakthrough' in their field doesn't make it so - the decline in eminence in point of fact makes it antecedently highly implausible that 'mathematicians' today are even capable of generating anything approaching a breakthrough (ultra-rare individuals such as Perelman and Wiles excepted).


*  

See also comments at:

http://charltonteaching.blogspot.co.uk/2014/03/comment-to-greg-cochrane-on-decline-of.html 

*