Foolish Book Review: “Cambridge Handbook of Expertise and Expert Performance”
By Anders Bylund
January 30, 2008
Maybe you’re already a great stock picker, or perhaps you’re just starting to dip your toes into the financial markets. Either way, there’s always room for improvement when your personal wealth and retirement income are at stake.
That’s why I wanted to read The Cambridge Handbook of Expertise and Expert Performance, edited by K. Anders Ericsson.
The basics
How would you go about improving your stock-picking skills as efficiently as possible?
The Handbook doesn’t discuss that particular skill very often, but there’s so much agreement among the methods for improving other abilities that it’s easy to connect the dots anyway.
Ericsson and his contributing writers highlight two essential attributes of the successful learner:
– Practice does make perfect
– Developing a talent requires “the support, encouragement, advice, insight, guidance, and goodwill of many others.”
Master-level chess players, for example, didn’t get there by talent alone. Multiple studies show that as much as 70% of a chess master’s skill rating is a direct function of the amount of deliberate practice invested — but tournament experience has almost no effect at all.
Here is the full story.
Do you agree?
That comment is misleading. Many strong players become strong through practice, study, and play – the first two are obvious routes of improvement. The last (tournament play or other serious, timed play) reinforces the practice and study – it forces the mind to perform under pressure. Just playing alone, however, won’t help unless someone is very gifted and a quick study – they’ll lose a lot of games, and even if they take the time to look them over, they won’t understand why they lost, or worse, will come away with a mistaken impression of why.
The other examples used in the article are also a bit “off.” Playing with a symphony or other group activity does benefit from practice – the practice of working with others. Playing alone or with a metronome is quite different from playing with an orchestra and a human conductor. One could develop ground strokes in tennis by hitting a ball against a wall or machine, but playing a match where the opponent is deliberately varying angles, speed, and spin requires a different set of skills.
So – not a bad point, but the examples are “off”
I agree with the earlier post.
Though on a simpler level, if success at chess is purely measured by one’s rating, then one gets to said rating by essentially winning those desirable fractions of “K” in head-to-head competition. The very idea of a “master who didn’t play tournaments to become a master” is a logical fallacy!
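The “fractions of K” mentioned above refers to the standard Elo update rule, where a player’s rating moves by at most K points per game. A minimal sketch, assuming the common logistic expected-score formula and an illustrative K of 16:

```python
def elo_expected(r_a, r_b):
    """Expected score for player A against player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400.0))

def elo_update(r_a, r_b, score_a, k=16):
    """A's new rating after one game; score_a is 1 for a win, 0.5 draw, 0 loss."""
    return r_a + k * (score_a - elo_expected(r_a, r_b))

# Two equally rated players: the winner gains exactly half of K.
print(elo_update(1500, 1500, 1.0))
```

Since rating points can only be won or lost in rated head-to-head play, this is the commenter’s point: a tournament rating cannot, by definition, be earned without playing.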
“as much as 70% of a chess master’s skill rating” …
what kind of statement is that?!
70% of ELO 2800 = 1960, which leaves the world’s best at ELO 840 if they don’t practice. Total nonsense.
Save time – ignore the whole article.
Wow, so many Chess Grand Masters in this blog thread! I am in awe of their collective wisdom.
I am somewhat familiar with Ericsson’s work, and in other papers he specifically mentions stock picking as one of the areas where his ideas about expert performance might *not* be applicable. So that’s one problem with the referenced article.
Two – I think there is validity to the assertion that tournament play alone does not lead to improvement, or at least that tournament play does not directly lead to improvement. Instead, players must analyze and study their games in order to get the most benefit out of the experience.
With respect to the 70% figure – there was a survey asking chess players how much they studied over the course of their playing careers, how much they played in tournaments, whether they had a coach, at what age they started playing, etc. The variable with the strongest predictive value was total hours spent in “deliberate practice.” That, I believe, is how they arrived at the 70% figure. Also, amount of tournament experience did not correlate strongly with rating.
I’ll see if I can find one of the articles on the web…
Me again – here’s a link to the article I had in mind:
http://search.yahoo.com/search?p=chess+skill+acquisition+scientific&fr=yfp-t-501&toggle=1&cop=mss&ei=UTF-8
I’d be curious to read the source of the 70% figure. Sadly, the author of the article did not provide it. )=
Try looking at “The Role of Deliberate Practice in Chess Expertise,” by Neil Charness et al., Applied Cognitive Psychology 19, 151-165 (2005). I don’t know if that is the source for the 70% figure, but the paper strongly supports the position that study trumps tournament experience.