Tales from the Hairy Bottle

It's a sad and beautiful world

Thursday, May 26, 2005

An article on the BBC website last week reported that half of the population of the earth now lives in cities. This represents an amazing change in natural habitat for human beings, only 14% of whom were city-dwellers in 1900. The report brought to mind an article by Mike Davis entitled Planet of Slums, which I read in the New Left Review some time back. In the piece, Davis pulls together some astonishing statistics on urbanisation and its effects on the world's poor. I understand that he is now writing a book based on the article which is due out this month. The Davis article is my principal source for the statistics in this post, and I recommend it as a great read. Here are some of the facts and figures relating to urbanisation:-

  • While the total population of the world only reached one billion in 1804, the world's urban population alone had grown to two billion by 1985 and three billion by 2002.


  • The present urban population (3.2 billion) is larger than the total population of the world in 1960. The global countryside, meanwhile, has reached its maximum population (3.2 billion) and will begin to shrink after 2020. As a result, cities will account for all future world population growth, which is expected to peak at about 10 billion in 2050.


  • Cities have absorbed nearly two-thirds of the global population explosion since 1950 and are currently growing by a million babies and migrants each week.


  • By 2025, according to the Far Eastern Economic Review, Asia alone could have ten or eleven conurbations with more than 20 million inhabitants (the estimated urban population of the world at the time of the French Revolution).


  • The population of Lagos, Nigeria, has grown from 300,000 in 1950 to 10 million today. Lagos sits within a slum/shanty town corridor, housing a population of 70 million people, which stretches from Abidjan in Cote d'Ivoire to Ibadan, Nigeria.


    The impact of these statistics is liable to be clouded by our common conceptions of the relative merits of rural and urban life in the non-industrialised world. It is easy to fall into the trap of comparing the starving subsistence farmer with a city-dwelling factory worker who, albeit underpaid and overworked, at least has a roof over his head and food on the table. Television news normally brings us reports of famine from the fields and villages, not from cities, and thus our stereotype of Third World poverty is associated with rural lifestyles, while those in cities are assumed to keep themselves busy and get by. The truth, however, is not so simple.

    Our point of reference for rapid urbanisation is the Industrial Revolution, which pulled much-needed workers from the fields and into the factories. Living and working conditions were appalling in the early days, but the wheels were set in motion for the new working class to bootstrap themselves towards much higher standards of living in a newly industrialised society. Even in countries such as Ireland, where the urban sector could not cope with the influx from the countryside, the promise of a new life in the empty New World countries offered a safety valve which is no longer on offer due to the 'gated community' immigration policies of today's developed nations.

    The problem is that the current wave of humanity crashing onto the shores of our cities is arriving to find no signs of any industrialising boom. Instead, exacerbated by the neoliberal experiments perpetrated by the IMF/World Bank, many cities are experiencing negative growth and are shedding jobs rather than creating them. Average incomes in sub-Saharan Africa, for example, have fallen by around 20% since 1980. As the UN report The Challenge of Slums puts it:

    Instead of being a focus for growth and prosperity, the cities have become a dumping ground for a surplus population working in unskilled, unprotected and low-wage informal service industries and trade... The rise of [this] informal sector is... a direct result of liberalization.

    There are nowhere near enough jobs to go round. For example, in Zimbabwe in the early 1990s only around 10,000 new urban jobs were created per year in the face of an urban workforce increasing by more than 300,000 per annum. The result is that people are pushed into the 'informal' economy, working for example as street vendors, collecting recyclable waste, panhandling or indulging in petty criminal activities such as black marketeering or prostitution. In short, denied the opportunity of a job, these people have to use their own initiative to find any opportunity to make the money to survive. The UN estimates that informal workers make up about two-fifths of the economically active population of the developing world. In Latin America, 57 per cent of all jobs lie in this sector, and 80 per cent of all new 'jobs' are in the informal economy. There is barely enough money for subsistence-level rations, never mind rent for accommodation. It is estimated that 85 per cent of urban residents in the developing world occupy property illegally.

    Living conditions in the slum cities are routinely horrific. Two million urban babies die each year due to contamination of water by human or animal waste. 57 per cent of the urban population of the developing world lack basic sanitation. 40 per cent of slum-dwellers, according to the UN, live in such extreme poverty that it is deemed life-threatening.

    The scale of the problem is hard to get one's head around. The UN estimates the total number of urban slum-dwellers in the world at 921 million people - one-sixth of humanity, and 78.2 per cent of the urban population of the least developed countries (rising to over 99% in countries such as Ethiopia and Chad). Incredibly, the figure is projected to rise to 2 billion by the 2030s, and shows no sign of slowing down.

    What hope is there for these people? Very little, it would seem, without radical change in the world economy. The combination of radical free market reforms and privatisation of the public sector at home, while export markets in the industrialised world remain largely closed due to direct and indirect protectionist barriers, has exacerbated rather than improved the situation. No short-term measure or injection of cash is going to resolve the problem, but it is important that measures are taken to at least turn the tide and inject some hope into these communities.

    More than half of the slum-dwellers are under the age of twenty. This represents an enormous force of humanity growing up disaffected and without hope of improvement in their lives. Such forces of disaffection have always sought outlets for their anger and helplessness. The danger is that politics is the opium for these desperate people, while religion is the amphetamine. Palestine, for example, has seen a gradual erosion of support for left-wing activist organisations, while groups such as Hamas continue to increase in popularity based on their combination of humanitarian support in the community and violent jihad against their perceived oppressors. Realistic hope is usurped by fundamentalist blind faith.

    It is important that we provide these people with opportunities to escape from urban poverty. In this context at least, the rhetoric of Bush is correct. Working democracy is an essential pre-requisite to give these people the chance of being heard, and to offer political representation for their cause. Their sheer numbers give them plenty of political clout given a level playing field. The populist regime led by Hugo Chavez in Venezuela, which has implemented significant reforms and aid for the urban poor, is a classic example of what their democratic power can achieve.

    Backing this up, the governments of these countries must be given sufficient support in tackling these issues, not just in terms of aid, but in creating a fair global trading system and offering the necessary expertise where required. Free market dogma needs to be replaced with practical development models to support growth in a sustainable manner.

    Finally, our attitudes to poverty need to evolve. The fact that so many of these slum areas are dangerous no-go zones, and are often less telegenic for those wishing to tug at our heart-strings, should not prejudice our understanding of global poverty in the 21st century. Rural poverty will remain and agricultural famines will still occur, but we should keep in mind that the world poverty crisis has now, by and large, moved to the big city.

    Tuesday, May 17, 2005

    Worldwatch have published Vital Signs 2005, their assessment of global economic, socio-political and environmental trends. The amassing of data from a wide range of sources reveals some fascinating statistics about the state of the world.

  • Global unemployment has risen to 6.2 percent, up from 5.6 percent in 1993. The lack of job opportunities has been linked to ongoing instability in places such as the Middle East, where 58 percent of the population is under the age of 25 and a quarter of working-age youth are unemployed.


  • It is interesting to put this statistic in the context of two policy priorities in the developed world: pension shortfalls due to greying populations, and immigration. There is surely room for sensible reform of the latter to soften the blow of the former.

  • Both emissions and atmospheric concentrations of carbon dioxide (CO2) are accelerating: U.S. energy-related emissions are the highest, and rose 16 percent between 1990 and 2003. China ranks second in its total emissions, up more than 47 percent since 1990. The average atmospheric CO2 concentration has risen 35 percent since the dawn of the industrial age, to 377.4 parts per million by volume in 2004.


  • I am not at all optimistic about these trends reversing in the near future. The US Government is using every trick in the book to deny the self-evident, thus undermining attempts at international consensus. Meanwhile, the rapid development of China and India risks making piecemeal efforts at emissions reduction elsewhere irrelevant in any case. If it is not too late already, the only realistic hope comes from the development of environmentally-friendly energy sources which are attractive on the open market. The future for the global climate looks bleak.

  • World oil consumption surged by 3.4 percent in 2004, the fastest rate of increase in 16 years.


  • In the continental U.S., oil production peaked at 8 million barrels per day in 1970 and fell to just 2.9 million barrels a day in 2004.


  • Production is falling in 33 of the 48 largest oil-producing countries, including 6 of 11 OPEC members.


  • Ever-growing demand chasing increasingly scarce supplies will inevitably lead to conflict and economic turmoil. Surely this creates even more impetus to develop sustainable, economically viable alternative sources of energy. Compared with the amounts spent on other sectors, the sums needed to finance effective development of technologies which we already know hold the key to future energy provision are tiny. The only thing impeding such an approach is pressure from those industries which stand to lose the most from such changes. No prizes for guessing which ones those are.

  • Despite a growing global grain harvest and rising meat production and consumption, the number of hungry people around the world has increased for the first time since the 1970s, to 852 million. Estimates suggest that programs to cut world hunger in half would cost $24 billion annually.


  • The cumulative number of people infected with HIV/AIDS reached 78 million in 2004—nearly double the 1997 total. Big wild cards for the future are China and India, where two-fifths of the world's population lives and where HIV/AIDS surveillance efforts remain inadequate. Spending just $10 billion a year on a global HIV/AIDS program and $3 billion to control malaria in sub-Saharan Africa would save millions of lives.


  • Programs to provide clean water and sewage systems would cost roughly $37 billion annually; to eradicate illiteracy, $5 billion; and to provide immunisation for every child in the developing world, $3 billion.


  • Meanwhile...

  • Military spending also surged: every hour of every day, the world spends more than $100 million on soldiers, weapons, and ammunition.


  • High-income countries, home to only 16 percent of the world’s people, account for $662 billion, or 75 percent, of global military expenditures.


  • Military budgets of high-income countries are roughly 10 times larger than their combined development assistance.


  • It may not be a particularly original point to make, but can it be stated too many times? According to these figures it would cost the world about $82 billion a year ($24bn + $10bn + $3bn + $37bn + $5bn + $3bn) to halve world hunger, save millions of lives senselessly lost to AIDS and malaria, provide universal clean water, and immunise and give basic education to every child in the developing world. This money could be found by reducing global arms spending by less than ten per cent. Anyone else think that a world which provides these basic needs to all its citizens will be more than 10% safer, and less likely to provoke conflict between the haves and have-nots?

    Saturday, May 14, 2005

    Is it possible that a diet of Reality TV and Grand Theft Auto can make you more intelligent? This unlikely theory is put forward by Steven Johnson in his new book Everything Bad Is Good For You. The ideas contained in the book have attracted columns in the New York Times and glowing praise from Malcolm Gladwell in the New Yorker.

    Johnson observes that average IQ scores have steadily risen in the modern era and ties this, among other things, to the increased complexity of the television we watch, and to the interactivity and problem-solving skills fostered by the post-Space Invaders/Pacman generation of video games.

    For example, television shows such as "24", "The Sopranos" and "ER" contain very complex narrative structures. Many simultaneous plots interweave among numerous complex characters. These plots may lie dormant for several episodes, only to reappear suddenly, requiring from the audience an instant recall of the back-story. Dialogue is often replete with technical jargon, which lends authenticity to the drama, but which must be filtered by the audience to pick out any information directly relevant to the story. Johnson contends that shows from previous generations did not challenge the audience to anywhere near the same degree, and that today's shows therefore encourage multi-threaded thinking and enhanced cognitive skills. He even proposes that today's trash is more educational than that of yesteryear, suggesting that reality TV shows encourage thought and analysis of the contenders' social skills and strategies.

    What I can agree with in Johnson's theory is that the post-"Hill Street Blues" genre of multi-threaded drama series places more demands on the viewer than the majority of older popular dramas: "24" is more cerebral than "Starsky and Hutch", and "The Simpsons" is beyond comparison with "The Flintstones". However, it is quite a stretch to use this as evidence of a causal link to increases in intelligence.

    Firstly, there are anomalies in the theory. M*A*S*H had many of the qualities Johnson points to in modern dramas, and was not considered too complicated or morally ambiguous for its enormous audience. "The Twilight Zone" deliberately set out to disorientate its viewers, and episodes stand up to repeated viewing even now. If his theory is correct, would we not expect to see similar developments in cinema? I would suggest that over the same period in which cutting-edge television has become more challenging, movies have dumbed down. My theory would be that, perhaps aided by Hollywood's move toward predictable, feel-good, formulaic storylines, the cream of script-writers has been attracted to jobs in television, leading to a migration in quality from the large screen to the small.

    Johnson likes to compare Reality TV to sports rather than game shows because of the often gladiatorial environment programme makers like to foster. He points to the fact that such shows lead to next-day debates in the office similar to those seen in the wake of sports events. This brings to my mind a comparison perhaps not intended by the author. Dedicated TV sports fans will know the in-depth life stories of hundreds of competitors. Every game is enriched by such history, and further develops the unpredictable plots which interweave continually between these protagonists. Has any drama ever been as complex as a season of Premier League soccer or Major League Baseball? To say that a show involving, say, twenty characters and twenty plot lines is breaking new ground will be news to any armchair sports addict. Watching a game also requires a great deal of focus over an extended period of time if one wishes not to miss anything. Has anyone suggested that such couch-potato habits have improved intelligence?

    An interesting aside is that the trend in the US towards complex drama and comedy has not been evident at all in the UK. In the seventies and eighties, with the notable exception of M*A*S*H, I would say that the general quality of UK comedy was greatly superior to that coming out of North America, if not from the point of view of complexity, then at least in terms of writing and acting. The last twenty years have seen the bar raised considerably by the kind of American shows mentioned above. I can think of no comparable dramas coming out of the UK, and only the odd comedy. Most of the successful British exports are detective shows or costume dramas, enjoyed for their production values, settings and quirky English characters. The majority are based on successful books. "ER", "24", "The Sopranos" etc. are all successful imports, but the idea of home-growing such a style of programme has not been taken up.

    Johnson's other main strand involves the cerebral value of video games. His contention is that the premise of many of the more complex games, in which the player can only advance by solving what are often very complicated problems, is conducive to the development of transferable problem-solving skills.

    This idea seems to me to have more mileage than his TV-related thoughts, at least in theory. I can see the logic in the interactive "sit forward" media of the video game having more potential benefit than the "lean back" media of entertainment television. However, it is unfortunate that the development of whatever skills video games may potentially offer is so often linked to the meting out of violence in the context of the game. I can imagine that creative minds could graft educational goals onto the enjoyable, addictive format of problem-solving and overcoming challenges in a virtual environment. I am sure much work is already being done in this field, and that there will be much valuable mining of this seam as ideas about how such virtual environments can be applied to education are developed.

    Once again, however, the question arises as to whether this suggests a causal link between the playing of such games and measured improvements in intelligence among today's youth. I am still sceptical. I think that technology has undoubtedly had a role to play in improving minds, but not in the way suggested by Steven Johnson. It is more likely that broader technological developments, both at work and at home, have taken over many of the repetitive, time-intensive chores of the past, leaving people to fill their work and leisure time with what are often more mentally challenging activities. Rather than entertainment media pushing the envelope, I'd suggest that increased automation has encouraged our mental development. Modern computing has also encouraged multi-tasking, making our brains more used to spreading our concentration over several subjects at once rather than concentrating in depth on one item. Today's TV producers and game-writers are thus merely finding a market for products which entertain our increasingly multi-threaded thinking patterns. Cognitive changes drive the entertainment market, not the other way round. There may be limited cerebral benefits from such entertainment, but such benefits should not be overstated.

    Complex televisual and gaming media are, without doubt, here to stay and will probably become more intricate as our various media strands continue to coalesce, with television becoming more interactive and virtual reality becoming more real. However well developed these trends become, though, television and video games will never offer the complexities of real life. All the problem-solving skills in the world will never be much use without in-depth knowledge of the subject, and such knowledge will, at least for the foreseeable future, have to be imbibed by traditional means of reading or human instruction. Moreover, excellence in lateral thinking will be of limited use if the beneficiary hasn't developed the social skills to build the human consensus to implement their innovative solution.

    To end on a positive note, the growing popularity of a more mentally challenging brand of entertainment media is to be lauded. If there is a benefit from the popularisation of Steven Johnson's theory, it will be that pastimes such as video gaming will be less stigmatised as time-wasters and considered as potentially more interesting ways to spend some of our leisure time in the future, particularly if the content is made more socially acceptable. However, the dull conclusion I would draw is that participation in such media is not "bad", but is only "good for you" in moderation. The printed (or virtual) word remains the best source of learning, and those seeking interactivity will always find virtual reality a poor substitute for the opportunities provided by the real thing.

    Monday, May 09, 2005

    As President Bush today considers the historical significance of an American president reviewing the procession of Russian troops and armaments in Red Square, I wonder if he will also consider the irony of the fact that, as he and President Putin shoot the breeze, they still hold the anachronistic, but no less dangerous, threat of mutual nuclear annihilation over one another's heads. The danger may not be a direct one - no-one seriously expects a pre-emptive strike from the Russians anymore - but thousands of nuclear warheads are nevertheless still on hair-trigger alert, waiting for someone to decide that some danger is so threatening that it is worth destroying the world for.

    Robert McNamara, secretary of defense in the Kennedy and LBJ administrations, and thus a key figure in determining American nuclear strategy in a period including the Cuban missile crisis and the Vietnam war, writes a piece in the current issue of Foreign Policy entitled "Apocalypse Soon". The article could well be subtitled "How I Learned to Start Worrying and Hate the Bomb". McNamara sternly criticises Bush's investigation of new alternative types of nuclear weapons to add to existing stockpiles and the lack of American leadership in reducing the number of active nuclear warheads:-

    The average U.S. warhead has a destructive power 20 times that of the Hiroshima bomb. Of the 8,000 active or operational U.S. warheads, 2,000 are on hair-trigger alert, ready to be launched on 15 minutes’ warning...On any given day, as we go about our business, the president is prepared to make a decision within 20 minutes that could launch one of the most devastating weapons in the world. To declare war requires an act of congress, but to launch a nuclear holocaust requires 20 minutes’ deliberation by the president and his advisors. But that is what we have lived with for 40 years...What is shocking is that today, more than a decade after the end of the Cold War, the basic U.S. nuclear policy is unchanged. It has not adapted to the collapse of the Soviet Union. Plans and procedures have not been revised to make the United States or other countries less likely to push the button...Keeping such large numbers of weapons, and maintaining them on hair-trigger alert, are potent signs that the United States is not seriously working toward the elimination of its arsenal and raises troubling questions as to why any other state should restrain its nuclear ambitions.

    One does have to ask why exactly there is a need to be able to destroy the world so many times over. Would halving the megadeath quotient in mutual agreement with the Russians really be such a bad thing, especially if tied to commitments from some of the "outposts of tyranny"? Meanwhile the US looks to invest $2 billion in upgrading its weapons, pushing ever closer to a total US military expenditure matching that of the rest of the world combined, and even Tony Blair can't resist jumping on the bandwagon.

    And as for keeping so many weapons on hair-trigger alert, is that really such a good idea when a flock of birds or a rainstorm can cause the President to scurry into his bunker?

    Of course, in reality we know that there are considerable safeguards in place to ensure that no nuclear strike is authorised without due consideration by the President and his advisors. However, recent revelations have shown that any such faith in failsafe procedures to avoid unnecessary armageddon (as opposed to all those necessary armageddons) is seriously misplaced.

    Bruce Blair, a retired ICBM launch officer, has published astonishing revelations demonstrating how the safeguards have been overridden in practice without the knowledge of those at the top of the chain. His comments also perhaps reveal what triggered McNamara to publish his concerns regarding nuclear security:-

    Last month I asked Robert McNamara...what he believed back in the 1960s was the status of technical locks on the Minuteman intercontinental missiles...McNamara replied...that he personally saw to it that these special locks...were installed on the Minuteman force, and that he regarded them as essential to strict central control and preventing unauthorized launch.
    ...
    What I then told McNamara about his vitally important locks elicited this response: “I am shocked, absolutely shocked and outraged. Who the hell authorized that?” What he had just learned from me was that the locks had been installed, but everyone knew the combination.
    ...
    The Strategic Air Command (SAC) in Omaha quietly decided to set the “locks” to all zeros in order to circumvent this safeguard...Our launch checklist in fact instructed us, the firing crew, to double-check the locking panel in our underground launch bunker to ensure that no digits other than zero had been inadvertently dialed into the panel. SAC remained far less concerned about unauthorized launches than about the potential of these safeguards to interfere with the implementation of wartime launch orders. And so the “secret unlock code” during the height of the nuclear crises of the Cold War remained constant at 00000000.


    This amazing state of affairs was remedied in the seventies, but demonstrates how readily the military can, and will, override authority in order to maintain control of the agenda. In another article Blair quotes Air Force General George Lee Butler, who illustrates how control over the decision whether to retaliate would in practice be wheedled out of the hands of the President in a potential nuclear launch scenario:-

    "Notwithstanding the intention of deterrence as it is expressed at the policy level – as it is declared and written down – at the level of operations those intentions got turned on their head, as the people who are responsible for actually devising the war plan faced the dilemmas and blind alleys of concrete practice. Those mattered absolutely to the people who had to sit down and try to frame the detailed guidance to exact destruction of 80 percent of the adversary’s nuclear forces. When they realized that they could not in fact assure those levels of damage if the president chose to ride out an attack, what then did they do? They built a construct that powerfully biased the president’s decision process toward launch before the arrival of the first enemy warhead."

    Bruce Blair intimates that the situation has not improved since General Butler made his comments. Presidents still don't get into the detail of the protocols, and thus do not get to see where the devil resides. Let's all pray for plenty of blue, birdless skies until sanity is restored and the hair-trigger warheads are stood down.

    Sunday, May 08, 2005

    Last week's general election lived up to the tough challenge of being just as dull as the preceding campaign. No party (outside of Northern Ireland) will be overly pleased or disappointed by the outcome.

    Labour's declining majority was an inevitability which, ironically, will be seen positively by many Labour MPs and supporters as curbing the more outrageous schemes of their leaders. Conservatives were well aware of the fact that only a miracle could deliver them a victory, and in spite of a well-orchestrated series of leaks of documents designed to embarrass the government, they ended up with only modest gains. The result will be viewed by most Tories with immediate disappointment but with an aftertaste of pragmatic optimism, in that it at least puts them within range of Labour next time around. Liberal Democrats will celebrate their increase in seats, but will also ponder a missed opportunity to exploit more fully the one-off potential for Labour defections in protest over the government's support for the War in Iraq.

    It is a clear sign of the lack of enthusiasm on all sides that much post-election speculation concerns what will happen at the next election rather than in the next term. Labour's reduced authority will now hamstring radical policies such as the introduction of compulsory ID cards and further constitutional changes. It is also highly unlikely that Parliament would support a further military campaign in the Middle East against Iran or Syria, for example. It is expected, therefore, that the Government will concentrate on more traditional Labour areas of interest, such as public sector initiatives. The most important issues in maintaining Labour's electability are the continued success of the economy, control of taxation, and navigating a careful passage through the tricky area of pensions reform.

    We can also expect to see three different leaders at the next election. Michael Howard has already announced that he will step down in the near future, to be replaced presumably by one of the new generation - David Davis or Liam Fox, I would think. Tony Blair will almost certainly step down in the next year or two to give Gordon Brown enough time to get his feet under the desk in advance of the next election campaign. I would also expect the Liberal Democrats to take it upon themselves to find an alternative to the affable but ultimately insipid Charles Kennedy.

    So, I think we can expect a Labour third term dogged by the problems typically (but perhaps not currently) associated with American Presidents' second terms. Blair and Brown will find it difficult to push through a radical agenda. The end result will be that the success of the Government may well be determined by outside events, such as the continued avoidance of terrorist attacks on British soil and global economic stability. As long as people don't feel too threatened by global terror, don't perceive that their pockets are being unduly picked in the process of improving public services, and the housing market stabilises and interest rates remain under control, the more restrained nature of the Government may make it more electable next time around.

    At the same time the Conservatives need to find a leader who can steer them on a more independent and compelling course than that pursued by recent incumbents without lurching obnoxiously further to the right. Perhaps they could encourage apathetic younger people to mobilise behind their cause - where the US had South Park Republicans, could we see Busted Conservatives? Oh well, perhaps not. The Conservative problem is a tough one to crack, and perhaps their best hope is that Labour failures will open the door for them.

    In any case, a more balanced Parliament will hopefully bring more sensible, well-thought-out legislation than that passed in the previous term, and we can look forward to a real battle next time round.