All Past Articles
(Reverse Chronological Order)
Preview: The task of reading a full human genome—with all its 3 billion base pairs—was first accomplished at the turn of the century, through the monumental enterprise known as the Human Genome Project (HGP). In practical terms, the HGP took over a decade and cost nearly $3 billion. Since that time, advances in technology have made gene sequencing vastly cheaper and less time-consuming (reading a genome now takes less than a day, and costs less than $1,000). As a result, gene sequencing has become commonplace, and the amount of data we have regarding our genome has grown enormously. Still, while the HGP was an incredible accomplishment, and the advances we have made since then are nothing short of remarkable, the secrets behind DNA have not been unlocked nearly as quickly as we might have hoped, or expected.
The main reason for this is that the more we discover about DNA, the more we discover just how complicated this molecule truly is. For one thing, as many of us now know, it is not the case that single genes code neatly for individual traits. Most traits are influenced by multiple genes (if not hundreds), and figuring out just what genes are involved, and in what ways, is often extremely difficult.
As if this were not already enough, scientists have also found that genes are capable of being turned on and off—and also up and down. In terms of the mechanics involved, some of these switches are flipped by other genes, in the normal course of development, while others are flipped by environmental triggers, and can change from moment to moment. What this means is that figuring out how genetics works does not just depend on knowing what genes are present, but how these genes express themselves (or fail to express themselves) at different times. The study of how and why genes express themselves differently at different times is known as epigenetics, and it adds a whole other layer of complexity to an already very complex affair.
That’s not all, though. Geneticists have also found that some of the genetic switches that are flipped by environmental triggers are capable of persisting for years—and even being passed down from one generation to the next. Thus genetic inheritance is not a simple matter of the genes themselves—as was once thought—but also of how these genes are tuned by the environment.
Still, unraveling all this complexity is not entirely without hope. Scientists have a number of tools at their disposal, and progress has been made in at least some areas. Read more…
Preview: Over the past 20 years, and particularly in the past decade, the stock market has undergone some significant changes. The most visible change is that much of the action has now become computerized. For example, whereas stock markets used to consist of trading floors, where floor traders swapped stocks back and forth, we now have computer servers where sellers and buyers are connected automatically. Now, on the one hand, this automation has led to some substantial efficiencies, as once necessary financial intermediaries have now largely become obsolete (this has led to savings not only because the old intermediaries earned an honest commission for their dealings, but because their privileged position sometimes led to corruption).
It is not that the new stock market has done away with intermediaries entirely. Take brokers, for example. Brokers are still used by large investors to help them move large chunks of stock where the market may not be able to fill the order immediately. The brokers take some risk in this action, and provide liquidity in doing so—since they help move capital to its most useful location—and thus brokers still provide a very useful service.
While brokers have always existed, the new stock market has also added a new breed of intermediary. This new breed of intermediary is known as the high-frequency trader (HFT). The high-frequency trader operates on speed, relying on location and advanced communications technology to learn about the movement of the market before others, and uses this knowledge to make winning trades.
To give you an indication of how important high-frequency trading has become, consider that at least half of the trades now being made in the United States are coming from high-frequency traders.
Those who defend high-frequency trading argue that these quick trades actually help move money through the stock market, and thus add liquidity to the system (the way brokers do); and that, therefore, high-frequency traders provide a valuable service.
However, just how high-frequency trading works has largely remained a mystery to anyone outside of the industry itself; and many have become concerned that at least some forms of high-frequency trading are not so much liquidity-contributors as a way of scalping money off of trades that would have happened anyway.
In Flash Boys: A Wall Street Revolt, Michael Lewis follows one man who made it his mission to find out what was going on at the heart of HFT. That man is one Brad Katsuyama, a broker from the sleepy Canadian bank RBC. Read more…
Preview: The unequal distribution of wealth in the developed world has become a significant issue in recent years. Indeed, the data indicate that in the past 30 years the incomes of the wealthiest have surged into the stratosphere (and the higher up in the income hierarchy one is, the greater the increase has been), while the incomes of the large majority have stagnated. This has led to a level of inequality in wealth in the developed world not seen since the eve of the Great Depression. This much is without dispute.
Where there is dispute is in trying to explain just why the rise in inequality has taken place (and whether, and to what degree, it will continue in the future); and, even more importantly, whether it is justified. These questions are not merely academic, for the way in which we answer them informs public debate as well as policy measures—and also influences more violent reactions. Indeed, we need look no further than the recent Occupy Movement to see that the issue of increasing inequality is not only pressing, but potentially incendiary.
Given the import and the polarizing nature of the issue of inequality, it is all the more crucial that we begin by way of shedding as much light on the situation as possible. This is the impetus behind Thomas Piketty’s new book Capital in the Twenty-First Century.
Preview: Up until 15 to 20 years ago the instruments and methods used to study the brain were still somewhat primitive. Since then, however, advances in brain-imaging and brain-probing technology have gone into overdrive—as have the computers needed to make sense of the data coming out of these technologies. The deluge began in the early to mid 1990s with the magnetic resonance imaging (MRI) machine, and its more powerful cousin the functional magnetic resonance imaging (fMRI) machine, and it hasn’t stopped there. In addition to the MRI and fMRI, we now have a host of advanced imaging and probing technologies from the positron emission tomography (PET) scan, to magnetoencephalography (MEG), to near-infrared spectroscopy (NIRS), to optogenetics, to the Clarity technique, to the transcranial electromagnetic scanner (TES), to deep brain stimulation (DBS) and more. In addition to these new imaging and probing technologies we have also advanced greatly in understanding how genes are expressed in the brain.
The result of these new advances is that we have learned more about the brain and how it works in the past 15 years than in all of history put together. And we are beginning to see real-world applications of this new understanding. For example, in the past decade scientists have learned to read the brain’s functioning to the point where they can now read (and recreate) thoughts and even dreams and imaginings directly from the brain; use the brain to directly control computers, and anything computers can control—including prosthetics (and even have these prosthetics send sensations back to the brain); implant and remove simple memories in the brain; create primitive versions of artificial brain structures; and also unravel at least some of the mysteries of mental illness and disease. Read more…
Preview: The sciences that focus on human behavior, meaning the social sciences, have traditionally relied mainly on surveys and lab experiments in their investigations. While valuable to a degree, these sources of evidence do have their shortcomings. Most significantly, surveys offer but indirect evidence of human behavior (and can also be compromised by deception and self-deception); while lab experiments tend to be somewhat artificial, and fail to capture the complexities of real life.
Recently, however, new digital technology has opened up a whole new way to study human behavior. This proves to be the case since mobile devices and sensors of all kinds are now able to record a dizzying array of human activity—everything from where we go, to what we buy, to whom we interact with and for how long, to our body language, and even our moods. When placed in the hands of social scientists these new sources of information can prove very valuable (and are far preferable to either surveys or lab experiments); for they allow scientists to study us in our natural environments—out in the real world—and they also allow scientists to study what we actually do, rather than what we say (which can be quite different).
The method of investigating human behavior in our natural environments using digital technology has come to be called reality mining, and it is revolutionizing the social sciences.
One of the pioneers and leaders in the field of reality mining is Alex Pentland, a researcher out of MIT. Pentland’s main field of interest is using reality mining to explore the properties and patterns of interactions between people—what he calls social physics. Specifically, Pentland uses reality mining to investigate the social physics in a wide range of groups and situations, from social and peer groups; to social media platforms; to institutional settings such as schools and businesses; to even whole cities. And in his new book Social Physics: How Good Ideas Spread—The Lessons from a New Science Pentland takes time out to catch us up on his findings. Read more…
Preview: That the earth’s climate is warming, and we are the main cause of this phenomenon (through the emission of greenhouse gases, especially carbon dioxide), is now beyond dispute to anyone with an objective mind and an appreciation of science.
The clearest and most obvious effects of global warming are the melting of glacial ice and the corresponding rise in sea levels. But the effects of a warming world, we now know, do not end there. The models tell us that warming also means less rain and even drought and desertification in some areas; more rain in others, often in deluges; stronger storms, such as hurricanes and cyclones; and an acidifying ocean.
On a human scale, this means salinated and eroding coastlines; desiccated farmland and more wildfires in drier areas; increased flooding and soil erosion in suddenly wetter areas; more destructive and deadly storms; and threatened sea life.
With all these negative effects, you would think that the people, companies, and governments of the world would be eager to step in and do everything they could to stem the rising tide of climate change (including especially cutting emissions). Instead, however, what we have seen is much talk and little action.
There are several reasons for this complacency. Read more…
Preview: In the first machine age—otherwise known as the Industrial Revolution—we humans managed to build technologies that allowed us to overcome the limitations of muscle power like never before. The result, which has reverberated these past 200 years, has been an increase in economic productivity unprecedented in human history. And the corollary of this increase in productive power has been an equally unprecedented increase in material standard of living and social development.
In the past 30 years, with the rise of computers and other digital technologies, we have moved from overcoming our physical limitations, to overcoming our mental ones. This is the second machine age. Though we are still at the dawn of the second machine age, it already shows at least as much promise in boosting productivity (and quality of life) as the first. Indeed, by various measures—including the standard ones of GDP and corporate profits—we can see that the past 30 years have witnessed an impressive steepening in productivity.
And this is just the beginning. For digital technology continues to advance at an exponential pace; more digital information is being produced (and kept) all the time (all of which has enormous economic potential); and new ways of combining existing (and new) ideas into newer and better ones are ever being found.
Still, what is equally apparent is that the benefits of this steepening in productivity have gone to the few, rather than the many. Indeed, while the top 20% of earners have seen their pay increase since the early 1980s (and the closer you are to the top the more dramatically your pay has increased), the bottom 80% have actually seen their wealth decrease. And the spread is widening ever more as we go. Read more…
Preview: The idea that we can boost our brain power through interventions of various kinds has been around a long time. Over the years, numerous drugs, diets and other practices (including everything from physical exercise to learning a new language or musical instrument to meditation to even zapping the brain with electrodes) have been purported to pump up our mental strength. And lately, a new practice has been added to this list: brain-training games and exercises. Indeed, in the past decade a whole new industry has emerged around brain-training programs. Built on the premise that specific types of mental activities can strengthen our cognitive skills and add to general intelligence, companies such as Lumosity and LearningRx have convinced millions of paying customers that their product will give them an edge in the brains department.
The more skeptical among us, however, may find ourselves wondering just what the scientific basis is behind all these brain games and other interventions. It was just this thought that occurred to science writer Dan Hurley; and so, following his skeptical sense, Hurley decided to investigate the matter for himself. What Hurley found was a scientific field that, though young, is bustling with activity (and controversy). Read more…
Preview: The modern city owes much of its current design to two major trends or ‘movements’ that have emerged since the time of the industrial revolution. The first trend traces back to the industrial revolution itself, when the appearance of smoke-billowing factories (and egregiously dirty slums) necessitated new solutions to the problem of how to organize city life. The answer—still reflected in cities all over the world—was to compartmentalize functions, such that industrial areas, shopping areas, office areas, and living areas were separated off from one another into distinct blocks of the city.
The second trend in urban design took full hold in the post-war era, with the rise of the suburbs. In a sense, the suburbs represent a continuation and intensification of the compartmentalization movement, as the living areas of the upper classes were separated off still further from the other areas of the city—out into sprawling districts miles away (as automobiles made it possible for certain city dwellers to escape to an idealized haven away from the hustle and bustle).
While the suburban movement has had the bulk of its impact on the landscape outside of the city proper, the city itself has not been spared its influence. For indeed, the city was gutted of many of the inhabitants that formerly occupied it; and, what’s more, it has been reshaped by the roads and freeways introduced to shuttle in the suburbanites from their faraway homes.
Now, it may well be the case that all this compartmentalization and suburbification was originally intended to benefit (most of) the city’s inhabitants. Unfortunately, however, the longer we live with these trends in urban design, the more it is becoming clear that this way of organizing the city leaves much to be desired. Read more…
Preview: Ever since the structure of DNA was deciphered by James Watson and Francis Crick in 1953, the field of biology has advanced at a lightning-quick pace. In this time, we have learned how DNA codes for the manufacture of the proteins of which every living thing is made, and thus acts as the blueprint of life. We have also learned to read this blueprint; to splice it (to transfer genes, and hence features, from one organism to another—and even one species to another); to synthesize it from its component parts; and we have even learned to rewrite DNA to yield wholly new biological products, features and organisms. Thus recent advances have not only allowed us to gain a better understanding of what life is and how it works, but have also allowed us to take control of life and to manipulate it to help advance our ends—and in fields as wide-ranging as food production, medicine, energy, and environmental protection. And this is just the beginning, for biologists still have much to learn about which genes code for which features, and how to manipulate DNA to achieve the best results—and thus we can expect that some of biology’s greatest applications are yet to come.
The biologist J. Craig Venter has been at the forefront of biological research for the past 35 years, and has played a pivotal role in some of its most important advances (including everything from sequencing the human genome, to creating the first synthetic life form). In his new book Life at the Speed of Light: From the Double Helix to the Dawn of Digital Life, Venter takes us through the major advances that have occurred since the time of Watson and Crick—and also touches on what is likely to come next. Read more…
Preview: In the developed world, the vast majority of us enjoy a standard of living unmatched in the history of humankind—and going hungry is the last thing on our minds. Nevertheless, it cannot be said that poverty and hunger have been eradicated in the developed world entirely (in the United States, for example, 1 in 6 are considered food insecure—including 16 million children). Still, the greatest problems with poverty and hunger continue to exist in the developing world. Indeed, despite substantial improvements over the past 30 years, poverty remains a significant issue, and nearly a billion of the world’s 7 billion people still face chronic hunger (while about twice that number are malnourished in some way)—and millions starve to death every year.
It is not that many well-intentioned people and organizations have not spent a great deal of time and money trying to solve the world’s poverty and hunger issues. Indeed, over the past half century the amount of resources that have been poured into these problems is staggering. So, just why do the problems of poverty and hunger stubbornly persist?
Well, at least part of it has to do with the fact that there are several significant obstacles standing in the way—everything from armed conflict, to corrupt governments, to particular cultural practices. The humanitarian Howard G. Buffett has been involved in fighting poverty and hunger for upwards of 30 years, and knows these obstacles all too well. However, Buffett insists that there is yet another reason why all of the well-intentioned efforts have fallen short of reaching their ultimate goal. And that is that many of the approaches have proven to be inadequate (if not downright counterproductive). Read more…
Preview: Until quite recently, the field of economics was dominated mainly by theory-making. Specifically, economists applied their intellects to the human world, and developed abstract models to explain (and predict) the unfolding of economic events. At the heart of all this theory-making stood homo economicus—a narrowly self-interested individual who responded to incentives and disincentives in a perfectly rational way.
In the past half century, though, various economists have added new wrinkles to the field’s repertoire. To begin with, pioneering economists such as Amos Tversky and Daniel Kahneman introduced controlled lab experiments (among other things) into the fold. And these experiments succeeded in adding nuance to our understanding of economic man (he’s not quite as one-dimensional and rational as he was once taken to be), as well as texture and complexity to our understanding of economic phenomena.
More recently, economists such as Uri Gneezy and John A. List have stepped in and shown that controlled field experiments also have a place in economics. For Gneezy and List, the world is their laboratory: the two go about slyly manipulating the natural environment in a controlled way (often fiddling with incentives and disincentives of all types) to see how we humans respond to the tweaks. Gneezy and List have been practicing this approach for upwards of 20 years now, and in this time they have helped shed light on everything from how to decrease crime rates; to how to improve school success; to how to encourage more charitable giving; to how to promote healthy living and decrease obesity; to how to set prices on products (so as to maximize profits); to how to understand (and limit) discrimination (to name but a few of their lines of research). And in their new book The Why Axis: Hidden Motives and the Undiscovered Economics of Everyday Life the two catch us up on their experiments and their results (while also touching on the experiments of other like-minded practitioners). Read more…
Preview: This book is not about underdogs and giants in any conventional sense of these terms. Rather, the book is about the curious nature of advantages and disadvantages, and how each can (under certain circumstances) become its opposite.
The first lesson to be learned is that the things we take to be advantages are often no such thing. Our greatest mistake here comes from the fact that we identify a certain quality or characteristic as being a benefit or advantage, and then assume that the more of it there is the better—when this is often not the case. Put another way, most of us recognize that it is possible to have too much of a good thing, and yet we fail to appreciate just how often and where this principle applies. For instance, we recognize that having a certain amount of money greatly facilitates raising children (it being very difficult to raise a family in a state of poverty), and yet we fail to recognize that beyond a certain point wealth also makes parenting increasingly difficult (for it becomes harder and harder to instill qualities of hard work and self-control). Or we recognize that small class sizes are a good thing, and yet we fail to recognize that classes can actually begin to suffer once they become too small (since diversity and energy begin to disappear). Read more…
Preview: Prior to the 19th century, public goods and social goals such as sanitation, health, affordable housing, education, and environmental protection were largely left up to individuals to sort out for themselves. Beginning in the 19th century, though, more and more governments—particularly in the industrialized, democratized world—began taking these responsibilities on themselves. In the latter half of the 20th century, the promotion of public goods and social goals expanded as governments in the developed world intensified their efforts at home and began spreading their attention to the developing parts of the planet, and large non-profits and NGOs started cropping up to help with the issues both domestically and abroad.
Recently, we have seen a new trend develop, as in the past two decades businesses and corporations have themselves increasingly entered the fray. Now, this may seem odd, given that business is often seen as indifferent—if not downright hostile—to public goods and social goals. However, several developments have occurred in recent years that have flipped this logic on its head. Read more…
Preview: When the early states came together to discuss the possibility of establishing a confederacy, they did so with a great deal of hope, but also a great deal of trepidation. The hope was that a federal government might be formed that could handle the few issues that were common to all the states but which could not be dealt with by the states individually. The fears, on the other hand, were that this government might come to gain an enormous amount of power; that this power might come to be concentrated in the hands of very few; and that the federal government as a whole might end up overreaching its purview and meddling in affairs that ought rightly to be left to the states and the various local governments (if not individuals themselves).
Thus the constitution was framed in such a way that the power of the federal government would be split between three separate branches—each acting as a check and balance on the power of the others. And the power of the federal government as a whole was limited to certain specific areas—all other areas being left expressly to the power of the states and local governments (and individuals).
Over the past century, though, this original arrangement has largely been undone. Indeed, after numerous constitutional amendments—and loose interpretations of the constitution itself—each of the branches of the federal government has, by turns, usurped (or been left with) more power than it was ever meant to have, and the federal government as a whole routinely involves itself in matters far from federal in nature—to the extent that it now insinuates itself into virtually every aspect of life, political, economic, and social.
For author and commentator Mark R. Levin it’s time we reversed this situation. For while those who made the changes may have thought they were strengthening the nation, the fact is that the changes have contravened the very wise principles upon which the nation was built, and the practical results have been nothing but negative. Specifically, the changes have left the nation with nothing but ever-increasing taxes, ever-mounting debt, and ever-more soft tyranny for some, with ever-reduced freedom for everyone else.
And the reform we need, according to the author, runs more than legislation-deep. It is reform that needs to happen at the very source: it is the constitution itself that must be reformed. For only radical constitutional reform can undo the radical and misguided reform that has come before. Read more…
Preview: In the recent past the K-12 public education system in the United States has been lackluster at best (some might say deplorable). Not that the various levels of government have not put in a great deal of effort (and money) to try and fix the problem; indeed, numerous attempts at education reform have been tried over the past 20 years or so, and the US currently spends more on public education per student than any other nation. Still, all of these good intentions (and boatloads of money) have achieved relatively little in terms of results. When compared with other developed nations, for example, American high school students currently rank 12th in reading, 17th in science, and a paltry 26th in math. These numbers would be concerning even at the best of times, but with the nation currently struggling through a seemingly endless economic slowdown, and with the global economy becoming increasingly competitive (and modern jobs requiring more and more advanced cognitive skills all the time), these numbers are very troubling indeed.
All is not lost, though. Other nations have shown that they are able to achieve far better academic results using far less money, and thus we may deem it high time that we investigate just what the leading nations are doing differently that has allowed them to be so successful. It is this very project that journalist Amanda Ripley sets for herself in her new book The Smartest Kids in the World: And How They Got That Way. Read more…
Preview: What does it take to become an elite athlete? The intuitive answer for most of us is that it probably takes some lucky genes on the one hand, and a whole heck of a lot of hard work on the other. Specifically, that we may need to be blessed with a particular body type to excel at a particular sport or discipline (after all, elite marathon runners tend to look far different than elite NFL running backs, who in turn tend to look far different than elite swimmers), but that beyond this it is practice and diligence that pave the way to success. When we look at the science, though—as sports writer David Epstein does in his new book The Sports Gene: Inside the Science of Extraordinary Athletic Performance—we find that the story is much more complicated than this. In general terms we find that nature and nurture interact at every step of the way in the development of an elite athlete, and that biology plays far more of a role (and in far more ways) than we may have expected. Read more…
Preview: This is not a book about the end of the internet, as the controversial title may seem to suggest. Rather, it’s a book about networks (meaning a group of interconnected people or things) and how networks evolve; and its main focus is on internet-related networks and the internet itself (which is one enormous network). The author, Jeff Stibel, argues that there are certain natural laws that govern the unfolding of networks, and that understanding these laws can help us understand how the internet (and other internet-related networks) are likely to evolve over time, and also how we should approach these networks in order to get the most out of them (including making money off of them). Read more…
Preview: Sophisticated, humanoid robots as featured in such movies as RoboCop and Terminator may not be with us just yet—but we shouldn’t let this fool us into thinking that we are not already in the incipient stages of the robot age. The fact is that rudimentary robots and other automated technologies have already been with us for several years, and advances in computing power, artificial intelligence and materials are even now quickly expanding what our robots are capable of.
RoboCop and Terminator notwithstanding, robots already have a significant impact on our lives, and this impact will only increase as the technology advances. And one of the biggest impacts here has to do with the world of work, and the economy more generally. Specifically, robots have already shown themselves to be capable of numerous jobs traditionally carried out by people, and as the technology advances the range and sophistication of the jobs subsumed by robots will only grow.
Now, the story of technology taking over human jobs is nothing new. Indeed, the loss of jobs has occurred every time a major new technology has been introduced, from the plow, to the power loom, to the steam engine, to the computer. In the past, though, the technologies that have usurped human jobs have also led to the growth of new jobs (normally requiring more advanced skills) that have ultimately offset, and even outstripped, the jobs that were lost originally.
With robotic technology, though, there is something new under the sun. Specifically, many of the new jobs that robotics will create will themselves be capable of being carried out by robots—largely due to the sophistication of the technology. What’s more, as robotics advances, the range of new jobs that are capable of being carried out by robots will only grow. I think we can see where this is going: fewer and fewer jobs for people.
In his new book Jobocalypse: The End of Human Jobs and How Robots Will Replace Them, entrepreneur and writer Ben Way takes a look at how robots have already come to replace many human jobs, and how coming advances promise to intensify this trend and extend it to virtually every industry we can think of, from custodial and maintenance services; to the supply chain; to transportation; to security services; to manufacturing; to construction; to farming and fishing; to mining; to retail and hospitality; to health care; to education; to the military and policing; and even the shadow economy. Read more…
* Free Article #38. A Summary of Creation: How Science Is Reinventing Life Itself by Adam Rutherford
Preview: As the blueprint of all that lives, deoxyribonucleic acid (DNA) may be said to be the key to understanding life itself. It is incredible to think, then, that the structure of DNA was only discovered some 60 years ago (thanks especially to the work of James Watson and Francis Crick). Since that time, many significant advances in genetics have been made—including the deciphering of the genomes of numerous species (including our own); and, even more impressively, the successful manipulation of the genetic code to introduce the features of one species to another (for example, having a goat produce spider silk in its milk).
As impressive as these feats are, though, they are but the beginning of what promises to come from the study of genetics. Indeed, compared with other sciences, such as physics and chemistry, genetics is still in its infancy, and we can be assured that the most significant discoveries and applications are yet to come. Even now, geneticists are making substantial progress in uncovering the origin of life—meaning answering the question of just how life may have sprung out of lifeless chemistry—and are also making advancements in turning genetic manipulation into a standardized engineering science that is capable of churning out technological solutions in everything from food production to energy to medicine (a field that has been dubbed ‘synthetic biology’). It is these recent advances in genetics that are the main topic of Creation: How Science is Reinventing Life Itself by science writer Adam Rutherford. Read more…
Preview: Over the past half-millennium the West has built up a substantial lead over other parts of the world when it comes to both economic power and material standard of living. Now, however, this lead is slipping away. Indeed, developing nations led by such powers as China and India are quickly closing the gap, as they are experiencing impressive economic growth, while the West is stagnating. Many argue that this is the natural result of globalization (and the fact that major corporations are taking advantage of cheaper labor in developing nations). For Harvard historian and writer Niall Ferguson, however, there is something deeper going on here. For Ferguson, the closing of the gap between the West and the Rest has less to do with the rise of the Rest than with the decline of the West.
Specifically, Ferguson argues that it is the West’s political, economic, legal and social institutions that have allowed it to gain the upper hand over the past 500 years or so, and that now these institutions are beginning to deteriorate (just as other nations increasingly copy what made the West successful in the first place). The result: Western stagnation, and the catching up of everyone else.
Ferguson identifies 4 primary institutions that account for the West’s success over the past half-millennium: 1. Democracy; 2. Capitalism; 3. The Rule of Law; and 4. Civil Society. Each of these, the author argues, has eroded in the recent past. Read more…
Preview: Not so long ago the Internet was seen as the next great economic engine. The optimism was never higher than at the peak of the dot-com boom in the late 1990s, of course; but even after the dot-com bust in the early 2000s, many believed that this was but the growing pains of an emerging industry, and that in the long run the Internet would yet provide the foundation for a new and improved information economy.
Since that time, it is certainly the case that the Internet has spawned a few major successes (such as Google, Amazon, eBay and now Facebook), as well as a host of hopefuls (such as Twitter, Kickstarter, Pinterest and Instagram). However, it cannot be said that the economy has enjoyed a great boost since the Internet exploded. On the contrary, the economy has, at best, stagnated—and it currently shows no signs of escaping its slump. So what went wrong?
According to Silicon Valley luminary Jaron Lanier, the problem is not so much with the Internet per se, but with how it has been set up, and how the major Internet companies themselves are organized. Read more…
Preview: Many of us living in the developed world have come to rely very heavily on digital technology (including the internet and our mobile/smart devices)—indeed, for many of us, our relationship with our various screens is nothing short of addiction. And we are not the only ones who are plugging in. We are also increasingly hooking up our various man-made systems (such as our infrastructural systems and financial systems) to the internet as well. Given how radically digital technology has transformed our lives, it is incredible to think how recently all of this change has occurred; for, indeed, all of this technology has come upon us entirely in the past 15 to 20 years. This is significant because it reminds us that the age of connectivity is but in its infancy, and that most of the changes are yet to come.
This is true for us here in the developed world, but is even more so the case for those living in the developing world, where almost 5 billion people are expected to go from no connectivity to full connectivity within the next 20 years. While it may well be the case that the overall impact of the connectivity revolution will be enormously beneficial, we would be fools to think that the impact will be entirely positive. With forces such as criminals, rebel groups, terrorists and rogue states prepared to take advantage of the new technology, the connectivity revolution poses some very serious challenges as well. Google executive Eric Schmidt and U.S. policy and media expert Jared Cohen are particularly well-placed to assess how all of the upcoming changes will play out, and in their new book The New Digital Age: Reshaping the Future of People, Nations, and Business the two let us in on their ruminations and prognostications. Read more…
Preview: Ever since the industrial revolution the developed world (and increasingly the developing world) has enjoyed remarkable economic growth. This economic growth has yielded wealth to a degree previously unimaginable. Indeed, many of us today enjoy conveniences, comforts and opportunities of a kind that have traditionally been unattainable by even the world’s wealthiest and most powerful people.
However, we may question just how sustainable all of this economic growth (and the resulting wealth) really is. For the economic growth has been accompanied by environmental depletion and degradation of a kind as unprecedented as the growth itself. And while some of the environmental crises that have come up along the way have been solved by new technologies, others yet remain, and are as daunting as any we have seen. Climate change in particular stands out as one of the greatest challenges we now face. What’s worse, many of the earth’s resources that we have used to generate the economic growth are dwindling, and face extinction. Indeed, the very resource that has powered the industrial era (and that has also caused many of our deepest environmental woes), fossil fuels, is now approaching peak production.
Looking to the past, we find that we would not be the first civilization to perish at the hands of a resource shortage brought on by overzealous extraction. Indeed, such an event has occurred on several occasions (including amongst the Mayan civilization, and that of the Easter Islanders).
So we find ourselves at a crossroads, unsure of whether our impressive economic growth can continue, and equally unsure of whether our lavish lifestyle lives but on borrowed time (and resources).
For writer Ramez Naam, though, we do have reason to be optimistic, and in his new book The Infinite Resource: The Power of Ideas on a Finite Planet Naam lays out the reasons for his optimism. Read more…
Preview: You open a bag of chips intending to eat only a few handfuls. You find the chips tasting quite good, and a few handfuls turns into a few more. Just one more… o.k., last one… definitely the last one. A few minutes later you find yourself staring down at an empty bag. Then your stomach starts to hurt—then your heart. The guilt isn’t far behind. Who among us hasn’t experienced this at one time or another? This is junk food in a nutshell: it tastes great (practically irresistible) and is very convenient, but if you indulge too much (which sometimes seems all too easy), it’s not very good for you. All of this has an easy explanation, and it’s right there on the label: generous portions of salt, sugar and fat, the junk food trifecta. Each has its own appeal, and each is very inexpensive (which explains why it’s in our food), but over the years each has also been implicated in some of our most common and serious conditions and diseases, including obesity, heart disease and diabetes.
Unfortunately, the junk food trifecta is not only popping up in our junk food, it is increasingly being featured in virtually all of the processed foods that we eat—from chips and soda, to canned food and prepared meals, to cake and ice-cream. And as salt, sugar and fat have become more common in the foods that we eat, the conditions and illnesses associated with their abuse have reached epidemic proportions. In his new book Salt Sugar Fat: How the Food Giants Hooked Us journalist Michael Moss takes us behind the labels and explores the history and practices of the processed food industry–a story that features the rise of salt, sugar and fat, and the deterioration of our health. Read more…
Preview: It is only recently, with the rise of the internet, that the term ‘viral’ has gone, well, viral. But the phenomenon of social pandemics—ideas, products and behaviors, that catch on and spread quickly and widely—has been around presumably as long as sociality itself. The phenomenon is interesting in its own right, for it says something meaningful about our psychology and how we interact. However, understanding how social pandemics work also holds great practical value, for when public service messages, charity campaigns or products and services go viral, the effect has a big impact on behavior and the bottom line.
On the mechanical side of things, understanding why something goes viral is straightforward enough: it must be something that has an impact, and that people are eager to talk about or imitate. But this just forces us to ask: what is it that makes something impactful, and ripe for sharing or imitating? We may think that our intuitions can carry us some way toward answering this. Nevertheless, getting something to go viral is certainly no easy task (as many a would-be influencer has come to find); and therefore, we may benefit from a more methodical, scientifically-minded attempt to understand the phenomenon. It is just such a project that Wharton marketing professor and writer Jonah Berger has been engaged in for much of his career, and in his new book Contagious: Why Things Catch On, Berger reports on his findings. Read more…
Preview: Statistical information, or data, has long been recognized to be a potentially rich and valuable source of knowledge. Until recently, however, our ability to render phenomena and events in a quantified format, store this information, and analyze it has been severely limited. With the rise of the digital age, though, these limitations are quickly being eroded. To begin with, digital devices that record our movements and communications, and digital sensors that record the behavior of inanimate objects and systems, have become widespread and are proliferating wildly. What’s more, the cost of storing this information on computer servers is getting cheaper and cheaper, thus allowing us to keep much more of it than ever before. Finally, increasingly sophisticated computer algorithms are allowing us to analyze this information more deeply than ever, and are revealing interesting (and often counter-intuitive) relationships that would never have been possible previously. The increasing datafication of the world, and the insights that this is bringing us, may be thought of as one grand phenomenon, and it has a name: Big Data.
The insights that are emerging out of big data are spread out over many areas, and are already impacting several aspects of society. Read more…
Preview: It is a deep part of human nature to want to understand our origins. Indeed, creation stories are ubiquitous among the world’s cultures. Somewhat fittingly, the vast majority of these creation stories have the human race emerging quickly, if not instantaneously—a revolutionary moment befitting a revolutionary species. When it comes to the story from science, on the other hand, while it may be no less spectacular, it is far less abrupt, for it has our species emerging much more slowly. Indeed, the latest findings indicate that we began branching away from the species to which we are most closely related—the chimpanzee—some 7 million years ago, and that only a series of small modifications spread out over this time has led us to our current state.
However long the process may have taken, though, in the end it was nevertheless revolutionary, for it has changed us from head to toe. Or rather, from toe to head, for the evidence indicates the process began with a modification in our big toe (which made upright walking easier) and ended with self-awareness (which ultimately made us interested in the story of our origin). While the rough edges of this story have been known for decades, recent fossil finds and new techniques in DNA analysis in the past 5 years have allowed the story to come into much clearer focus. Armed with these new discoveries, science writer Chip Walter takes on the story of human origins and evolution in his new book Last Ape Standing: The Seven-Million-Year Story of How and Why We Survived. Read more…
Preview: Our world is becoming increasingly integrated and complex, and changing faster and faster. Out of the morass of elements involved here, Al Gore identifies 6 themes or factors that are emerging as the major drivers of change. The factors are 1) Work: the movement of labor from West to East (outsourcing); and, at the same time, a shift towards much more automation (robosourcing); 2) Power: the shifting of power from West to East; and, at the same time, the shifting of power from national governments to smaller players, such as businesses and corporations, but also rogue players, such as guerrilla and terror organizations; 3) Communications: the rise of the internet that has led to a wild proliferation of information, and the ability of the world’s population to instantly connect with one another for a host of purposes–and the increasing reach of the internet from the developed to the developing world; 4) Biotechnology: the manipulation of DNA to produce not only new organisms with novel features, but new materials and fuels as well; 5) Demographics: the enormous increase in the world’s population, and the movement of peoples both within and across national borders (as the result of numerous factors); and 6) Climate Change: the increase in world temperatures caused by the continuing build-up of CO2, as well as the numerous other climate effects that this entails.
While several of these drivers of change have the potential to bring great benefits to the world’s people, all are fraught with potential dangers, and it is these dangers that are Gore’s focus in his new book The Future: Six Drivers of Global Change. In addition to laying out the dangers, Gore also offers his advice regarding how best to deal with them. Read more…
Preview: Sir Arthur Conan Doyle’s character Sherlock Holmes is as popular today as when he was created back in the late 19th century. This comes as no surprise, of course, since there is just something about Holmes’ peculiar qualities—his keen observation, clever imagination, and incisive reasoning capabilities—that is both awe-inspiring and inspirational. We admire Holmes for cutting through the errors of thought that are so common to us in our daily lives (and that are reflected in Holmes’ sidekick, Watson). And yet we recognize that there is nothing in Holmes’ thought that is entirely out of reach for us. Indeed, his qualities are not so much superhuman as human plus: human qualities taken to their extreme. Still, human qualities taken to their extreme are intimidating enough, and we may find ourselves doubting whether we could ever really think like Sherlock—even if we put our minds to it. But for cognitive psychologist Maria Konnikova, we should think again.
Holmes’ prowess, Konnikova argues, rests not so much in his mental powers as in his mental approach. Specifically, Holmes has succeeded in making his thought methodical and systematic—essentially bringing the scientific method and scientific thinking to his detective work. This is an approach to thinking which, Konnikova argues, we can all learn. More importantly, it is an approach to thinking that can extend well beyond sleuthing. Indeed, it is a general approach that can help us get at the truth in virtually any matter, as well as help us solve virtually any problem. It is simply a matter of bringing a little science to the art of thinking—and it is this very thing that Konnikova aims to help us achieve in her new book Mastermind: How to Think like Sherlock Holmes. Read more…
Preview: The onset of agriculture and farming some 11,000 years ago (termed the Neolithic Revolution), is arguably the most significant turning point in the history of our species. Agriculture induced a major population explosion, which then led to urbanization; labor specialization; social stratification; and formalized governance—thus ultimately bringing us to civilization as we know it today. Prior to the Neolithic Revolution—and extending back time out of mind—human beings lived in a far different way. Specifically, our ancestors lived in small, largely egalitarian tribes of no more than 50 to 100 individuals, and hunted and foraged for their food.
The transition from our traditional hunting and gathering lifestyle, to early farming (and herding), to civilization as we know it now (which, on an evolutionary time-scale, occurred but yesterday) has certainly brought with it some very impressive benefits. Indeed, many of us today enjoy comforts and opportunities the likes of which our more traditional ancestors would never have dreamed of. However, it cannot be said that the transition from traditional to modern has left us without any difficulties. Indeed, some would go so far as to say that the problems that civilization has introduced outweigh the benefits that it has brought; and even the most unromantic among us are likely to agree that our experiment in civilization has not been an unmitigated success.
This then brings us to the problem of solving the difficulties that civilization has left us with. Now, when it comes to solving our problems, it is without a doubt the spirit of our age to look ever forward for solutions—by which I mean we tend to look for new technologies and hitherto untested arrangements to help us out of our current predicaments. However, when we consider that our traditional lifestyle served us well for millennia on end, and that it was under this lifestyle wherein we underwent much of the biological and psychological evolution that lives with us to this day, we can begin to see how it may be fruitful to look back at this traditional lifestyle for possible solutions to the problems we now face. (This idea is not new; indeed, the ‘state of nature’ has traditionally been of great interest to philosophers—for it has been thought that understanding how we lived by nature may serve as a guide to help us design the most fitting political communities given our present circumstances).
Also of interest here—and deeply connected to the more practical goal mentioned above—is that investigating our traditional way of life promises to shed light on our underlying human nature in a way that is not possible when we look at ourselves through the obscuring artifice of civilization. It is these things that we stand to gain by learning about traditional societies, and it is this very project that geographer Jared Diamond takes up in his new book The World Until Yesterday: What Can We Learn from Traditional Societies? Read more…
* 2012 *
Preview: The concept of fragility is very familiar to us. It applies to things that break when you strike or stretch them with a relatively small amount of force. Porcelain cups and pieces of thread are fragile. Things that do not break so easily when you apply force or stress to them we call strong or resilient, even robust. A cast-iron pan, for instance. However, there is a third category here that is often overlooked. It includes those things that actually get stronger or improve when they are met with a stressor (up to a point). Take weight-lifting. If you try to lift something too heavy, you’ll tear a muscle; but lifting more appropriate weights will strengthen your muscles over time. This property can be said to apply to living things generally, as in the famous aphorism ‘what doesn’t kill you makes you stronger’. Strangely, we don’t really have a word for this property, this opposite of fragility.
For author Nassim Nicholas Taleb, this is a major oversight, for when we look closely, it turns out that a lot of things (indeed the most important things) have, or are subject to, this property. Indeed, for Taleb, all that lives, and all the complex things that these living things create (like societies, economic systems, businesses etc.) have, or must confront this property in some way. This is important to know, because understanding this can help us understand how to design and approach these things (and profit from them), and failing to understand it can cause us to unwittingly harm or even destroy them (and be harmed by them). So Taleb has taken it upon himself to name and explore this curious property and its implications; and in his new book Antifragile: Things That Gain from Disorder Taleb reports on his findings. Read more…
Preview: When IBM’s Deep Blue defeated humanity’s greatest chess player Garry Kasparov in 1997 it marked a major turning point in the progress of artificial intelligence (AI). A still more impressive turning point in AI was achieved in 2011 when another creation of IBM named Watson defeated Jeopardy! phenoms Ken Jennings and Brad Rutter at their own game. As time marches on and technology advances we can easily envision still more impressive feats coming out of AI. And yet when it comes to the prospect of a computer ever actually matching human intelligence in all of its complexity and intricacy, we may find ourselves skeptical that this could ever be fully achieved. There seems to be a fundamental difference between the way a human mind works and the way even the most sophisticated machine works—a qualitative difference that could never be breached. Famous inventor and futurist Ray Kurzweil begs to differ.
To begin with—despite the richness and complexity of human thought—Kurzweil argues that the underlying principles and neural networks that are responsible for higher-order thinking are actually relatively simple, and in fact fully replicable. Indeed, for Kurzweil, our most sophisticated AI machines are already beginning to employ the same principles and are mimicking the same neural structures that are present in the human brain. Read more…
Preview: The adage ‘you are what you eat’ is no doubt literally true, but when it comes to getting at the heart of what we are it is certainly more accurate to say ‘you are what you think’; for our identity emerges out of the life of the mind, and our decisions and actions (including what we eat) are determined by our thoughts. An exploration of how we think therefore cuts to the core of what we are, and offers a clear path to gaining a better understanding of ourselves and why we behave as we do. In addition, while many of us are fairly happy with how our mind works, few of us would say that we could not afford to improve here at least in some respects; and therefore, an exploration of how we think also promises to point the way towards fruitful self-improvement (which stands to help us both in our personal and professional lives). While thinking about thinking was traditionally a speculative practice (embarked upon by philosophers and economists) it has recently received a more empirical treatment through the disciplines of psychology and neuroscience. It is from the latter angle that the Nobel Prize winning psychologist Daniel Kahneman approaches the subject in his new book Thinking, Fast and Slow. Read more…
Preview: DNA was only discovered about a century and a half ago, and its structure remained a mystery until about half a century ago, but since this time our knowledge and understanding of DNA has grown immensely (indeed exponentially). What’s more, this understanding has evolved to include not just an understanding of how DNA works, but also how it can be manipulated to help advance our ends. The most glaring example here is the phenomenon of genetically modified food. Though not without controversy initially (and some fringe opposition that lives on to this day), it is fair to say that genetically modified food was one of the major scientific advances of the 20th century. Over and above this, our understanding of DNA appeared to reach its most impressive manifestation with the successful sequencing of the human genome in the year 2000.
For the genetics professor and pioneering genetic engineer George Church, however, genetically modified food and the Human Genome Project are but the tip of the iceberg when it comes to the potential of genomics. Indeed, since the year 2005, the exponential growth rate in our ability to read and write DNA has increased from 1.5-fold per year (a rate that matches Moore’s law), to the incredible rate of 10-fold per year (p. 243). This explosion in scientific and technological progress has resulted in dramatic advancements in the areas of biochemicals, biomaterials, biofuels and biomedicine. What’s more, advancements in these technologies are but in their incipient stage, and the future of genomics promises to dwarf these initial achievements. In their new book Regenesis: How Synthetic Biology Will Reinvent Nature and Ourselves George Church and science writer Ed Regis take us through the developments that have occurred recently in the area of genomics, and also where these developments are likely to take us in the future. Read more…
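To get a feel for just how dramatic the shift from a 1.5-fold to a 10-fold yearly growth rate is, here is a minimal arithmetic sketch. The two rates come from the preview above; the function name and the 8-year horizon are my own illustrative choices, not anything from Church’s book.

```python
# Illustrative arithmetic only: compare cumulative improvement in DNA
# read/write capability under the two growth rates cited in the preview,
# a Moore's-law-like 1.5-fold per year versus 10-fold per year.

def cumulative_fold(rate_per_year: float, years: int) -> float:
    """Total fold-improvement after compounding a yearly rate."""
    return rate_per_year ** years

years = 8
moores_law_pace = cumulative_fold(1.5, years)   # roughly 26-fold
genomics_pace = cumulative_fold(10.0, years)    # 100,000,000-fold

print(f"After {years} years at 1.5x/yr: {moores_law_pace:,.0f}-fold")
print(f"After {years} years at 10x/yr:  {genomics_pace:,.0f}-fold")
```

In other words, over the same stretch of years, compounding at 10-fold per year outpaces the Moore’s-law rate not by a factor of about 7 but by several million-fold, which is why the preview describes the progress as an explosion.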
Preview: Making decisions based on an assessment of future outcomes is a natural and inescapable part of the human condition. Indeed, as Nate Silver points out, “prediction is indispensable to our lives. Every time we choose a route to work, decide whether to go on a second date, or set money aside for a rainy day, we are making a forecast about how the future will proceed–and how our plans will affect the odds for a favorable outcome” (loc. 285). And over and above these private decisions, prognosticating does, of course, bleed over into the public realm; as indeed whole industries from weather forecasting, to sports betting, to financial investing are built on the premise that predictions of future outcomes are not only possible, but can be made reliable. As Silver points out, though, there is a wide discrepancy across industries and also between individuals regarding just how accurate these predictions are. In his new book The Signal and the Noise: Why So Many Predictions Fail–but Some Don’t Silver attempts to get to the bottom of all of this prediction-making to uncover what separates the accurate from the misguided. Read more…
Preview: Love and sex play a central role in the human drama. But when we talk about the emotions and decisions that we make in connection with them, we tend to remain strictly at the macro level, referring to people, and relationships, and our freely made choices. However, in their new book The Chemistry Between Us: Love, Sex and the Science of Attraction social neuroscientist Larry Young and journalist Brian Alexander contend that our biology and chemistry play a much bigger role in love and sex than most of us ever acknowledge (since Larry Young is the scientist behind the book [and responsible for the ideas therein], I will refer to him as the main author throughout). Young explores everything from gender identity (and sexual orientation), to romantic relationships (and parenting), to monogamy (and infidelity), taking us inside our bodies to investigate the genes and hormones that influence our approach to love, sex and relationships. While the focus here is on us humans, the evidence comes not only from our own species but from a host of other animals that exhibit similar biology and behavior. Read more…
Preview: When it comes to a child’s future success, the prevailing view recently has been that it depends, first and foremost, on mental skills like verbal ability, mathematical ability, and the ability to detect patterns–all of the skills, in short, that lead to a hefty IQ. However, recent evidence from a host of academic fields—from psychology, to economics, to education, to neuroscience–has revealed that there is in fact another ingredient that contributes to success even more so than a high IQ and impressive cognitive skills. This factor includes the non-cognitive qualities of perseverance, conscientiousness, optimism, curiosity and self-discipline–all of which can be included under the general category of ‘character’. In his new book How Children Succeed: Grit, Curiosity, and the Hidden Power of Character writer Paul Tough explores the science behind these findings, and also tracks several alternative schools, education programs and outreach projects that have tried to implement the lessons–as well as the successes and challenges that they have experienced. Read more…
Preview: Up until very recently, news out of the European Organization for Nuclear Research (CERN) regarding the progress of the new Large Hadron Collider (LHC) had been slow in coming, and nary a major discovery had been announced. On July 4th, though, all of that changed, as on that day CERN announced the discovery of nothing less than the Higgs boson, the ‘God particle’.
The potential discovery of the Higgs boson had been one of the principal reasons why physicists were so excited about the LHC; and therefore, within the scientific community the announcement was cause for a major celebration indeed. For most of the general public, however, while the announcement was certainly intriguing, there were many basic questions yet to be answered: Just what was the Higgs boson, and why had it been labeled the God particle? Why were physicists expecting to find it, and what did the discovery really mean? Adequately answering these questions was more than what journalists were able to do in their compressed news segments and newspaper articles–and, besides this, it was a task that many journalists were not up to regardless.
Jim Baggott’s new book Higgs: The Invention and Discovery of the ‘God Particle’ is meant to remedy this situation and provide the necessary context that the general public needs in order to understand the discovery of the Higgs boson and what it all means. Read more…
Preview: We spend up to a third of our lives sleeping, and yet, unless we are not getting enough of it, and/or are experiencing a sleeping disorder of some kind, most of us hardly ever give our sleep a second thought (other than to rue how much precious time it takes up). Science too largely neglected sleep for the longest time, treating it mainly as a static condition during which the brain was not doing much of anything interesting. However, ever since rapid eye movement (REM) was discovered in the 1950s the science of sleep has really taken off, and the discoveries that have come out of it go to show that this unconscious period is more interesting than we ever could have imagined. It is these discoveries that writer David K. Randall explores in his new book Dreamland: Adventures in the Strange Science of Sleep. Read more…
Preview: In a sense the story of DNA has two strands. On the one hand, as the blueprint of all that lives and the mechanism of heredity, DNA tells the story of life (and the history of life), from the smallest, simplest microbe, to us human beings, who have managed to figure all of this out. Of course, there is still much about DNA that we don’t know. But given that we didn’t even know of its existence until a lowly Swiss physician and biologist named Friedrich Miescher stumbled upon it in the 1860s, you have to admit we’ve come a long way in such a short time. And this is just where the second strand of the story of DNA begins: the story of our unraveling the mystery. While perhaps not as grandiose as the story of life itself, this detective story is significant in its own right, for it has transformed how we understand all that lives—including ourselves. This is especially the case given that the latest chapters in this story have revealed not only our own genomic blueprint, but the (deeply daunting) fact that we have the power to change this blueprint and thus become the masters of our own future as a species. While each of the strands of the story of DNA could fill a book in its own right (if not several), the author Sam Kean has managed to weave the two together and fit them both in his new book The Violinist’s Thumb: And Other Lost Tales of Love, War and Genius, as Written by Our Genetic Code. Kean’s project may seem like a particularly tall task, but he manages to pull it off by way of focusing on only the main (and/or juiciest) moments and characters throughout. Read more…
Preview: There is certainly no shortage of lying, cheating and corruption in our society today. At their worst, these phenomena do substantial damage to our communities and the people in them. Picking on the corporate world for just a moment, consider a few high-profile examples from the last decade: the scandals at Enron, WorldCom, Bernard L. Madoff Investment Securities, Haliburton, Kmart, Tyco, Bristol-Myers Squibb, and a host of banks in the financial crisis of 2008.
If you are a particularly pessimistic person, you may think that people are fundamentally self-interested, and will engage in dishonest and corrupt behaviour so long as the potential benefits of this behaviour outweigh the possibility of being caught multiplied by the punishment involved (known as the Simple Model of Rational Crime or SMORC). On the other hand, if you are a particularly optimistic person, you may think that the lying and cheating that we see in our society is largely the result of a few bad apples in the bunch.
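The SMORC described above amounts to a simple expected-value calculation, which can be sketched as follows (the function name and the dollar figures are illustrative, not drawn from the book):

```python
def smorc_cheat(benefit, p_caught, punishment):
    """Under the Simple Model of Rational Crime, a purely rational
    actor cheats whenever the expected gain exceeds the expected
    cost, i.e. whenever benefit > p_caught * punishment."""
    return benefit > p_caught * punishment

# A $1,000 gain with a 10% chance of a $5,000 fine: expected cost is
# $500, so the SMORC actor cheats.
print(smorc_cheat(1000, 0.10, 5000))  # True
# The same gain with a 50% chance of being caught: expected cost is
# $2,500, so the SMORC actor refrains.
print(smorc_cheat(1000, 0.50, 5000))  # False
```

As Ariely's experiments suggest, actual behaviour does not track this calculation nearly as cleanly as the model predicts.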
Given that the way we attempt to curb cheating and corruption depends largely on which view we think is correct, we would do well if we could come up with a proper understanding of these tendencies, and under what circumstances they are either heightened or diminished. Over the past several years, the behavioral economist Dan Ariely, together with a few colleagues, has attempted to do just this—by way of bringing dishonesty into the science lab. Ariely reveals his findings in his new book, The (Honest) Truth About Dishonesty: How We Lie to Everyone—Especially Ourselves. Read more…
Preview: Since the housing and financial crash of 2008, America’s economy has been stuck deep in the doldrums. Indeed, GDP has remained well beneath pre-2008 levels, and employment levels have failed to recover. In an effort to resuscitate the economy, the American government tried first to jump-start it through stimulus spending, and has now replaced this approach with greater austerity. Nothing seems to be working. For Nobel Prize winning economist Paul Krugman, though, the answer is clear: the problem is that the original stimulus effort was too small, and, since that time, the government has been moving squarely in the wrong direction. Indeed, Krugman argues that America’s current situation bears a striking resemblance to the stagnation of the Great Depression, and that history has taught us what to do in such situations: the government must take an aggressive approach to stimulate the economy into recovery. This is the argument that Krugman makes in his new book End This Depression Now! Read more…
Preview: Since the housing and financial crash of 2008, America’s recovery has been tepid at best. Unemployment has remained high; manufacturing has not returned; personal savings are as low as they’ve ever been, and personal debt as high; housing is still a mess, and banking not much better; and, to top it all off, government debt is awe-inspiring and seems completely intractable. According to financial investor, commentator and author Peter Schiff, while all of this is certainly disheartening, it should not come as much of a surprise. Indeed, Schiff argues that all of this economic slumping is a natural result of America’s misguided economic policies; including especially the Federal Reserve’s manipulation of interest rates, the government’s uncontrollable borrowing, and, in connection with this, the maintaining (and even expansion) of unsustainable social programs. For Schiff, these same policies led directly to the crash of ’08 (which he correctly and very famously predicted), and are leading the U.S. directly into an even worse crash now. In his new book The Real Crash: America’s Coming Bankruptcy—How to Save Yourself and Your Country Schiff outlines how America got itself into this mess in the first place, what the end game is likely to be, and what the nation and its citizens should do to make the coming unpleasantness as painless as possible. Read more…
Preview: Being the quieter, more reserved type, introverts are not as inclined as others to broadcast just who they are and what makes them tick, much less honk their own horns. However, given that Western culture has increasingly pushed introverts aside, and is intent on celebrating their opposite, it is high time that introverts stepped out of character, made themselves heard, and proclaimed to the world that they have much to offer indeed. This is the campaign that Susan Cain launches in her new book, Quiet: The Power of Introverts in a World That Can’t Stop Talking. Read more…
*Free Article #12. A Summary of Imagine: How Creativity Works by Jonah Lehrer
Preview: When we are lucky enough to be stricken with a particularly imaginative thought or creative idea, it often feels as though it is coming from outside of us—as though we are but the vehicle for its transmission. As a reflection of this, in the past artistic creativity was thought of as a force that was sent down from above, a gift from the gods that the artist was required to wait patiently for; the artist being but a vessel through which the force could act. The moment of epiphany is so sudden, so seemingly without precedent or cause, that it may seem to defy logical explanation, and hence to be outside of the bounds of scientific study. However, according to journalist and author Jonah Lehrer, science is beginning to understand how creativity works, and how it can be fostered, and it is this understanding that he brings to the table in his new book Imagine: How Creativity Works. Read more…
Preview: Since the dawn of self-awareness we human beings have struggled to understand ourselves. This struggle has found form in religion, philosophy, art and, most recently, science. The most pivotal turning point in science’s quest to understand humanity came with Charles Darwin’s theory of evolution by natural selection in the mid 19th century. While the application of this theory to understand human behaviour has taken time (and engendered a great deal of controversy), enough progress has now been made to outline the story in full, and to fill in several of the details. It is just this task that legendary biologist E.O. Wilson takes up in his new book The Social Conquest of Earth. Read more…
Preview: The old saying goes that we are never to discuss religion or politics in polite company. These topics are singled out of course because they tend to be the two that people are most passionate about, and which therefore have the greatest potential to cause enmity and strife. According to the psychologist Jonathan Haidt, the fact that we disagree over politics and religion is not necessarily such a bad thing. For him, though, the current wrangling between political and religious (and non-religious) factions has gotten rather out of hand, as it has recently reached such a pitch in the West (and particularly in America where Haidt resides) as to be threatening the very fabric of our nations.
Now, according to Haidt, at least some of the enmity and strife between people of different political and religious stripes is caused by a failure to understand precisely where these beliefs ultimately come from—as well as a failure to understand how one’s opponents understand their own beliefs. In an effort to remedy this situation, and to bring a degree of civility back into the ongoing debate, Haidt sets out to supply just these understandings in his new book The Righteous Mind: Why Good People Are Divided by Politics and Religion. Read more…
Preview: It is often said that we are creatures of habit, in that many of our daily activities end up being a matter of routine rather than direct deliberation (just think of your morning routine). While this is no doubt true, author Charles Duhigg insists that this is but the tip of the iceberg when it comes to the impact that habits have on our daily lives. Indeed, in his new book The Power of Habit: Why We Do What We Do in Life and Business Duhigg argues that habits pervade not only our personal lives, but that they have an integral role to play in the businesses and other organizations of which we are a part, and that they are also at the heart of social movements and societies at large. Read more…
Preview: It has come to the point of cliché to say that, since the industrial revolution, and particularly in the past one hundred years or so, our level of technological innovation has advanced at an unprecedented rate and reached astronomical heights. It is clear that this innovation has provided us with countless benefits, and an enormous increase in our standard of living—at least for some of us. Indeed, it is equally clear that most of these innovations have benefited the developed world much more so than the developing world. Nevertheless, the gains have been so great, and the promise so overwhelming, that much of this period has been pervaded with a palpable optimism that we would eventually reach a stage where the whole world would benefit from the largesse, and we would perhaps even reach a technological utopia.
More recently, however, this optimism has given way to uncertainty, if not an outright crisis of faith, as it has become ever more clear that our technological innovation has left us with new and increasingly pressing problems, such as dwindling resources, global warming, and a population explosion that threatens to confound (and in some cases already does confound) our advances in agricultural production and medicine. Indeed, the problems that we face are so deep and pervasive that many have come to believe that we may have to pay for our era of decadence after all, and that the future is more likely to witness a collapse than the dawn of a utopian age.
However, in their new book Abundance: The Future Is Better Than You Think, Peter Diamandis and Steven Kotler argue that we needn’t discard our techno-optimism after all. Indeed, according to Diamandis, the world is on the cusp of another explosion in technology that will soon bring relief from many of our current problems and abundance to our doorstep. Not content to let the goal or the timeline remain vague, Diamandis is happy to hang a more precise definition on each. When it comes to abundance, Diamandis defines it as “a world of nine billion people with clean water, nutritious food, affordable housing, personalized education, top-tier medical care, and non-polluting, ubiquitous energy” (loc. 317), and, to top it all off, the freedom to pursue their goals and aspirations unhindered by political repression. With regards to the timeline, Diamandis claims that it “should be achievable within twenty-five years, with noticeable change possible within the next decade” (loc. 580). Read more…
Preview: Lust, greed, gluttony, anger, sloth, envy and pride. The seven deadly sins are recognized as an integral part of the Christian (and especially the Catholic) belief system, and of Western culture more generally. Contrary to what many believe, though, the seven deadly sins did not make their first appearance in the Bible, but in the commentaries of Christian authorities in the early Middle Ages between the 4th and 6th centuries AD (loc. 2281). Equally little known is that when the seven sins did arrive on the scene, they were meant primarily as a guide to monks in how they should conduct themselves in order to make monastic living as harmonious and holy as possible (loc. 60).
Despite their late arrival in the annals of Christian belief, though—and despite the somewhat niche audience that they were originally intended for—the seven deadly sins have since developed into an important component of the Christian faith. In fact, the influence of the seven deadly sins in Western culture extends well beyond the Christian realm. Indeed, even the atheistic among us are likely to regard the seven characteristics perhaps not as sins, but at the very least as character flaws, or vices.
Nevertheless, despite the near universal condemnation of the seven deadly sins, the psychologist Simon Laham takes a very different approach to these so-called sins in his new book The Science of Sin: The Psychology of the Seven Deadlies (and Why They Are So Good for You). Indeed, as the title suggests, Laham maintains that the seven deadly sins are not nearly as bad as they are made out to be, and in fact the author argues that much good can come of them, so long as they are approached in the right way. Read more…
Preview: At first glance it may seem like our sense of disgust is a fairly marginal and narrow aspect of our everyday experience (not to mention being a little icky), and therefore, not the most appetizing candidate for deep exploration. Nevertheless, in her new book That’s Disgusting: Unraveling the Mysteries of Repulsion, psychologist Rachel Herz demonstrates that there are in fact several aspects of disgust that make it unique among the basic human emotions (which include happiness, sadness, anger, fear, surprise and disgust), and worthy of closer attention. Read more…
Preview: Our best science tells us that the universe is an ever expanding entity consisting of some 400 billion galaxies that began with a very powerful and very hot explosion from a single point some 13.72 billion years ago. Just how far our best science has advanced in the recent past is reflected in the understanding of the universe that we had just a century ago. At that time, it was thought that the universe was static and consisted of just one galaxy: our own. In the past 100 years, though, Einstein’s theory of relativity revolutionized how we understand space and time and the physical processes operating at the very largest of scales, while quantum mechanics has revolutionized how we understand these processes at the very smallest of scales. It is the development of these theories in particular that has provided us with our current understanding of the universe.
However, the picture of the universe that these theories have furnished us with still leaves us with an apparent problem: What existed before the big bang? Surely something must have existed beforehand, for if nothing existed then something (indeed everything!) came from nothing, which seems absurd. Indeed there are few things more intuitively implausible than that something can come from nothing. In the philosophical community ex nihilo nihil fit (from nothing, nothing comes) is regarded as a self-evident premise, and one of only a handful of postulates that are completely indisputable.
The apparent contradiction between the universe beginning at a finite time, and the premise that something cannot come from nothing, has often been used as an argument for the existence of an uncaused cause, or creator (most often understood as God). However, in his new book A Universe from Nothing: Why There is Something Rather than Nothing renowned physicist and cosmologist Lawrence Krauss argues that a full understanding of the science that has yielded our current picture of the universe also allows us to see that something can indeed come from nothing. Thus, for Krauss, science can in fact do the work that it is often thought only God could manage. As Krauss puts it (borrowing a line from the physicist Steven Weinberg), science does not make it impossible to believe in God, but it does make it possible to not believe in God (p. 183). In introducing us to the science that allows for the possibility of something coming from nothing, Krauss takes us through the history and evolution of physics and cosmology over the past century, beginning with Albert Einstein’s theory of relativity in 1916. In the course of this journey we learn about what our best science says about the basic make-up of our universe (including the existence of dark matter and dark energy), as well as what our best science tells us about how the universe (likely) began and where it is (likely) heading in the future. Read More…
Preview: The question of whether or not we truly have a free will has vexed humans for ages. On the one hand, it certainly feels as though we do: when it comes to the decisions that we make and the behaviour that we engage in, we experience the world as though it is ‘I’, the conscious self, who is responsible for these choices. Indeed, even though we may acknowledge that there are certain physical, biological, and social forces that influence our decisions and actions, we nonetheless feel as though ‘we’ are somehow separate from these impersonal forces, and that rather than being at their whim, it is ‘we’ who are the final arbiters in making the choices that we do. The experience of being able to choose as we wish is what we call free will, and it has traditionally been thought that it is an essential, if not the essential feature of what it means to be human.
However, as the study of the brain has progressed over the past century (and particularly in the past 40 years), the evidence seems to point more and more towards the idea that our sense of freedom, and our being in control of our choices, is a mere illusion, and that our thoughts and actions are in fact as determined as the physical world around us. The idea of a determined self not only challenges our traditional understanding of ourselves, but has practical repercussions in terms of our understanding of issues such as agency and responsibility, and forces us to ask whether we can legitimately hold people accountable for their actions. Indeed, if people truly are determined to behave as they do, then they could not reasonably be considered responsible for their behaviour, and hence it would seem to be unjust to punish them for their actions, thus throwing our entire judicial system into question. These issues have already begun to surface in our court systems, and have in fact had an impact on certain court decisions to exercise leniency on convicted offenders where this would not have occurred previously (p. 190-4).
According to neuroscientist Michael Gazzaniga, however, this whole line of thinking is both dangerous and misguided. This proves to be the case because, for him, the findings coming out of brain science do not in fact imply a determined self. Indeed, Gazzaniga claims that the idea of a determined self is based on a misinterpretation of the relationship between the mind and the brain, and that the proper interpretation of this relationship reveals that there is room for both responsibility and accountability. This is the argument that Gazzaniga makes in his new book Who’s in Charge? Free Will and the Science of the Brain. Read more…
* 2011 *
Preview: We are fresh out of a century that featured two world wars often considered to be the most destructive in history (not to mention numerous inter-state, civil and tribal wars and genocides), and are persistently submerged in news coverage that features more than its fair share of military conflict, terrorism, murder, gang violence, rape, domestic violence, child abuse and animal cruelty. As such, we may be forgiven for thinking that human beings are at least as violent as ever, if not more so. Indeed, many are persuaded that the onset of civilization some 5000 years ago has had none but a de-civilizing effect on the world and its people, and has led to an increasing level of violence as state hierarchies have grown in size and complexity, and military technology has advanced.
However, in his new book The Better Angels of Our Nature: Why Violence Has Declined, the Harvard scholar Steven Pinker argues that, all appearances to the contrary, an in-depth look at the evidence reveals that violence has in fact decreased worldwide and in virtually every category we can think of since civilization began (albeit unevenly in both time and geography, and with a few blips along the way). The evidence comes not only from anecdotal and narrative tales but from an exhaustive look at the statistics, which is altogether very convincing. Read more…
Preview: Debt is certainly a topic of deep interest and import these days, what with future prosperity seemingly threatened on all sides by a combination of personal, commercial, and national debt—a fact that has been brought home with particular poignancy in recent times by the role that debt played in the global financial crash of 2008, and the continuing threat of growing consumer debt and national debts in places such as Greece, Ireland, Portugal, and now Italy, and, of course, the US. According to anthropologist David Graeber, author of the new book Debt: The First 5000 Years, debt takes on an even larger significance when we trace its history, since this exercise allows us to gain a new and more complete understanding of economics as a whole, and our modern capitalist system in particular (not to mention several other aspects of the human condition to boot). The story of debt takes us from the origins of money itself; through to the age of slavery and conquest; on to the origins of the major world religions (with their near universal prohibitions on usury); through to the middle ages, and the beginnings of capitalism and the modern banking system; and finally on to the modern age itself with its national currencies, central banks, and commitment to market capitalism.
While this story is interesting in its own right, Graeber’s main argument here is that tracing the history of debt unearths some uneasy truths and deep flaws in the nature of modern capitalism, and it is high time, he proposes, that we rekindle the conversation about how and with what we might replace it. Read more…
Preview: As much as we rely on our brains to navigate the complex world before us, anyone who has ever forgotten someone’s name, or misread a situation, or made a poor decision in the heat of the moment knows that the brain does not always work as we would want. In his new book Brain Bugs: How the Brain’s Flaws Shape Our Lives, neurobiologist Dean Buonomano explores the brain’s many pitfalls and mistakes (and how and why it makes them), and also offers up some advice on how we can best manage these so-called ‘brain bugs’ in our everyday lives. Read more…