Month: May 2017

How Our Autistic Ancestors Played An Important Role In Human Evolution

When you think of someone with autism, what do you think of? It might be someone with a special set of talents or unique skills, such as natural artistic ability or a remarkable memory. It could also be someone with enhanced abilities in engineering or mathematics, or an increased focus on detail.

This is because, despite all the negative stories of an "epidemic of autism", most of us recognise that people with autism spectrum conditions bring a whole range of valued skills and talents, both technical and social, to the workplace and beyond.

Research has also shown that a high number of people not diagnosed with autism have autistic traits. So although many of these people have not been officially diagnosed, they might be, were they to go for autism-related tests. These people are often unaware they have these traits, don't complain of any unhappiness, and tend to feel that many of their particular traits are an advantage.

The origins of autism

This is what we mean when we talk about the autism spectrum: we are all a bit autistic, and we all fit somewhere along a spectrum of traits.

And we know through genetic research that autism and autistic traits have been part of what makes us human for a long time.

Research has shown that some key autism genes are part of a shared ape heritage, which predates the split that led us along a human path. This was when our ancient ape ancestors separated from other apes that are alive today. Other autism genes are more recent in evolutionary terms, though they are still more than 100,000 years old.

Research has also shown that autism is, for the most part, highly heritable. Though a third of cases of autism can be put down to the random appearance of genetic mistakes or spontaneously occurring mutations, high rates of autism are generally found in certain families. And for many of these families, this dash of autism can bring some advantages.

All of this suggests that autism is with us for a reason. And as our recent book and journal paper show, ancestors with autism played an important role in their social groups through human evolution because of their unique skills and talents.

Ancient genes

Going back thousands of years, people who displayed autistic traits would not only have been accepted by their societies, but could have been highly respected.

Many people with autism have exceptional memory skills, heightened perception in the realms of vision, taste and smell, and, in some contexts, an enhanced understanding of natural systems such as animal behaviour. The incorporation of some of these skills into a community would have played a vital role in the development of specialists. It is very likely these specialists would then have become vitally important for the survival of the group.

One anthropological study of reindeer herders said:

The extraordinary old grandfather had a detailed knowledge of the parentage, medical history and moods of each one of the 2,600 animals in the herd.

He was more comfortable in the company of reindeer than of humans, and always pitched his tent some way from everyone else and cooked for himself. His son worked in the herd and had been joined for the summer by his own teenage sons, Zhenya and young Sergei.

Autistic traits in art

Further evidence can be found in traits shared between some cave art and the work of talented autistic artists, such as the paintings found in Chauvet Cave in southern France, which contains some of the best-preserved figurative cave paintings in the world.

The paintings show exceptional realism, remarkable memory skills, strong attention to detail, along with a focus on parts rather than wholes.

These autistic traits can also be found in talented artists who don't have autism, but they are much more common in talented autistic artists.

Rewriting history

But unfortunately, despite this potential evidence, archaeology and narratives about human origins have been slow to catch up. Diversity has never been a part of our reconstructions of human origins. It has taken researchers a long time to move beyond the image of a man evolving from an ape-like form that we so typically associate with evolution.

It is only relatively recently that women have been recognised as playing a key role in our evolutionary past; before this, evolution narratives tended to focus on the role of men. So it's no wonder that including autism, something which is still seen as a disorder by some, is considered to be controversial.

And this is undoubtedly why arguments about the inclusion of autism and the way it must have influenced such art have been ridiculed.

But given what we know, it is clearly time for a reappraisal of what autism has brought to human origins. Michael Fitzgerald, the first professor of child and adolescent psychiatry in Ireland to specialise in autism spectrum disorder, boldly claimed in an interview in 2006 that:

"All human evolution was driven by slightly autistic Asperger's and autistic people. The human race would still be sitting around in caves chattering to each other if it were not for them."

And while I wouldn't go that far, I have to agree that without that dash of autism in our human communities, we probably wouldn't be where we are today.

Penny Spikins, Senior Lecturer in the Archaeology of Human Origins, University of York

This article was originally published on The Conversation. Read the original article.


Knowing Names Makes Cooperation More Likely

Names are powerful things. Just knowing what someone calls themselves makes us more likely to cooperate rather than compete with them, a new study has found. The discovery could prove useful in reducing conflict, but some writers of fiction might be asking scientists, "What took you so long?"

Social scientists have spent much effort testing variations of the Prisoner’s Dilemma to find out under what circumstances people prefer to cooperate for the common good, rather than seek their own advantage. It’s hardly surprising that people are more likely to cooperate with those they already trust or are likely to interact with in the future. However, according to Zhen Wang and co-authors, even the tiny connection provided by knowing another participant’s name increases the chance of cooperation.

Wang of the Northwestern Polytechnical University in China collaborated with researchers from five other countries on the project. They studied the behavior of 154 randomly paired Yunnan University undergraduates who interacted in repeated Prisoner’s Dilemma scenarios, where they had the choice to cooperate, defect, or punish the other player, with the knowledge they would probably have future rounds with the same partner.

Cooperators sacrificed one unit so the other player got two. Defectors got a unit for themselves at the expense of the other player, while those who opted for punishment lost one unit while penalizing the other player four units, discouraging past defectors from repeating their actions.
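The payoff structure described above can be sketched in a few lines of Python. Note one assumption: the article says defectors gained a unit "at the expense of" the other player without specifying the cost, so the one-unit loss to the partner below is a guess, not a figure from the study.

```python
# Payoff deltas for the three actions, as (change to actor, change to partner).
PAYOFFS = {
    "cooperate": (-1, +2),  # sacrifice one unit so the partner gains two
    "defect":    (+1, -1),  # gain a unit at the partner's expense (assumed -1)
    "punish":    (-1, -4),  # lose one unit to penalize the partner four units
}

def play_round(action_a, action_b):
    """Return (score_a, score_b) deltas for one simultaneous round."""
    a_self, a_other = PAYOFFS[action_a]
    b_self, b_other = PAYOFFS[action_b]
    # Each player's net change is their own action's self-effect
    # plus the partner's action's effect on them.
    return a_self + b_other, b_self + a_other

print(play_round("cooperate", "cooperate"))  # (1, 1): mutual cooperation pays
print(play_round("cooperate", "defect"))     # (-2, 3): the lone cooperator loses
```

Under these assumed numbers, mutual cooperation is the best joint outcome, while a lone defector profits, which is exactly the tension the Prisoner’s Dilemma is built to study.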

In Science Advances, Wang reports that knowing each other’s names was enough to induce most participants to cooperate initially, in contrast to anonymous rounds where defection dominated.

The authors acknowledge the possibility the result was enhanced by the participants being classmates of similar background. Nevertheless, the effect was large compared to those seen from other attempts to tweak Prisoner’s Dilemma outcomes.

The Prisoner’s Dilemma has been intensively studied partly because human survival depends on cooperation, so finding ways to increase it has value. This work has been further spurred by the observation that cooperation is common in nature, in defiance of crude evolutionary models, which predicted it should seldom exist. The drive to understand the discrepancy has greatly expanded what we know about the workings of evolution, including studies of the way that cooperators, by grouping together, can offset the advantage gained by free riders who profit from others’ generosity while offering none of their own.

Although this may be the first scientific proof, the idea that we care more for those whose names we know is widespread. Our names shape how people treat us, possibly enough to change how we look. In The West Wing series, Leo McGarry complained he couldn’t eat the lobsters his daughter had named, a phenomenon common enough for the audience to recognize.


Your animal life is over. Machine life has begun. The road to immortality

In California, radical scientists and billionaire backers think the technology to extend life by uploading minds to exist separately from the body is only a few years away

Here's what happens. You are lying on an operating table, fully conscious, but rendered otherwise insensible, otherwise incapable of movement. A humanoid machine appears at your side, bowing to its task with ceremonial formality. With a brisk sequence of motions, the machine removes a large panel of bone from the rear of your cranium, before carefully laying its fingers, fine and delicate as a spider's legs, on the viscid surface of your brain. You may be experiencing some misgivings about the procedure at this point. Put them aside, if you can.

You're in pretty deep with this thing; there's no backing out now. With their high-resolution microscopic receptors, the machine fingers scan the chemical structure of your brain, transferring the data to a powerful computer on the other side of the operating table. They are sinking further into your cerebral matter now, these fingers, scanning deeper and deeper layers of neurons, building a three-dimensional map of their endlessly complex interrelations, all the while creating code to model this activity in the computer's hardware. As the work proceeds, another mechanical appendage, less delicate, less careful, removes the scanned material to a biological waste container for later disposal. This is material you will no longer be needing.

At some point, you become aware that you are no longer present in your body. You observe, with sadness, or horror, or detached curiosity, the diminishing spasms of that body on the operating table, the last useless convulsions of a discontinued meat.

The animal life is over now. The machine life has begun.

This, more or less, is the scenario outlined by Hans Moravec, a professor of cognitive robotics at Carnegie Mellon, in his 1988 book Mind Children: The Future of Robot and Human Intelligence. It is Moravec's conviction that the future of the human species will involve a mass-scale desertion of our biological bodies, effected by procedures of this kind. It's a belief shared by many transhumanists, a movement whose aim is to improve our bodies and minds to the point where we become something other and better than the animals we are. Ray Kurzweil, for one, is a prominent advocate of the idea of mind-uploading. "An emulation of the human brain running on an electronic system," he writes in The Singularity Is Near, "would run much faster than our biological brains. Although human brains benefit from massive parallelism (on the order of 100 trillion interneuronal connections, all potentially operating simultaneously), the reset time of the connections is extremely slow compared to contemporary electronics." The technologies required for such an emulation (sufficiently powerful and capacious computers and sufficiently advanced brain-scanning techniques) will be available, he announces, by the early 2030s.

And this, obviously, is no small claim. We are talking about not just radically extended life spans, but also radically expanded cognitive abilities. We are talking about endless copies and iterations of the self. Having undergone a procedure like this, you would exist, to the extent you could meaningfully be said to exist at all, as an entity of unbounded possibilities.

I was introduced to Randal Koene at a Bay Area transhumanist conference. He wasn't speaking at the conference, but had come along out of personal interest. A cheerfully reserved man in his early 40s, he spoke in the punctilious staccato of a non-native English speaker who had long mastered the language. As we parted, he handed me his business card, and much later that evening I removed it from my wallet and had a proper look at it. The card was illustrated with a picture of a laptop, on whose screen was displayed a stylised image of a brain. Underneath was printed what seemed to me an attractively mysterious message: "Carboncopies: Realistic Routes to Substrate Independent Minds. Randal A Koene, founder."

I took out my laptop and went to the website of Carboncopies, which I learned was a nonprofit organisation with a goal of "advancing the reverse engineering of neural tissue and complete brains, Whole Brain Emulation and development of neuroprostheses that reproduce functions of mind, creating what we call Substrate Independent Minds". This latter term, I read, was "the objective to be able to sustain person-specific functions of mind and experience in many different operational substrates besides the biological brain". And this, I further learned, was a process "analogous to that by which platform independent code can be compiled and run on many different computing platforms".

It seemed that I had met, without realising it, a person who was actively working toward the kind of brain-uploading scenario that Kurzweil had outlined in The Singularity Is Near. And this was a person I needed to get to know.

Randal Koene: "It wasn't like I was walking into labs, telling people I wanted to upload human minds to computers."

Koene was an affable and precisely eloquent man, and his conversation was unusually engaging for someone so forbiddingly intelligent, working in so rarefied a field as computational neuroscience; so, in his company, I often found myself momentarily forgetting about the nearly unthinkable implications of the work he was doing, the profound metaphysical weirdness of the things he was explaining to me. He'd be talking about some tangential topic (his happily cordial relationship with his ex-wife, say, or the cultural differences between European and American scientific communities) and I'd remember, with a slow, uncanny suffusion of unease, that his work, were it to yield the kind of results he is aiming for, would amount to the most significant event since the evolution of Homo sapiens. The odds seemed pretty long from where I was standing, but then again, I reminded myself, the history of science is in many ways an almanac of highly unlikely victories.

One evening in early spring, Koene drove down to San Francisco from the North Bay, where he lived and worked in a rented ranch house surrounded by rabbits, to meet me for dinner in a small Argentinian restaurant on Columbus Avenue. The faint trace of an accent turned out to be Dutch. Koene was born in Groningen and had spent most of his early childhood in Haarlem. His father was a particle physicist and there were frequent moves, including a two-year stint in Winnipeg, as he followed his work from one experimental nuclear facility to the next.

Now a boyish 43, he had lived in California only for the past five years, but had come to think of it as home, or the closest thing to home he'd encountered in the course of a nomadic life. And much of this had to do with the culture of techno-progressivism that had spread outward from its concentrated origins in Silicon Valley and come to encompass the entire Bay Area, with its historically high turnover of radical ideas. It had been a while now, he said, since he'd described his work to someone, only for them to react as though he were making a misjudged joke or simply to walk off mid-conversation.

In his early teens, Koene began to conceive of the major problem with the human brain in computational terms: it was not, like a computer, readable and rewritable. You couldn't get in there and enhance it, make it run more efficiently, like you could with lines of code. You couldn't just speed up a neuron like you could with a computer processor.

Around this time, he read Arthur C Clarke's The City and the Stars, a novel set a billion years from now, in which the enclosed city of Diaspar is ruled by a superintelligent Central Computer, which creates bodies for the city's posthuman citizens and stores their minds in its memory banks at the end of their lives, for purposes of reincarnation. Koene saw nothing in this idea of reducing human beings to data that seemed to him implausible, and felt nothing in himself that prevented him from working to bring it about. His parents encouraged him in this peculiar interest, and the scientific prospect of preserving human minds in hardware became a regular topic of dinnertime conversation.

Computational neuroscience, which drew its practitioners not from biology but from the fields of mathematics and physics, seemed to offer the most promising approach to the problem of mapping and uploading the mind. It wasn't until he began using the internet in the mid-1990s, though, that he discovered a loose community of people with an interest in the same area.

As a PhD student in computational neuroscience at Montreal's McGill University, Koene was initially cautious about revealing the underlying motivation for his studies, for fear of being taken for a fantasist or an eccentric.

"I didn't hide it, as such," he said, "but it wasn't like I was walking into labs, telling people I wanted to upload human minds to computers, either. I'd work with people on some related area, like the encoding of memory, with a view to figuring out how that might fit into an overall road map for whole brain emulation."

Having worked for a while at Halcyon Molecular, a Silicon Valley gene-sequencing and nanotechnology startup funded by Peter Thiel, he decided to stay in the Bay Area and start his own nonprofit company aimed at advancing the cause to which he'd long been dedicated: Carboncopies.

Koene's decision was rooted in the very reason he began pursuing that work in the first place: an anxious awareness of the small and diminishing store of days that remained to him. If he'd gone the university route, he'd have had to devote most of his time, at least until securing tenure, to projects that were at best tangentially relevant to his central enterprise. The path he had chosen was a difficult one for a scientist, and he lived and worked from one small infusion of private funding to the next.

But Silicon Valley's culture of radical techno-optimism had been its own sustaining force for him, and a source of financial backing for a project that took its place within the wildly aspirational ethic of that cultural context. There were people there or thereabouts, wealthy and influential, for whom a future in which human minds might be uploaded to computers was one to be actively sought, a problem to be solved, disruptively innovated, by the application of money.

Brainchild of the movies: in Transcendence (2014), scientist Will Caster, played by Johnny Depp, uploads his mind to a computer program with dangerous results.

One such person was Dmitry Itskov, a 36-year-old Russian tech multimillionaire and founder of the 2045 Initiative, an organisation whose stated aim was to create technologies "enabling the transfer of an individual's personality to a more advanced nonbiological carrier, and extending life, including to the point of immortality". One of Itskov's projects was the creation of avatars: artificial humanoid bodies that would be controlled through brain-computer interface, technologies that would be complementary with uploaded minds. He had funded Koene's work with Carboncopies, and in 2013 they organised a conference in New York called Global Futures 2045, aimed, according to its promotional blurb, at the discussion of "a new evolutionary strategy for humanity".

When we spoke, Koene was working with another tech entrepreneur named Bryan Johnson, who had sold his automated payment company to PayPal a couple of years back for $800m and who now controlled a venture capital concern called the OS Fund, which, I learned from its website, invests in entrepreneurs working towards "quantum leap discoveries that promise to rewrite the operating systems of life". This language struck me as strange and unsettling in a way that revealed something crucial about the attitude toward human experience that was spreading outward from its Bay Area centre: a cluster of software metaphors that had metastasised into a way of thinking about what it meant to be a human being.

And it was the same essential metaphor that lay at the heart of Koene's project: the mind as a piece of software, an application running on the platform of flesh. When he used the term emulation, he was using it explicitly to evoke the sense in which a PC's operating system could be emulated on a Mac, as what he called "platform independent code".

The relevant science for whole brain emulation is, as you'd expect, hideously complicated, and its interpretation deeply ambiguous, but if I can risk a gross oversimplification here, I will say that it is possible to conceive of the idea as something like this: first, you scan the pertinent information in a person's brain (the neurons, the endlessly ramifying connections between them, the information-processing activity of which consciousness is seen as a byproduct) through whatever technology, or combination of technologies, becomes feasible first (nanobots, electron microscopy, etc). That scan then becomes a blueprint for the reconstruction of the subject brain's neural networks, which is then converted into a computational model. Finally, you emulate all of this on a third-party non-flesh-based substrate: some kind of supercomputer, or a humanoid machine designed to reproduce and extend the experience of embodiment (something, perhaps, like Natasha Vita-More's Primo Posthuman).

The whole point of substrate independence, as Koene pointed out to me whenever I asked him what it would be like to exist outside of a human body (and I asked him many times, in various ways), was that it would be like no one thing, because there would be no one substrate, no one medium of being. This was the concept transhumanists referred to as morphological freedom: the liberty to take any bodily form technology permits.

"You can be anything you like," as an article about uploading in Extropy magazine put it in the mid-90s. "You can be big or small; you can be lighter than air and fly; you can teleport and walk through walls. You can be a lion or an antelope, a frog or a fly, a tree, a pool, the coat of paint on a ceiling."

What really interested me about this idea was not how strange and far-fetched it seemed (though it ticked those boxes resolutely enough), but rather how fundamentally identifiable it was, how universal. When talking to Koene, I was mostly trying to get to grips with the feasibility of the project and with what it was he envisioned as a desirable outcome. But then we would part company (I would hang up the call, or I would take my leave and start walking toward the nearest station) and I would find myself feeling strangely affected by the whole project, strangely moved.

Because there was something, in the end, paradoxically and definitively human in this desire for liberation from human form. I found myself thinking often of WB Yeats's Sailing to Byzantium, in which the ageing poet writes of his burning to be free of the weakening body, the sickening heart, to abandon the dying animal for the manmade and immortal form of a mechanical bird. "Once out of nature," he writes, "I shall never take/ My bodily form from any natural thing/ But such a form as Grecian goldsmiths make."

One evening, we were sitting outside a combination bar/laundromat/standup comedy venue on Folsom Street (a place with the fortuitous name of BrainWash) when I confessed that the idea of having my mind uploaded to some technological substrate was deeply unappealing to me, horrifying even. The effects of technology on my life, even now, were something about which I was profoundly ambivalent; for all I had gained in convenience and connectedness, I was increasingly aware of the extent to which my movements in the world were mediated and circumscribed by corporations whose only real interest was in reducing the lives of human beings to data, as a means to further reducing us to profit.

The content we consumed, the people with whom we had romantic encounters, the news we read about the outside world: all these movements were coming increasingly under the influence of unseen algorithms, the creations of these corporations, whose complicity with government, moreover, had come to seem like the great submerged narrative of our time. Given the world we were living in, where the fragile liberal ideal of the autonomous self was already receding like a half-remembered dream into the doubtful haze of history, wouldn't a radical fusion of ourselves with technology amount, in the end, to a final capitulation of the very idea of personhood?

Koene nodded again and took a sip of his beer.

"Hearing you say that," he said, "makes it clear that there's a major hurdle there for people. I'm more comfortable than you are with the idea, but that's because I've been exposed to it for so long that I've just got used to it."

Russian billionaire Dmitry Itskov wants to create technologies enabling "the transfer of an individual's personality to a more advanced nonbiological carrier". Photograph: Mary Altaffer/AP

In the weeks and months after I returned from San Francisco, I thought obsessively about the idea of whole brain emulation. One morning, I was at home in Dublin, suffering from both a head cold and a hangover. I lay there, idly considering hauling myself out of bed to join my wife and my son, who were in his bedroom next door enjoying a raucous game of Buckaroo. I realised that these conditions (head cold, hangover) had imposed upon me a regime of mild bodily estrangement. As often happens when I'm feeling under the weather, I had a sense of myself as an irreducibly biological thing, an assemblage of flesh and blood and gristle. I felt myself to be an organism with blocked nasal passages, a bacteria-ravaged throat, a sorrowful ache deep within its skull, its cephalon. I was aware of my substrate, in short, because my substrate felt like shit.

And I was gripped by a sudden curiosity as to what, precisely, that substrate consisted of, as to what I myself happened, technically speaking, to be. I reached across for the phone on my nightstand and entered into Google the words "What is the human…" The first three autocomplete suggestions offered "What is The Human Centipede about", and then: "What is the human body made of", and then: "What is the human condition".

It was the second question I wanted answered at this particular time, as perhaps a back door into the third. It turned out that I was 65% oxygen, which is to say that I was mostly air, mostly nothing. After that, I was composed of diminishing quantities of carbon and hydrogen, of calcium and sulphur and chlorine, and so on down the elemental table. I was also mildly surprised to learn that, like the iPhone I was extracting this information from, I also contained trace elements of copper and iron and silicon.

What a piece of work is a man, I thought, what a quintessence of dust.

Some minutes later, my wife entered the bedroom on her hands and knees, our son on her back, gripping the collar of her shirt tight in his little fists. She was making clip-clop noises as she crawled forward; he was laughing giddily and shouting: "Don't buck! Don't buck!"

With a loud neighing sound, she arched her back and sent him tumbling gently into a row of shoes by the wall and he screamed in delighted outrage, before climbing up again. None of this, I felt, could be rendered in code. None of this, I felt, could be run on any other substrate. Their beauty was bodily, in the most profound sense, in the saddest and most wonderful sense.

I never loved my wife and our little boy more, I realised, than when I thought of them as mammals. I dragged myself, my animal body, out of bed to join them.

To Be a Machine by Mark O'Connell is published by Granta (£12.99). To order a copy for £11.04 go to or call 0330 333 6846. Free UK p&p over £10, online orders only. Phone orders min p&p of £1.99


Gifted brothers, 11 and 14, will attend college together in the fall

With summer right around the corner, most kids are looking forward to taking a break from homework and spending long days at the pool. Two Texas brothers, however, are exceptions to the rule: their love of learning already has them looking forward to next school year and hitting the books once again.

When you see Carson Huey-You and his younger brother, Cannan, on the playground they look like ordinary siblings, doing ordinary activities. But this playful duo is anything but ordinary.

“I don’t really think I’m a genius at all,” Carson says.


Most of his friends, family and educators would beg to differ. While most kids were starting kindergarten at age 5, Carson had just completed the eighth grade.

"I was 10 years old when I graduated high school," he explains.

Four years later, now 14, Carson just became the youngest person to ever graduate from Texas Christian University in Fort Worth. He majored in physics and picked up minors in mathematics and Chinese.

"It's a good language to learn. So many people speak it. You have those big businesses in China, so I started taking it in high school, and eventually when I started going here I took it," he said.

While not even old enough to drive or legally vote, Carson is able to solve math problems that would give most people nightmares. He says he enjoys learning how things work and finds physics interesting because it can be considered abstract. In fact, Carson is so fascinated with science that he plans to now pursue a master's degree in quantum mechanics at TCU.


The teen will begin his graduate program in the fall, only this time he won't be completely alone on campus. His younger brother will also attend TCU next year, after just graduating from high school at age 11.

Yes, two academically gifted children in one family.

Cannan will focus his studies on engineering, astronomy and physics because he'd like to become an astronaut when he grows up.

“I tell everyone they’re just normal kids but they’re advanced on an academic level,” their mother, Claretta Kimp, explains.

Kimp is a single mother with a background in education and business who mostly homeschooled her boys. She insists it was extremely important that she raise her children to not believe they were better than anyone else, just because of their intellect.

"I must say that every child is special," she says. "I'm humbled. I love my boys more than life and I'm so proud of them. They are such great kids and it's great to be their mom!"

Casey Stegall joined Fox News Channel (FNC) in 2007 and currently serves as a correspondent based in the Dallas bureau. He previously served as a Los Angeles-based correspondent.


Scientists Discover 40 New Genes Linked To Intelligence

As the great debate over nurture versus nature continues, a team of geneticists has identified 40 new genes that have a direct influence over human intelligence. Writing in the journal Nature Genetics, the team conclude that there are now at least 52 genes known to have a direct influence on a person's IQ.

Analyzing the genomes of 60,000 adults and 20,000 children, the team led by the Free University of Amsterdam found that these 40 new genes guide the construction of healthy neurons, as well as the synapse connections that branch between them.

It's likely that there are hundreds of additional genes that have an influence over IQ, so although this study represents the biggest haul yet in this regard, there's still a long way to go before the cartography of our cognitive abilities is complete.

The team notes that these 40 new genes, when all other factors are ruled out, explain just 4.8 percent of the variation in human intelligence seen across their subjects. If 50 percent of a person's IQ can be explained genetically, then this means that there is a huge chasm of knowledge that geneticists have yet to fill.
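A quick back-of-the-envelope calculation makes the size of that chasm concrete. The percentages come from the article itself; the subtraction is purely illustrative:

```python
# The 52 genes identified so far explain 4.8% of IQ variance,
# while heritability estimates put the total genetic share at ~50%.
explained_by_known_genes = 4.8  # percent of variance
estimated_heritability = 50.0   # percent of variance

unexplained = estimated_heritability - explained_by_known_genes
print(f"{unexplained:.1f}% of the heritable variance is still "
      "unaccounted for by specific genes")
```

In other words, even the biggest gene haul to date covers less than a tenth of the genetic contribution researchers expect to find.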

“These findings provide starting points for understanding the molecular neurobiological mechanisms underlying intelligence, one of the most investigated traits in humans,” the authors write in their study.

Just to clarify straight off the bat: these genes have an influence on intelligence, but environmental factors, including lifestyle, healthcare, socio-economic background, education, and so on, also have a huge effect.

Furthermore, IQ tests measure two types of cognitive facets, known as crystallized intelligence and fluid intelligence.

The former is the ability of a person to solve puzzles or answer questions when the parameters of the problem have already been understood or conveyed clearly; mathematics is a good example of this. Fluid intelligence is the ability to solve brand new and more abstract problems, like navigating a maze, spotting hidden patterns, or even weaving through a conversation with a complete stranger.


There are plenty of other types of intelligence, including emotional intelligence: the ability to empathize, to regulate one's own emotions, and to handle interpersonal relationships well. IQ does not take this into account, and neither do these 52 genes.

This study also looked only at the genomes of people of European descent. Other research groups will have to peer into the genetic makeup of those from other geographical settings to see if the same genes are present all over the world.

In any case, this is a remarkable study that represents a giant leap forward in our understanding of what has been referred to as the architecture of intelligence. It's a tall mountain to climb, but another ledge has just been scaled by this research team.

[H/T: Guardian]


7 “Facts” You Learned In School That Are No Longer True

Over time, even facts we consider steadfast truths can change. People used to think doctors could forgo washing their hands before surgery. Knowledge is ever-evolving.

The seven ideas below have probably changed since your school days. Re-educate yourself.

THEN: Pluto is a planet

NOW: Pluto isn’t a planet

We’ve known since the late 1800s that a ninth planet, beyond Neptune, potentially existed. In 1906, Percival Lowell, the founder of the Lowell Observatory in Flagstaff, Arizona, even began a research project intended to locate the mysterious “Planet X.”

Then in 1930, a 23-year-old newbie at the facility found it. The discoverer, Clyde Tombaugh, had been tasked with systematically comparing photographs of the sky taken weeks apart to search for any moving objects. He eventually saw one and submitted his finding to the Harvard College Observatory. After an 11-year-old English girl named the new planet (for the Roman god of the underworld), we started including Pluto as a planet in our solar system.

But in 2003, an astronomer found a larger object beyond Pluto, which he named Eris, according to NASA. The new information caused a bunch of other astronomers to question what really makes a planet a planet, and they decided, based on size and location, that Pluto just didn’t make the cut. Neither did Eris, actually. Pluto was demoted to a dwarf planet.

Needless to say, elementary school kids were pretty bummed.

But there may be hope. Researchers have recently been debating whether to make Pluto a planet again.

THEN: Diamond is the hardest substance


NOW: Ultrahard nanotwinned cubic boron nitride is the hardest substance

We’ve known about two substances harder than a diamond since 2009: wurtzite boron nitride and lonsdaleite, according to Scientific American. The first resists indentation with 18% more fortitude than a diamond, and the second a whopping 58%.

Unfortunately, both substances are rather rare and unstable in nature. In fact, the study’s authors, whose work was published in the journal “Physical Review Letters,” only calculated the new substances’ hardness rather than testing it on a tangible specimen. That makes the discovery a bit theoretical.

But another contender was published in the January 2013 issue of the journal Nature. In the simplest terms, researchers compressed boron nitride particles to form “ultrahard nanotwinned cubic boron nitride.” They simply re-organized the particles like an onion, or a flaky rose, or those little Russian dolls that fit inside one another, as the team explained to Wired.

As a result, expect women everywhere to start asking for ultrahard nanotwinned cubic boron nitride engagement rings. Because those really are forever.

THEN: Witches in Salem were burned at the stake


NOW: They were actually hanged

Even if you didn’t read Arthur Miller’s “The Crucible” in high school, you probably learned somewhere that the townspeople of Salem burned witches at the stake.

But that never happened, according to Richard Trask, a town archivist for Danvers (formerly known as Salem Village). He also chaired the Salem Village Witchcraft Tercentennial Committee from 1990 to 1992 and wrote a book detailing the time period called “Salem Village Witch Hysteria.”

At the time of the trials, New England still followed English law, which listed witchcraft as a felony punishable by hanging, not burning at the stake, Trask said. In Europe, however, the church labeled witchcraft heresy and did tie up suspected practitioners and light them on fire. You can see where the confusion started.

THEN: Israelite slaves built the pyramids

NOW: Egyptian workers built the pyramids themselves

Even movies like “The Prince Of Egypt” perpetuate the idea that slaves built the pyramids. Although many think the Bible tells us they did, the book doesn’t mention the story specifically.

This popular myth reportedly stems from comments made by former Israeli Prime Minister Menachem Begin when visiting Egypt in 1977, according to Amihai Mazar, professor at the Hebrew University of Jerusalem.

“No Jews built the pyramids because Jews didn’t exist at the period when the pyramids were built,” Mazar told the AP.

Recent archaeological finds actually show that Egyptians built the pyramids themselves. Workers were recruited from poor families in the north and south but were highly respected, earning crypts near the pyramids and even proper preparation for burial.

Slaves wouldn’t have been treated so honorably.

THEN: Folding a piece of paper more than seven times is mathematically impossible.


NOW: The record stands at 13.

Whether in art class or science, this rumor definitely spread among the masses. But Britney Gallivan, a California high school student, didn’t bite.

She, with some volunteers, bought a giant $85 roll of toilet paper and proceeded to blow everyone’s mind by folding it a surprising 11 times. She realized everyone else who tried had been alternating folding directions, and even developed an equation, based on the thickness and width of the specific paper, explaining why you shouldn’t.

Gallivan was a keynote speaker at the 2006 National Council of Teachers of Mathematics convention. She graduated from the University of California, Berkeley with a degree in Environmental Science in 2007. And since then, she’s appeared on MythBusters.

In 2012, students at St. Mark’s School in Southborough, Massachusetts, broke Gallivan’s record, folding paper 13 times.
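Gallivan's result can be made concrete. Her well-known single-direction folding formula, L = (πt/6)(2ⁿ + 4)(2ⁿ − 1), gives the minimum sheet length L needed to fold paper of thickness t in half n times. The sketch below evaluates it using an assumed thickness of 0.1 mm for ordinary paper (the thickness value is illustrative, not from the original records):

```python
import math

def min_length(t: float, n: int) -> float:
    """Gallivan's single-direction folding formula: minimum sheet
    length needed to achieve n folds of a sheet with thickness t
    (result is in the same units as t)."""
    return (math.pi * t / 6) * (2**n + 4) * (2**n - 1)

t_mm = 0.1  # assumed paper thickness in millimetres
for n in (7, 11, 13):
    metres = min_length(t_mm, n) / 1000
    print(f"{n} folds need roughly {metres:,.0f} m of paper")
```

Because the required length grows roughly as 4ⁿ, each extra fold almost quadruples the paper you need, which is why record attempts use kilometres of paper rather than bigger desks.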

THEN: The Great Wall Of China is the only man-made structure visible from space.

NOW: Many man-made places are visible from space.

Technically, this wasn’t ever a solid “truth,” just a claim third-graders ubiquitously included in their class reports and diorama presentations. In fact, rumors that you can see the landmark not only from a spaceship but all the way from the moon date back as far as 1938.

In 2003, though, the first Chinese astronaut finally shattered the myth.

The party responsible, a man named Yang Liwei, admitted he couldn’t see the Great Wall from space, according to NASA.

Other photos surfaced here and there. The consensus became that you can, indeed, catch glimpses of the Wall, but only under the right conditions (snow on the structure) or with a zoom-capable camera. You can also see the lights of large cities, as well as major roadways, bridges, airports, dams, and reservoirs.

The moon factoid, however, is totally wrong.

“The only thing you can see from the Moon is a beautiful sphere, mostly white, some blue and patches of yellow, and every once in a while some green vegetation,” Apollo 12 astronaut Alan Bean told NASA. “No man-made object is visible at this scale.”

To further clarify, people probably mean these structures are visible from satellites orbiting Earth, which is a far cry from seeing them from the Moon.

THEN: Five (or three) kingdoms of classification exist.


NOW: There might be as many as eight kingdoms.

Depending on when you grew up, your middle school science teacher probably lectured about three main kingdoms of life (animals, plants, and bacteria, or monera) or five, adding fungi and protists, too.

Either way, we’ve expanded our classification of life since then.

The more species we find and analyze, the more complex labeling life becomes. In addition to the five kingdoms above, we now know of archaea, previously thrown under monera. Archaea superficially look like other one-celled organisms called eubacteria, but they’re completely different.

Even larger systems exist which further divide eubacteria into two more kingdoms or separate chromista from all the other protists.

In the U.S., however, we stick with six: plants, animals, protists, fungi, archaebacteria, and eubacteria.

Christina Sterbenz contributed to a previous version of this story

Read the original article on Business Insider. Copyright 2017.



UK schoolboy corrects Nasa data error – BBC News


A British teenager has contacted scientists at Nasa to point out an error in a set of their own data.

A-level student Miles Soloman found that radiation sensors on the International Space Station (ISS) were recording false data.

The 17-year-old from Tapton school in Sheffield said it was “pretty cool” to email the space agency.

The correction was said to be “appreciated” by Nasa, which invited him to help analyse the problem.

“What we got given was a lot of spreadsheets, which is a lot more interesting than it sounds,” Miles told BBC Radio 4’s World at One programme.

The research was part of the TimPix project from the Institute for Research in Schools (IRIS), which gives students across the UK the chance to work on data from the space station, looking for anomalies and patterns that might lead to further discoveries.

During UK astronaut Tim Peake’s stay on the station, detectors began recording the radiation levels on the ISS.

“I went straight to the bottom of the list and I went for the lowest bits of energy there were,” Miles explained.

Miles’s teacher and head of physics, James O’Neill, said: “We were all discussing the data but he just suddenly perked up in one of the sessions and went ‘why does it say there’s -1 energy here?'”

What Miles had noticed was that when nothing hit the detector, a negative reading was being recorded.

But you cannot get negative energy. So Miles and Mr O’Neill contacted Nasa.

“It’s pretty cool”, Miles said. “You can tell your friends, I just emailed Nasa and they’re looking at the graphs that I’ve made.”

It turned out that Miles had noticed something no-one else had – including the Nasa experts.

Nasa said it was aware of the error, but believed it was only happening once or twice a year.

Miles had found it was actually happening multiple times a day.
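The anomaly Miles spotted is easy to reproduce in miniature. The snippet below is a hypothetical sketch, not the actual ISS data (the readings and units are made up): it scans a list of detector energy readings and flags the physically impossible negative values:

```python
# Made-up sample of detector energy readings in keV; -1.0 stands in
# for the glitch value recorded when nothing hits the sensor.
readings = [12.4, 0.3, -1.0, 5.7, -1.0, 8.1]

# Energy cannot be negative, so any reading below zero is an anomaly.
anomalies = [(i, e) for i, e in enumerate(readings) if e < 0]
print(f"{len(anomalies)} anomalous readings found:", anomalies)
```

Run over a whole spreadsheet of sensor data, the same one-line check reveals how often the glitch actually occurs, which is essentially what Miles did.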


Prof Larry Pinksy, from the University of Houston, told Radio 4: “My colleagues at Nasa thought they had cleaned that up.

“This underscores – I think – one of the values of the IRIS projects in all fields with big data. I’m sure there are interesting things the students can find that professionals don’t have time to do.”

The professor – who works with Nasa on radiation monitors – said the correction was “appreciated more so than it being embarrassing”.

What do Miles’ friends think of his discovery?

“They obviously think I’m a nerd,” the sixth-former said. “It’s really a mixture of jealousy and boredom when I tell them all the details.”

He added: “I’m not trying to prove Nasa wrong. I want to work with them and learn from them.”

The director of IRIS, Prof Becky Parker, said this sort of “expansion of real science in the classroom” could attract more young people to STEM subjects (science, technology, engineering, mathematics).

She added: “IRIS brings real scientific research into the hands of students no matter their background or the context of the school. The experience inspires them to become the next generation of scientists.”


Pupils need internet lessons to thrive online, say Lords – BBC News


Learning to survive in a world dominated by the internet should be as important for children as reading and writing, says a House of Lords report.

Lessons about online responsibilities, risks and acceptable behaviour should be mandatory in all UK schools, the Lords Communications Committee argues.

The internet is “hugely beneficial” but children need awareness of its hazards, said committee chairman Lord Best.

Industry leaders said education was key to keeping children safe online.

The Lords report builds on findings by the Children’s Commissioner for England in January that the internet is not designed for children, despite them being the biggest users by age group.

“Children inhabit a world in which every aspect of their lives is mediated through technology: from health to education, from socialising to entertainment.

“Yet the recognition that children have different needs to those of adults has not yet been fully accepted in the online world,” say the Lords.

Fake news

Lord Best added: “There is a lot of material which makes the internet harmful but it can also be hugely beneficial – a way for children to interact and find out about the world.”

However, they need to cope with online pornography, internet grooming, sexting and body image issues, he said, as well as building resilience to the addictive properties of internet games which are “designed and developed to keep users online, missing out on sleep as they stay in their bedrooms glued to the screen”.

Children also need to be aware of the dangers of fake news and covert advertising online, he added.

The report argues that “digital literacy should be the fourth pillar of a child’s education alongside reading, writing and mathematics and be resourced and taught accordingly”.

It should form the core of a new curriculum for personal social health and economic education, it adds.

It backs the government’s move to make sex and relationships education statutory in England but says PSHE should also be mandatory in all schools, with the subject included in inspections.


The report notes “a worrying rise in unhappy and anxious children emerging alongside the upward trend of childhood internet use” and calls for more robust research into a “possible causal relationship” alongside immediate action to prevent children being affected.

Overall, the report says the internet should “do more to promote children’s best interests” but found self-regulation by industry was “failing” and that commercial interests “very often” took priority.

Meanwhile, it adds, government responsibility is “fragmented” with little co-ordinated policy and joined-up action.

Other recommendations include:

  • Content control filters and privacy settings to be “on” by default for all customers
  • All online businesses to respond quickly to requests by children to remove content
  • A children’s digital champion to be appointed to argue for their rights at the highest levels of government
  • An industry summit, chaired by the prime minister, on redesigning the internet to serve children better

“This issue is of such critical importance for our children that the government, civil society and all those in the internet value chain must work together to improve the opportunities and support where the end user is a child,” the Lords conclude.

The Internet Services Providers’ Association rejected calls for stronger regulation, while backing the report’s call for better education.

James Blessing, who chairs the ISPA, said that the UK was regarded as a world leader in keeping children safe online “through a self-regulatory approach”.

“We believe the most effective response is a joint approach based on education, raising awareness and technical tools,” he said.

The government said it wanted to make the UK the safest place in the world for young people to go online.

“Ministers have begun work on a new internet safety strategy that will help make this a reality, and we will carefully consider the recommendations included in the Lords Communications Committee Report as part of this process,” said a spokesman.



Celebrate Women’s History Month With 20 Women In Science Who Changed The World

Simply put, women are amazing.

Not only do they create and give birth to tiny humans, but they’ve proven throughout history that they’re pretty damn brave and intelligent, too. They’ve had to fight for their rights and prove themselves worthy over and over again, which is why they’re honored every March. And what better way to celebrate Women’s History Month than to introduce you to women who’ve made scientific advancements in the past as well as those who are changing the way we live now?

Get ready to burst with pride, ladies, because these 20 women in science are too good for this world.

1. Miriam Daniel Mann, who had a chemistry degree with a minor in mathematics, got a job at the National Advisory Committee for Aeronautics (NACA), NASA’s predecessor, as a human computer in the 1940s. Her work helped put astronaut John Glenn into orbit, but she also fought against segregation at NASA.

2. Cynthia Kenyon, a molecular biologist and geneticist, is currently working with a team of scientists at Calico, Google’s nascent biotechnology company, to find ways to slow aging and prevent age-related diseases. In 1993, she discovered that altering a single gene in roundworms could double their lifespan. She’s been able to discover which genes help us live longer and has a goal of extending human lives by 100 years.

Read More: These Powerful Photos Of Women Making History Are Incredibly Inspiring

3. Marine biologist and writer Rachel Carson published a book called “Silent Spring” in 1962, which warned about the dangers of pesticides. It had a huge historical impact and led to lethal pesticides being banned in the U.S.

5. Social psychologist Jennifer Eberhardt studies the way people profile others based on race, specifically when it comes to law enforcement officials. She works with police to help create better policies and build better relationships with the communities they serve.

6. Anthropologist and primatologist Dian Fossey studied and developed close contacts with the mountain gorillas of the Virunga Volcano region of Rwanda. Her studies are credited for providing the basis of our understanding of the behavior and social life of gorillas.

7. Nina Tandon, CEO and cofounder of EpiBone, is using people’s stem cells to grow human bone that can be used to repair bone loss and other defects.

9. Chemist and pharmacologist Gertrude B. Elion was hired by Burroughs Wellcome pharmaceuticals in 1944 and developed 6-mercaptopurine, a drug used in chemotherapy to treat children with leukemia. Among her other contributions, she developed azathioprine, which helps prevent rejection after organ transplant surgery.

10. Mae Carol Jemison became the first black woman to travel to space, aboard the Space Shuttle Endeavour on September 12, 1992.

11. Cecilia Helena Payne-Gaposchkin not only became the first person to earn a Ph.D. in astronomy from Radcliffe, but she also determined in 1925 that hydrogen and helium were the most abundant elements in stars.

12. Jane Goodall is credited for revolutionizing the field of primatology with her decades spent observing and studying the behavior of the Gombe chimpanzees in Tanzania.

13. Mathematician Grace Murray Hopper helped program the first computers in 1944, contributed to furthering software development concepts, and invented the first compiler for a computer programming language.

14. Computer scientist Ada Lovelace could be considered the world’s first computer programmer. The notes she made about Charles Babbage’s proposed calculating machines in the 1800s are now recognized as early models for computers and software. The programming language “Ada” was named after her.

15. Judith Resnik was one of the first women to enter the U.S. space program and became the second American woman ever to fly in space in 1984. Unfortunately, she was one of the people who tragically died when the space shuttle Challenger exploded.

16. Margaret Mead was the first anthropologist to study human development from a cross-cultural perspective in America, Samoa, Bali, and New Guinea. She was also a leader of the women’s movement during the 1960s.

18. Rosalyn Sussman Yalow helped develop the radioimmunoassay (RIA) technique, which uses radioisotopes to measure hormone levels in the blood. This made it possible to screen donor blood for diseases like hepatitis, and it can detect conditions like hypothyroidism in infants.

(via Eastern Illinois University and Biography and Business Insider)

Read More: In Honor Of International Women’s Day, Here Are Some Incredible Women To Watch

19. Emmy Noether is known as a pioneer in the field of abstract algebra who worked during the early 1900s. She developed the theories of rings, fields, and algebras, and even explained the connection between symmetry and conservation laws. Albert Einstein himself described her as the most important woman in the history of mathematics.

20. Seismologist and geophysicist Inge Lehmann discovered that the Earth has a solid inner core inside a molten outer core in 1936, when it was believed that Earth’s core was a single molten sphere.


Forget health care — this startup offers cryonic freezing as an employee benefit

Generous employee perks are as much a part of the tech industry as long work hours, office Nerf gun battles, and people overusing the word “disruption.” But while most firms only go so far as free meals, on-site yoga classes, and maybe the occasional indoor climbing wall, an artificial intelligence-driven hedge fund is taking things to the next level.

The good news? Numerai‘s new employee benefit is, quite literally, the coolest one we have heard about. The bad news? You won’t be able to enjoy it until you’re dead.

“We are allowing employees cryonic body preservation as a benefit,” Richard Craib, founder of Numerai, told Digital Trends. “Employees sign up through a life insurance policy, and upon legal death, the life insurance claim is handed over to cryonics provider Alcor.”

While the idea of whole-body cryonic preservation as a benefit isn’t necessarily going to appeal to everyone, the hope is that it will appeal to the right kind of people, who will have something to bring to Numerai. That means folks with an interest (and, preferably, plenty of impressive qualifications) in artificial intelligence. “Strong education backgrounds in mathematics and statistics are also advantageous,” Craib continued.
