Tag: Technology

Researchers share $22m Breakthrough prize as science gets rock star treatment

Glitzy ceremony honours work including that on mapping post-big bang primordial light, cell biology, plant science and neurodegenerative diseases

The glitziest event on the scientific calendar took place on Sunday night when the Breakthrough Foundation gave away $22m (£16.3m) in prizes to dozens of physicists, biologists and mathematicians at a ceremony in Silicon Valley.

The winners this year include five researchers who won $3m (£2.2m) each for their work on cell biology, plant science and neurodegenerative diseases, two mathematicians, and a team of 27 physicists who mapped the primordial light that warmed the universe moments after the big bang 13.8 billion years ago.

Now in their sixth year, the Breakthrough prizes are backed by Yuri Milner, a Silicon Valley tech investor, Mark Zuckerberg of Facebook and his wife Priscilla Chan, Anne Wojcicki from the DNA testing company 23andMe, and Google's Sergey Brin. Launched by Milner in 2012, the awards aim to make rock stars of scientists and raise their profile in the public consciousness.

The annual ceremony at Nasa's Ames Research Center in California provides a rare opportunity for some of the world's leading minds to rub shoulders with celebrities, who this year included Morgan Freeman as host, fellow actors Kerry Washington and Mila Kunis, and Miss USA 2017 Kára McCullough. When Joe Polchinski at the University of California in Santa Barbara shared the physics prize last year, he conceded his nieces and nephews would know more about the A-list attendees than he would.

Oxford University geneticist Kim Nasmyth won for his work on chromosomes but said he had not worked out what to do with the windfall. "It's a wonderful bonus, but not something you expect," he said. "It's a huge amount of money; I haven't had time to think it through." On being recognised for what amounts to his life's work, he added: "You have to do science because you want to know, not because you want to get recognition. If you do what it takes to please other people, you'll lose your moral compass." Nasmyth has won lucrative awards before and channelled some of his winnings into Gregor Mendel's former monastery in Brno.

Another life sciences prizewinner, Joanne Chory at the Salk Institute in San Diego, was honoured for three decades of painstaking research into the genetic programs that flip into action when plants find themselves plunged into shade. Her work revealed that plants can sense when a nearby competitor is about to steal their light, sparking a growth spurt in response. The plants detect threatening neighbours by sensing a surge in the particular wavelengths of red light that are given off by vegetation.

Chory now has ambitious plans to breed plants that can suck vast quantities of carbon dioxide out of the atmosphere in a bid to combat climate change. She believes that crops could be selected to absorb 20 times more of the greenhouse gas than they do today, and convert it into suberin, a waxy material found in roots and bark that breaks down incredibly slowly in soil. "If we can do this on 5% of the landmass people are growing crops on, we can take out 50% of global human emissions," she said.
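Chory's arithmetic can be sanity-checked with a quick back-of-envelope sketch. The figures below (the 20x multiplier, the 5% land fraction, the 50% offset target) come from her quote; the derived baseline is simply what those numbers jointly imply, not anything stated in the reporting.

```python
# Back-of-envelope check of the quoted claim (illustrative only).
multiplier = 20       # enhanced crops absorb 20x more CO2 than today's
land_fraction = 0.05  # share of cropland converted to enhanced crops
target_offset = 0.50  # fraction of global human emissions to remove

# The enhanced plots would remove (multiplier * land_fraction) times what
# all current cropland absorbs, so hitting the target implies that current
# cropland already cycles this fraction of annual emissions:
implied_baseline = target_offset / (multiplier * land_fraction)
print(implied_baseline)  # 0.5
```

In other words, the claim implicitly assumes today's crops already draw down the equivalent of roughly half of annual human emissions; the suberin pathway would lock that carbon into soil rather than let it cycle back out.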

Three other life sciences prizes went to Kazutoshi Mori at Kyoto University and Peter Walter for their work on quality control mechanisms that keep cells healthy, and to Don Cleveland at the University of California, San Diego, for his research on motor neurone disease.

The $3m Breakthrough prize in mathematics was shared by two British-born mathematicians, Christopher Hacon at the University of Utah and James McKernan at the University of California in San Diego. The pair made major contributions to a field of mathematics known as birational algebraic geometry, which sets the rules for projecting abstract objects with more than 1,000 dimensions onto lower-dimensional surfaces. "It gets very technical, very quickly," said McKernan.

Speaking before the ceremony, Hacon was feeling a little unnerved. "It's really not a mathematician kind of thing, but I'll probably survive," he said. "I've got a tux ready, but I'm not keen on wearing it." Asked what he might do with his share of the winnings, Hacon was nothing if not realistic. "I'll start by paying taxes," he said. "And I have six kids, so the rest will evaporate."

Chuck Bennett, an astrophysicist at Johns Hopkins University in Baltimore, led a Nasa mission known as the Wilkinson Microwave Anisotropy Probe (WMAP) to map the faint afterglow of the big bang's radiation that now permeates the universe. The achievement, now more than a decade old, won the 27-strong science team the $3m Breakthrough prize in fundamental physics. "When we made our first maps of the sky, I thought: these are beautiful," Bennett told the Guardian. "It is still absolutely amazing to me. We can look directly back in time."

Bennett believes that the prizes may help raise the profile of science at a time when it is sorely needed. "The point is not to make rock stars of us, but of the science itself," he said. "I don't think people realise how big a role science plays in their lives. In everything you do, from the moment you wake up to the moment you go to sleep, there's something about what you're doing that involves scientific advances. I don't think people think about that at all."

Read more: https://www.theguardian.com/science/2017/dec/04/researchers-share-22m-breakthrough-prize-as-science-gets-rock-star-treatment


What should I teach my children to prepare them to race with the robots?

I must prepare my sons to adapt to the fourth industrial revolution, but that means sending them to schools that are equipped to exceed the averages

Years ago, as a reporter in Seattle, I watched Microsoft CEO Steve Ballmer decry Washington state's education system. He said Microsoft couldn't hire enough locals because our schools don't produce the kinds of minds he needed.

At the time, I was angry. He and his cohort, most notably Jeff Bezos of Amazon, contributed serious money to the campaign against a state income tax on the wealthy that would have funneled billions to our schools. Now I feel a pinch deep in my stomach, an emotion so primal I hesitate to name it.

As a mother, my time is come, or nearly done, and my children's just begun.

Automation will absorb all of the jobs it can reach, whether on the factory floor or in an office. Artificial intelligence has already taken over the corporate earnings analyses I once produced as a business journalist. By the best measures I've been able to find, machines will displace about half of American jobs by the time my toddlers look for work.

This new era has been called the second machine age, the fourth industrial revolution, the information economy.

From certain angles, Seattle residents seem well positioned to access the highly paid and creative jobs that arise from combining cutting-edge technologies with the exponential powers of computing and big data. My city is now considered a global city not because of the port, which put our state on the maps when they were still being drawn, but because of the presence of Microsoft, Amazon and numerous tech startups.

Amazon occupies one fifth of all office space in downtown Seattle, a short ride from my neighborhood on light rail. Incoming waves of well-educated tech workers have helped double the median home price during the past five years.

Many of these rich young people call themselves progressive. Are they proud to be joining the nation's most regressive tax structure? In our state, poor people pay eight times as much of their family income in taxes as the wealthy 1%. Lacking a personal income tax, Washington state relies on sales tax and has long looked to levies to fund schools, parks and other social needs.

When I moved to Seattle in 2004, I marveled that the state didn't take a cut of my income from the now-defunct Seattle Post-Intelligencer. It took me a while to contemplate what it means for an entire society to act against the interests of its children.

College-level tuitions before college

To survive the extinction of an entire class, I must prepare my two- and three-year-old sons to race with the robots, and not against them.

Our kids are going to meet an economy with far fewer entry-level positions and will have to clamber up a receding ladder. That means being in schools equipped to exceed the averages, not rising to meet them.

Washington state has underfunded our schools so long that our government's negligence was deemed unconstitutional by our state supreme court, which fined the state $100,000 a day for failing to provide a future for our children.

Years into this public shaming, the legislature came up with a multibillion-dollar package to fund basic education in our state, though they didn't manage to pass a capital budget before students went back to school after a long, dry summer.

Amazon Go opens to Amazon employees in its Beta program in Seattle. Photograph: Paul Gordon/Zuma Press / eyevine

From my porch, I can see the chain-link fence blur into gray around the asphalt playground of our neighborhood public school. On weekday mornings, my closest friends walk to Hawthorne Elementary with their children, ducklings that cluster at crosswalks along streets known for gunfire. A new home just sold for nearly a million dollars at the end of our block, but people keep getting shot and dying at our community playfield.

Despite valiant efforts by its admirable principal, committed educators, engaged parents and resilient students, Hawthorne has been labeled failing since long before my husband and I bought a peeling house from a nice couple who raised their family here.

Less than half of the school's fourth and fifth graders meet the state's standards in math, which makes me doubt that our educational system is preparing these kids to thrive in the glittering economy they were born under. Five years ago, the office of the superintendent of public instruction ranked Hawthorne among the bottom 5% of the state, according to test passage rates.

This, in a city known for minting billionaires.

In The Second Machine Age, authors Erik Brynjolfsson and Andrew McAfee, both MIT professors, recommend Montessori programs to prepare children for their future, with a focus on science, technology, engineering, arts and math. That's Steam, for those not versed in educational acronyms.

Developed to help poor children realize their own innate potential, Montessori schools practice self-directed learning with tactile materials that encourage the freewheeling creativity that formed tech CEOs such as Bezos and Google's co-founders.

The private bilingual Montessori kindergarten I found 30 minutes away costs $20,000 a year.

Despite college-level tuitions, about one quarter of Seattle students opt out of the public school system to study at private or parochial schools. To send my sons to Seattle's best private schools would cost more than $700,000, and that's before they get to college.

A survey of public schools in Seattle shows no Montessori options that my children can access, though a nearby program in Leschi was a success at first, drawing wealthier students into the public school system, bringing with them the engagement of their families.

The Leschi teachers were so distressed by the resulting racial, linguistic and housing disparities between the traditional and Montessori classes that they melded the programs, rather than working to recruit more students of color into the Montessori program, which they could not afford to expand. A taskforce opted against including technology in the curriculum, fearful they would attract too many white families.

I believe in diversity; my own blood is blended. A first-generation Latinx, I've invested years of effort to raise my sons to be bilingual. I also want to work toward equity in a city whose neighborhood schools reflect the segregation compelled by redlining and white flight.

Leschi's students are learning hard truths about equity, but they're improving together. Maybe that's enough. But I worry when well-intentioned people lacking the resources to serve their students equally decide against teaching technology, the lingua franca of our world. Even the state administers student tests by computer.

I sought answers from Chris Reykdal, state superintendent of public instruction. "The injustice of it all is that we have never seen technology as a core learning," Reykdal said. "Do we still consider technology an enrichment, or should it be a more profound part of basic education? The state hasn't made that decision yet."

Washington has hundreds of school districts overseen by elected boards that enact tangled mandates without the resources to see them through. All over the state, schools used levy monies to take care of basics and pay their teachers, rather than acquiring and teaching technology.

Deb Merle is Governor Jay Inslee's K-12 education adviser. Merle said that designating technology as part of basic education, which would ensure that the dollars flowed to their purpose, is not a state priority, though she recognized that Reykdal's predecessor also advocated for keeping technology funds separate.

"I don't think we teach enough science, period. That's what I spend a lot of time worrying about, not what kind of science," Merle said. "Our elementary schools teach less than one hour per week of science."

Steam as a social justice issue

I kept dialing, determined to maintain the education-fueled trajectory of my family.

My kin have lived in dictatorship-induced diaspora since famine swept Spain under Franco; they later fled Batista, who ruled Cuba before Castro. I am not conditioned to expect social stability as a condition of being for any country.

The meeting I most dreaded was closest to home. On the short walk to our neighborhood school, I decided to come right out and tell its principal, Sandra Scott, that I am afraid to send my kids to Hawthorne because the school's test scores, though on the rise, are low enough to make me wince.

Luckily, Scott is a pragmatic visionary, the kind of principal who inspires parents to put down the remote and join the PTA. Since 2009, Scott has led Hawthorne's revitalization, winning admiration and awards from Johns Hopkins University for her program of school, family and community partnerships.

"Test scores don't define who the students are. Our kids are not a number," Scott said. "There were things we needed to do differently or better, like improving the academics and the school culture, to bring families back into the community."

To face the age of automation, it is recommended children are taught a program with a focus on science, technology, engineering, arts and math. Photograph: Will Walker / NNP

Recognizing the opportunity that Seattle's tech economy presents, Scott retooled Hawthorne to focus on Steam programming. Rather than cluster the high-performing test takers together, which has segregated programs within diverse schools, Hawthorne distributes them throughout classrooms. If a student excels in math, outstripping peers in that grade's curriculum, the teacher walks that child to the next grade for math.

When it comes to fifth-grade science, those efforts more than doubled the test passage rates over three years, from 20% to 46%. I ache upon rereading that last sentence: the hope and pride in the increase, the grimace I can't help but make at where they started, and what remains to be accomplished.

Scott and her staff find ways to make progress. But she doesn't have the funds for a technology teacher or trainings, so the lab will be largely unused this year. As a mother who cares about the kids who go to Hawthorne, I can't afford to wait for someone else to find those resources.

The leaders of this school are working to undo the effects of intergenerational poverty that dates back to slavery and other forced migrations. More than half of the students are eligible for free and reduced lunches. A quarter of the students are learning the language they're taught in. Scores reflect circumstances, which is why Reykdal is refocusing the state on racial gaps, poverty gaps and English language gaps, down to the school level.

Many of the jobs first displaced by automation belong to people of color, women and others who depend on a combination of part-time positions. A federal council of economic advisers found an 83% likelihood that, by 2040, automation would displace jobs paying less than $20 per hour.

In Washington, Steam-related jobs pay double the median wage, for starters. The people moving here to work for Microsoft, Amazon and Boeing make much more. When we choose not to provide public schools with the resources needed to provide educational access to those opportunities, we are consigning local students to lesser-paid sectors of the economy, the very same that are vulnerable to automation. In other words, we are allowing our government to consecrate our children to poverty in real time.

Mass unemployment would make American society more violent, our law enforcement more brutal and our peoples more vulnerable to genocide. Automation is a social justice issue, and if history is any teacher, it shows us that vast swaths of disenfranchised peoples are a harbinger of war.

Problems that reflect the world

Whenever I have a problem that's too big to solve, I call my dad, and we argue about what to do. He told me the solution was simple: I should move. The only financially feasible choice would be the suburbs.

Something in me balks at leaving a city I love, and especially our neighborhood, where my children are happy. As a community, we just celebrated our 10th annual block party, a Cuban pig roast that my husband and I organize for our wedding anniversary. Our neighbors come bearing side dishes, canopies and games, and we dance until the DJs stop playing. The conversations we start on that night have lasted a decade. I want to stay.

As native Spanish speakers, my sons could opt into the bilingual public schools on the other side of our gridlocked downtown, north of the covenants which kept people of color from buying homes. Those schools' wait lists are legendary, but I am uncomfortable with the mostly white and relatively well-off demographics produced by saving only 15% of seats for native speakers. I want my kids to feel at home in a country that contains multitudes, which is why we moved to one of our nation's most diverse zip codes.

Computers solve the problems they're given. And so we must ask ourselves what we value, and whom.

Not every child wants to be a robotics engineer. But without the modes of thought elicited by learning computer science from an early age, many Washington state students will not be competitive for the jobs that remain. I want my own sons to be chosen, and better yet able to choose, as I was, though I fell for a profession whose financial structures imploded five years after my college graduation.

I hope my privileged vulnerability encourages you to reflect on those truly trapped by our system. This essay invokes my worries as a mother, and with them, my socioeconomic position. Hawthorne is a happy place with diverse classrooms whose problems reflect the world, but I am glad of the years I have left to decide what my kids truly need to learn.

There can be no denying that I am one of the gentrifiers of this neighborhood, and with the honor of living here comes the responsibility to contribute. Looking at what's coming in the second machine age (tremendous opportunities, to be sure, but also massive loss of what we've known as jobs), I feel compelled to join those working toward a better future, minds whirring whenever problems arise.

Two nonprofits, FIRST Washington and XBOT Robotics, have offered support and equipment for Hawthorne to start a Lego robotics league after school. Four parents signed up to lead teams during last night's PTA meeting, my very first.

Its a start.

Get involved

To bolster Steam education for students, hybridized systems have sprung up as non-profits seek to prepare our children for the economy we will leave to them.

FIRST Washington: This nonprofit helps start and sustain after-school Lego robotics leagues for K-12 students.

XBOT Robotics: Operating in one of the nations most diverse zip codes, offering robotics programming K-12.

Code.org: Free online programming for learners at all levels. Work through problems with your kids.

Technology Access Foundation: Helping people of color access Stem-related education in middle school, high school and beyond.

Washington State Opportunity Scholarship: A non-profit that funds thousands of Stem scholarships for Washington's college-bound high school graduates. More than half of those scholarship recipients are students of color, women and/or the first in their family to access a higher education, if not all three.

Teals (Technology, Education and Literacy in Schools): Matches professionals with teachers to co-teach computer science in classrooms.

Seattle Mesa (Mathematics Engineering Science Achievement): Provides scholarships, in-class math and science projects, advanced learning opportunities, tutoring, math camp and teacher trainings.

Read more: https://www.theguardian.com/education/2017/oct/18/what-should-i-teach-my-children-to-prepare-them-for-jobs-in-their-era


Breaking the code: how women in Nigeria are changing the face of tech

Female developers are emerging as influential forces in the country's booming technology sector, but the stigma persists that computing is a male industry

The Nigerian tech scene is booming. Last year, Lagos-based startup Andela received $24m (£18.5m) in funding from Mark Zuckerberg. In 2015, financial technology startup Paystack, one of the first Nigerian tech companies to be accepted into the renowned California-based startup accelerator Y Combinator, secured approximately $1.3m in seed investment from international investors.

Within this growth, women are emerging as influential forces, and changing the face of technology in Africa, especially in the fields of agricultural and financial tech. This is despite the fact that, as recently as a decade ago, women were grossly underrepresented in and excluded from the industries they are now helping to shape.

"I think those who are joining the tech world today have an easier path to tread," says Nnenna Nwakanma, a Nigerian activist for accessible internet. "There were situations where people would refuse to recognise my authority, but would patronise or objectify me, or refuse to fulfil contracts they had willingly entered into, all because of my gender." Despite this, Nwakanma co-founded the Free Software and Open Source Foundation for Africa (FOSSFA) and is now a senior policy manager for the World Wide Web Foundation, where she supports digital equality and promotes the rights of Nigerian women online.

The negative attitude towards women's involvement in science, technology, engineering and mathematics (Stem) is starting to change, thanks partly to initiatives such as the Stem outreach and mentoring programmes established by the Working to Advance Science and Technology Education for African Women (WAAW) Foundation, which operates in 11 countries. There is also Intel's programme She Will Connect Africa, which has trained more than 150,000 women in Nigeria, South Africa and Kenya in digital literacy since it launched in 2013.

The demand for tech talent is now such that it cannot be met by men alone. Rapid digitalisation in Nigeria is heavily concentrated in the country's metropolitan megacity, Lagos. Here, the startup culture flourishes, while big businesses have moved in: in 2015, global tech supplier Bosch opened a subsidiary in Ikeja, the capital of Lagos state, and Microsoft has an office in the affluent Lagos neighbourhood of Ikoyi.

Ire Aderinokun, the author of web development blog bitsofco.de, a front-end developer and Nigeria's first female Google Developer Expert, says her love of tech started as a hobby. "I used to play an online game called Neopets, which had some HTML capabilities. From there, I got really interested and continued to learn more." But, despite Aderinokun's enthusiasm, her interest was not always encouraged. "It's definitely not what society expected of me. I studied psychology for my undergraduate and law for my master's. When I said I wanted to pursue this, there were many people who told me not to."

Rukayat Sadiq, a software engineer and a technical team leader at Andela, also faced opposition. She chose to study electrical engineering, a subject in which a class of 150 students might include only 15 women, to the surprise of friends and family, who had expected her to become a doctor.

While women entering and participating equally in the labour market is commonplace in Nigeria, computing and engineering are still industries dominated heavily by men. But many women who work in the tech industry are keen to offer support to those coming up. Aderinokun, for example, is funding full scholarships to five women for online programming nanodegrees. These qualifications do not guarantee employment, but they give those who have earned them a distinct advantage in the workplace and are endorsed by top employers, including Google, AT&T and Amazon. Sadiq also spends time teaching and mentoring newbies.

"Removing the stigma and assumption that tech is only supposed to be for men is necessary, and I think we need to start from as early in children's lives as possible," says Aderinokun. "We should work towards eliminating negative statements and mindsets that perpetuate the myth that women can't be involved in Stem."

"It is hopeful that we will one day get to a point where tech-related fields are level playing grounds for both sexes."

It is a challenge that continues around the globe, but it is one Nigeria is well equipped to handle.

Read more: https://www.theguardian.com/lifeandstyle/2017/aug/14/breaking-the-code-how-women-in-nigeria-are-changing-the-face-of-tech


Collection of letters by codebreaker Alan Turing found in filing cabinet

The correspondence, dating from 1949 to 1954, was found by an academic in a storeroom at the University of Manchester

A lost collection of nearly 150 letters from the codebreaker Alan Turing has been uncovered in an old filing cabinet at the University of Manchester.

The correspondence, which has not seen the light of day for at least 30 years, contains very little about Turing's tortured personal life. It does, however, give an intriguing insight into his views on America.

In response to an invitation to speak at a conference in the US in April 1953, Turing replied that he would rather not attend: "I would not like the journey, and I detest America."

The letter, sent to Donald Mackay, a physicist at King's College London, does not give any further explanation for Turing's forthright views on America, nor do these views feature in any of the other 147 letters discovered earlier this year.

The correspondence, dating from early 1949 to Turing's death in 1954, was found by chance when an academic cleared out an old filing cabinet in a storeroom at the University of Manchester. Turing was deputy director of the university's computing laboratory from 1948, after his heroic wartime codebreaking at Bletchley Park.

Turing was a visionary mathematician and is regarded today as the father of modern computing, the man who broke the Nazis' second world war Enigma code. While his later life was overshadowed by his conviction for gross indecency and his death aged 41 from cyanide poisoning, a posthumous pardon was granted by the Queen in 2013. His life was featured in the 2014 film The Imitation Game.

Prof Jim Miles, of the university's school of computer science, said he was amazed to stumble upon the documents, contained in an ordinary-looking red paper file with "Alan Turing" scrawled on it.

"When I first found it I initially thought: 'That can't be what I think it is,' but a quick inspection showed it was a file of old letters and correspondence by Alan Turing," he said.

"I was astonished such a thing had remained hidden out of sight for so long. No one who now works in the school or at the university knew they even existed. It really was an exciting find and it is a mystery as to why they had been filed away."

The collection focuses mainly on Turing's academic research, including his work on groundbreaking areas in AI, computing and mathematics, and invitations to lecture at some of America's best-known universities, including the Massachusetts Institute of Technology.

It contains a single letter from GCHQ, for whom Turing worked during the war, asking the mathematician in 1952 if he could supply a photograph of himself for an official history of Bletchley Park that was being compiled by the American cryptographer William Friedman. In his reply to Eric Jones, GCHQ's then director, Turing said he would send a picture for the American "rogues gallery".

The collection also contains a handwritten draft of a BBC radio programme on artificial intelligence, titled "Can machines think?", from July 1951. The documents were sorted, catalogued and stored by the University of Manchester archivist James Peters and are now available to search online.

Peters said: "This is a truly unique find. Archive material relating to Turing is extremely scarce, so having some of his academic correspondence is a welcome and important addition to our collection."

"There is very little in the way of personal correspondence, and no letters from Turing family members. But this still gives us an extremely interesting account and insight into his working practices and academic life whilst he was at the University of Manchester."

He added: "The letters mostly confirm what is already known about Turing's work at Manchester, but they do add an extra dimension to our understanding of the man himself and his research."

"As there is so little actual archive on this period of his life, this is a very important find in that context. There really is nothing else like it."

Read more: https://www.theguardian.com/science/2017/aug/27/collection-letters-codebreaker-alan-turing-found-filing-cabinet


Your animal life is over. Machine life has begun. The road to immortality

In California, radical scientists and billionaire backers think the technology to extend life by uploading minds to exist separately from the body is only a few years away

Here's what happens. You are lying on an operating table, fully conscious, but rendered otherwise insensible, otherwise incapable of movement. A humanoid machine appears at your side, bowing to its task with ceremonial formality. With a brisk sequence of motions, the machine removes a large panel of bone from the rear of your cranium, before carefully laying its fingers, fine and delicate as a spider's legs, on the viscid surface of your brain. You may be experiencing some misgivings about the procedure at this point. Put them aside, if you can.

You're in pretty deep with this thing; there's no backing out now. With their high-resolution microscopic receptors, the machine fingers scan the chemical structure of your brain, transferring the data to a powerful computer on the other side of the operating table. They are sinking further into your cerebral matter now, these fingers, scanning deeper and deeper layers of neurons, building a three-dimensional map of their endlessly complex interrelations, all the while creating code to model this activity in the computer's hardware. As the work proceeds, another mechanical appendage, less delicate, less careful, removes the scanned material to a biological waste container for later disposal. This is material you will no longer be needing.

At some point, you become aware that you are no longer present in your body. You observe, with sadness, or horror, or detached curiosity, the diminishing spasms of that body on the operating table, the last useless convulsions of a discontinued meat.

The animal life is over now. The machine life has begun.

This, more or less, is the scenario outlined by Hans Moravec, a professor of cognitive robotics at Carnegie Mellon, in his 1988 book Mind Children: The Future of Robot and Human Intelligence. It is Moravec's conviction that the future of the human species will involve a mass-scale desertion of our biological bodies, effected by procedures of this kind. It's a belief shared by many transhumanists, a movement whose aim is to improve our bodies and minds to the point where we become something other and better than the animals we are. Ray Kurzweil, for one, is a prominent advocate of the idea of mind-uploading. "An emulation of the human brain running on an electronic system," he writes in The Singularity Is Near, "would run much faster than our biological brains. Although human brains benefit from massive parallelism (on the order of 100 trillion interneuronal connections, all potentially operating simultaneously), the reset time of the connections is extremely slow compared to contemporary electronics." The technologies required for such an emulation (sufficiently powerful and capacious computers and sufficiently advanced brain-scanning techniques) will be available, he announces, by the early 2030s.

And this, obviously, is no small claim. We are talking about not just radically extended life spans, but also radically expanded cognitive abilities. We are talking about endless copies and iterations of the self. Having undergone a procedure like this, you would exist, to the extent you could meaningfully be said to exist at all, as an entity of unbounded possibilities.

I was introduced to Randal Koene at a Bay Area transhumanist conference. He wasn't speaking at the conference, but had come along out of personal interest. A cheerfully reserved man in his early 40s, he spoke in the punctilious staccato of a non-native English speaker who had long mastered the language. As we parted, he handed me his business card and, much later that evening, I removed it from my wallet and had a proper look at it. The card was illustrated with a picture of a laptop, on whose screen was displayed a stylised image of a brain. Underneath was printed what seemed to me an attractively mysterious message: "Carboncopies: Realistic Routes to Substrate Independent Minds. Randal A Koene, founder."

I took out my laptop and went to the website of Carboncopies, which I learned was a nonprofit organisation with a goal of "advancing the reverse engineering of neural tissue and complete brains, Whole Brain Emulation and development of neuroprostheses that reproduce functions of mind, creating what we call Substrate Independent Minds". This latter term, I read, was "the objective to be able to sustain person-specific functions of mind and experience in many different operational substrates besides the biological brain". And this, I further learned, was a process "analogous to that by which platform independent code can be compiled and run on many different computing platforms".

It seemed that I had met, without realising it, a person who was actively working toward the kind of brain-uploading scenario that Kurzweil had outlined in The Singularity Is Near. And this was a person I needed to get to know.

Randal
Randal Koene: "It wasn't like I was walking into labs, telling people I wanted to upload human minds to computers."

Koene was an affable and precisely eloquent man and his conversation was unusually engaging for someone so forbiddingly intelligent, working in so rarefied a field as computational neuroscience; in his company, I often found myself momentarily forgetting the nearly unthinkable implications of the work he was doing, the profound metaphysical weirdness of the things he was explaining to me. He'd be talking about some tangential topic (his happily cordial relationship with his ex-wife, say, or the cultural differences between European and American scientific communities) and I'd remember, with a slow, uncanny suffusion of unease, that his work, were it to yield the kind of results he is aiming for, would amount to the most significant event since the evolution of Homo sapiens. The odds seemed pretty long from where I was standing, but then again, I reminded myself, the history of science was in many ways an almanac of highly unlikely victories.

One evening in early spring, Koene drove down to San Francisco from the North Bay, where he lived and worked in a rented ranch house surrounded by rabbits, to meet me for dinner in a small Argentinian restaurant on Columbus Avenue. The faint trace of an accent turned out to be Dutch. Koene was born in Groningen and had spent most of his early childhood in Haarlem. His father was a particle physicist and there were frequent moves, including a two-year stint in Winnipeg, as he followed his work from one experimental nuclear facility to the next.

Now a boyish 43, he had lived in California only for the past five years, but had come to think of it as home, or the closest thing to home he'd encountered in the course of a nomadic life. And much of this had to do with the culture of techno-progressivism that had spread outward from its concentrated origins in Silicon Valley and come to encompass the entire Bay Area, with its historically high turnover of radical ideas. It had been a while now, he said, since he'd described his work to someone, only for them to react as though he were making a misjudged joke or simply to walk off mid-conversation.

In his early teens, Koene began to conceive of the major problem with the human brain in computational terms: it was not, like a computer, readable and rewritable. You couldn't get in there and enhance it, make it run more efficiently, like you could with lines of code. You couldn't just speed up a neuron like you could with a computer processor.

Around this time, he read Arthur C Clarke's The City and the Stars, a novel set a billion years from now, in which the enclosed city of Diaspar is ruled by a superintelligent Central Computer, which creates bodies for the city's posthuman citizens and stores their minds in its memory banks at the end of their lives, for purposes of reincarnation. Koene saw nothing in this idea of reducing human beings to data that seemed to him implausible and felt nothing in himself that prevented him from working to bring it about. His parents encouraged him in this peculiar interest and the scientific prospect of preserving human minds in hardware became a regular topic of dinnertime conversation.

Computational neuroscience, which drew its practitioners not from biology but from the fields of mathematics and physics, seemed to offer the most promising approach to the problem of mapping and uploading the mind. It wasnt until he began using the internet in the mid-1990s, though, that he discovered a loose community of people with an interest in the same area.

As a PhD student in computational neuroscience at Montreal's McGill University, Koene was initially cautious about revealing the underlying motivation for his studies, for fear of being taken for a fantasist or an eccentric.

"I didn't hide it, as such," he said, "but it wasn't like I was walking into labs, telling people I wanted to upload human minds to computers either. I'd work with people on some related area, like the encoding of memory, with a view to figuring out how that might fit into an overall road map for whole brain emulation."

Having worked for a while at Halcyon Molecular, a Silicon Valley gene-sequencing and nanotechnology startup funded by Peter Thiel, he decided to stay in the Bay Area and start his own nonprofit organisation aimed at advancing the cause to which he'd long been dedicated: Carboncopies.

Koene's decision was rooted in the very reason he began pursuing that work in the first place: an anxious awareness of the small and diminishing store of days that remained to him. If he'd gone the university route, he'd have had to devote most of his time, at least until securing tenure, to projects that were at best tangentially relevant to his central enterprise. The path he had chosen was a difficult one for a scientist and he lived and worked from one small infusion of private funding to the next.

But Silicon Valley's culture of radical techno-optimism had been its own sustaining force for him, and a source of financial backing for a project that took its place within the wildly aspirational ethic of that cultural context. There were people there or thereabouts, wealthy and influential, for whom a future in which human minds might be uploaded to computers was one to be actively sought, a problem to be solved, disruptively innovated, by the application of money.

Transcendence
Brainchild of the movies: in Transcendence (2014), scientist Will Caster, played by Johnny Depp, uploads his mind to a computer program with dangerous results.

One such person was Dmitry Itskov, a 36-year-old Russian tech multimillionaire and founder of the 2045 Initiative, an organisation whose stated aim was to create technologies "enabling the transfer of an individual's personality to a more advanced nonbiological carrier, and extending life, including to the point of immortality". One of Itskov's projects was the creation of "avatars": artificial humanoid bodies that would be controlled through brain-computer interfaces, technologies that would be complementary with uploaded minds. He had funded Koene's work with Carboncopies and in 2013 they organised a conference in New York called Global Futures 2045, aimed, according to its promotional blurb, at the discussion of "a new evolutionary strategy for humanity".

When we spoke, Koene was working with another tech entrepreneur named Bryan Johnson, who had sold his automated payment company to PayPal a couple of years back for $800m and who now controlled a venture capital concern called the OS Fund, which, I learned from its website, invests in entrepreneurs working towards "quantum leap discoveries that promise to rewrite the operating systems of life". This language struck me as strange and unsettling in a way that revealed something crucial about the attitude toward human experience that was spreading outward from its Bay Area centre: a cluster of software metaphors that had metastasised into a way of thinking about what it meant to be a human being.

And it was the same essential metaphor that lay at the heart of Koene's project: the mind as a piece of software, an application running on the platform of flesh. When he used the term "emulation", he was using it explicitly to evoke the sense in which a PC's operating system could be emulated on a Mac, as what he called "platform independent code".

The relevant science for whole brain emulation is, as you'd expect, hideously complicated, and its interpretation deeply ambiguous, but if I can risk a gross oversimplification here, I will say that it is possible to conceive of the idea as something like this: first, you scan the pertinent information in a person's brain (the neurons, the endlessly ramifying connections between them, the information-processing activity of which consciousness is seen as a byproduct) through whatever technology, or combination of technologies, becomes feasible first (nanobots, electron microscopy, etc). That scan then becomes a blueprint for the reconstruction of the subject brain's neural networks, which is then converted into a computational model. Finally, you emulate all of this on a third-party, non-flesh-based substrate: some kind of supercomputer or a humanoid machine designed to reproduce and extend the experience of embodiment; something, perhaps, like Natasha Vita-More's Primo Posthuman.
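The three stages described above (scan, model, emulate) can be caricatured in a few lines of code. To be clear, this is a toy illustration only: the function names, the random "connectome", and the threshold update rule are all invented for the sketch and bear no relation to real neuroscience or to any actual emulation software; it shows only the shape of the pipeline, a structural map becoming a model that then runs on a different substrate.

```python
# Toy sketch of the scan -> model -> emulate pipeline described in the text.
# Everything here (names, numbers, update rule) is illustrative, not real science.
import random

def scan_brain(n_neurons=5, seed=0):
    """Stage 1: 'scan' a brain into a structural map: neuron -> weighted links."""
    rng = random.Random(seed)
    return {i: {j: rng.random() for j in range(n_neurons) if j != i}
            for i in range(n_neurons)}

def build_model(connectome):
    """Stage 2: convert the scanned map into a computational model (a weight matrix)."""
    n = len(connectome)
    return [[connectome[i].get(j, 0.0) for j in range(n)] for i in range(n)]

def emulate(weights, state, steps=3):
    """Stage 3: run the model on a new substrate; each step is a thresholded weighted sum."""
    for _ in range(steps):
        state = [1 if sum(w * s for w, s in zip(row, state)) > 0.5 else 0
                 for row in weights]
    return state

connectome = scan_brain()          # the "scan"
weights = build_model(connectome)  # the blueprint becomes a model
final = emulate(weights, [1, 0, 1, 0, 1])  # the model runs on another substrate
```

The point of the caricature is that once the structure is captured as data, the original substrate plays no further role, which is precisely the "substrate independence" Koene describes below.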

The whole point of substrate independence, as Koene pointed out to me whenever I asked him what it would be like to exist outside of a human body (and I asked him many times, in various ways), was that it would be like no one thing, because there would be no one substrate, no one medium of being. This was the concept transhumanists referred to as "morphological freedom": the liberty to take any bodily form technology permits.

"You can be anything you like," as an article about uploading in Extropy magazine put it in the mid-90s. "You can be big or small; you can be lighter than air and fly; you can teleport and walk through walls. You can be a lion or an antelope, a frog or a fly, a tree, a pool, the coat of paint on a ceiling."

What really interested me about this idea was not how strange and far-fetched it seemed (though it ticked those boxes resolutely enough), but rather how fundamentally identifiable it was, how universal. When talking to Koene, I was mostly trying to get to grips with the feasibility of the project and with what it was he envisioned as a desirable outcome. But then we would part company (I would hang up the call, or I would take my leave and start walking toward the nearest station) and I would find myself feeling strangely affected by the whole project, strangely moved.

Because there was something, in the end, paradoxically and definitively human in this desire for liberation from human form. I found myself thinking often of WB Yeats's Sailing to Byzantium, in which the ageing poet writes of his burning to be free of the weakening body, the sickening heart, to abandon the dying animal for the manmade and immortal form of a mechanical bird. "Once out of nature," he writes, "I shall never take/ My bodily form from any natural thing/ But such a form as Grecian goldsmiths make."

One evening, we were sitting outside a combination bar/laundromat/standup comedy venue on Folsom Street (a place with the fortuitous name of BrainWash) when I confessed that the idea of having my mind uploaded to some technological substrate was deeply unappealing to me, horrifying even. The effects of technology on my life, even now, were something about which I was profoundly ambivalent; for all I had gained in convenience and connectedness, I was increasingly aware of the extent to which my movements in the world were mediated and circumscribed by corporations whose only real interest was in reducing the lives of human beings to data, as a means to further reducing us to profit.

The content we consumed, the people with whom we had romantic encounters, the news we read about the outside world: all these movements were coming increasingly under the influence of unseen algorithms, the creations of these corporations, whose complicity with government, moreover, had come to seem like the great submerged narrative of our time. Given the world we were living in, where the fragile liberal ideal of the autonomous self was already receding like a half-remembered dream into the doubtful haze of history, wouldn't a radical fusion of ourselves with technology amount, in the end, to a final capitulation of the very idea of personhood?

Koene nodded again and took a sip of his beer.

"Hearing you say that," he said, "makes it clear that there's a major hurdle there for people. I'm more comfortable than you are with the idea, but that's because I've been exposed to it for so long that I've just got used to it."

Dmitry
Russian billionaire Dmitry Itskov wants to create technologies enabling "the transfer of an individual's personality to a more advanced nonbiological carrier". Photograph: Mary Altaffer/AP

In the weeks and months after I returned from San Francisco, I thought obsessively about the idea of whole brain emulation. One morning, I was at home in Dublin, suffering from both a head cold and a hangover. I lay there, idly considering hauling myself out of bed to join my wife and my son, who were in his bedroom next door enjoying a raucous game of Buckaroo. I realised that these conditions (head cold, hangover) had imposed upon me a regime of mild bodily estrangement. As often happens when I'm feeling under the weather, I had a sense of myself as an irreducibly biological thing, an assemblage of flesh and blood and gristle. I felt myself to be an organism with blocked nasal passages, a bacteria-ravaged throat, a sorrowful ache deep within its skull, its cephalon. I was aware of my substrate, in short, because my substrate felt like shit.

And I was gripped by a sudden curiosity as to what, precisely, that substrate consisted of, as to what I myself happened, technically speaking, to be. I reached across for the phone on my nightstand and entered into Google the words "What is the human…" The first three autocomplete suggestions offered "What is The Human Centipede about", and then: "What is the human body made of", and then: "What is the human condition".

It was the second question I wanted answered at this particular time, as perhaps a back door into the third. It turned out that I was 65% oxygen, which is to say that I was mostly air, mostly nothing. After that, I was composed of diminishing quantities of carbon and hydrogen, of calcium and sulphur and chlorine, and so on down the elemental table. I was mildly surprised to learn that, like the iPhone I was extracting this information from, I also contained trace elements of copper and iron and silicon.

"What a piece of work is a man," I thought, "what a quintessence of dust."

Some minutes later, my wife entered the bedroom on her hands and knees, our son on her back, gripping the collar of her shirt tight in his little fists. She was making clip-clop noises as she crawled forward; he was laughing giddily and shouting: "Don't buck! Don't buck!"

With a loud neighing sound, she arched her back and sent him tumbling gently into a row of shoes by the wall and he screamed in delighted outrage, before climbing up again. None of this, I felt, could be rendered in code. None of this, I felt, could be run on any other substrate. Their beauty was bodily, in the most profound sense, in the saddest and most wonderful sense.

I never loved my wife and our little boy more, I realised, than when I thought of them as mammals. I dragged myself, my animal body, out of bed to join them.

To Be a Machine by Mark O'Connell is published by Granta (£12.99). To order a copy for £11.04 go to bookshop.theguardian.com or call 0330 333 6846. Free UK p&p over £10, online orders only. Phone orders min p&p of £1.99

Read more: https://www.theguardian.com/science/2017/mar/25/animal-life-is-over-machine-life-has-begun-road-to-immortality


Your animal life is over. Machine life has begun. The road to immortality

In California, radical scientists and billionaire backers think the technology to extend life by uploading minds to exist separately from the body is only a few years away

Heres what happens. You are lying on an operating table, fully conscious, but rendered otherwise insensible, otherwise incapable of movement. A humanoid machine appears at your side, bowing to its task with ceremonial formality. With a brisk sequence of motions, the machine removes a large panel of bone from the rear of your cranium, before carefully laying its fingers, fine and delicate as a spiders legs, on the viscid surface of your brain. You may be experiencing some misgivings about the procedure at this point. Put them aside, if you can.

Youre in pretty deep with this thing; theres no backing out now. With their high-resolution microscopic receptors, the machine fingers scan the chemical structure of your brain, transferring the data to a powerful computer on the other side of the operating table. They are sinking further into your cerebral matter now, these fingers, scanning deeper and deeper layers of neurons, building a three-dimensional map of their endlessly complex interrelations, all the while creating code to model this activity in the computers hardware. As thework proceeds, another mechanical appendage less delicate, less careful removes the scanned material to a biological waste container for later disposal. This is material you will no longer be needing.

At some point, you become aware that you are no longer present in your body. You observe with sadness, or horror, or detached curiosity the diminishing spasms of that body on the operating table, the last useless convulsions of a discontinued meat.

The animal life is over now. The machine life has begun.

This, more or less, is the scenario outlined by Hans Moravec, a professor of cognitive robotics at Carnegie Mellon, in his 1988 book Mind Children: The Future of Robot and Human Intelligence. It is Moravecs conviction that the future of the human species will involve a mass-scale desertion of our biological bodies, effected by procedures of this kind. Its a belief shared by many transhumanists, a movement whose aim is to improve our bodies and minds to the point where we become something other and better than the animals we are. Ray Kurzweil, for one, is a prominent advocate of the idea of mind-uploading. An emulation of the human brain running on an electronic system, he writes in The Singularity Is Near, would run much faster than our biological brains. Although human brains benefit from massive parallelism (on the order of 100 trillion interneuronal connections, all potentially operating simultaneously), the rest time of the connections is extremely slow compared to contemporary electronics. The technologies required for such an emulation sufficiently powerful and capacious computers and sufficiently advanced brainscanning techniques will be available, he announces, by the early 2030s.

And this, obviously, is no small claim. We are talking about not just radically extended life spans, but also radically expanded cognitive abilities. We are talking about endless copies and iterations of the self. Having undergone a procedure like this, you would exist to the extent you could meaningfully be said to exist at all as an entity of unbounded possibilities.

I was introduced to Randal Koene at a Bay Area transhumanist conference. He wasnt speaking at the conference, but had come along out of personal interest. A cheerfully reserved man in his early 40s, he spoke in the punctilious staccato of a non-native English speaker who had long mastered the language. As we parted, he handed me his business card and much later that evening Iremoved it from my wallet and had a proper look at it. The card was illustrated with a picture of a laptop, on whose screen was displayed a stylised image of a brain. Underneath was printed what seemed to me an attractively mysterious message: Carboncopies: Realistic Routes to Substrate Independent Minds. Randal A Koene, founder.

I took out my laptop and went to the website of Carboncopies, which I learned was a nonprofit organisation with a goal of advancing the reverse engineering of neural tissue and complete brains, Whole Brain Emulation and development of neuroprostheses that reproduce functions of mind, creating what we call Substrate Independent Minds. This latter term, I read, was the objective to be able to sustain person-specific functions of mind and experience in many different operational substrates besides the biological brain. And this, I further learned, was a process analogous to that by which platform independent code can be compiled and run on many different computing platforms.

It seemed that I had met, without realising it, a person who was actively working toward the kind of brain-uploading scenario that Kurzweil had outlined in The Singularity Is Near. And this was a person I needed to get to know.

Randal
Randal Koene: It wasnt like I was walking into labs, telling people I wanted to upload human minds to computers.

Koene was an affable and precisely eloquent man and his conversation was unusually engaging for someone so forbiddingly intelligent and who worked in so rarefied a field as computational neuroscience; so, in his company, I often found myself momentarily forgetting about the nearly unthinkable implications of the work he was doing, the profound metaphysical weirdness of the things he was explaining to me. Hed be talking about some tangential topic his happily cordial relationship with his ex-wife, say, or the cultural differences between European and American scientific communities and Id remember with a slow, uncanny suffusion of unease that his work, were it to yield the kind of results he is aiming for, would amount to the most significant event since the evolution of Homo sapiens. The odds seemed pretty long from where I was standing, but then again, I reminded myself, the history of science was in many ways an almanac of highly unlikely victories.

One evening in early spring, Koene drove down to San Francisco from the North Bay, where he lived and worked in a rented ranch house surrounded by rabbits, to meet me for dinner in a small Argentinian restaurant on Columbus Avenue. The faint trace of an accent turned out to be Dutch. Koene was born in Groningen and had spent most of his early childhood in Haarlem. His father was a particle physicist and there were frequent moves, including a two-year stint in Winnipeg, as he followed his work from one experimental nuclear facility to the next.

Now a boyish 43, he had lived in California only for the past five years, but had come to think of it as home, or the closest thing to home hed encountered in the course of a nomadic life. And much of this had to do with the culture of techno-progressivism that had spread outward from its concentrated origins in Silicon Valley and come to encompass the entire Bay Area, with its historically high turnover of radical ideas. It had been a while now, he said, since hed described his work to someone, only for them to react as though he were making a misjudged joke or simply to walk off mid-conversation.

In his early teens, Koene began to conceive of the major problem with the human brain in computational terms: it was not, like a computer, readable and rewritable. You couldnt get in there and enhance it, make it run more efficiently, like you could with lines of code. You couldnt just speed up a neuron like you could with a computer processor.

Around this time, he read Arthur C Clarkes The City and the Stars, a novel set a billion years from now, in which the enclosed city of Diaspar is ruled by a superintelligent Central Computer, which creates bodies for the citys posthuman citizens and stores their minds in its memory banks at the end of their lives, for purposes of reincarnation. Koene saw nothing in this idea of reducing human beings to data that seemed to him implausible and felt nothing in himself that prevented him from working to bring it about. His parents encouraged him in this peculiar interest and the scientific prospect of preserving human minds in hardware became a regular topic of dinnertime conversation.

Computational neuroscience, which drew its practitioners not from biology but from the fields of mathematics and physics, seemed to offer the most promising approach to the problem of mapping and uploading the mind. It wasnt until he began using the internet in the mid-1990s, though, that he discovered a loose community of people with an interest in the same area.

As a PhD student in computational neuroscience at Montreals McGill University, Koene was initially cautious about revealing the underlying motivation for his studies, for fear of being taken for a fantasist or an eccentric.

I didnt hide it, as such, he said, but it wasnt like I was walking into labs, telling people I wanted to upload human minds to computers either. Id work with people on some related area, like the encoding of memory, with a view to figuring out how that might fit into an overall road map for whole brain emulation.

Having worked for a while at Halcyon Molecular, a Silicon Valley gene-sequencing and nanotechnology startup funded by Peter Thiel, he decided to stay in the Bay Area and start his own nonprofit company aimed at advancing the cause to which hed long been dedicated: carboncopies

Koenes decision was rooted in the very reason he began pursuing that work in the first place: an anxious awareness of the small and diminishing store of days that remained to him. If hed gone the university route, hed have had to devote most of his time, at least until securing tenure, to projects that were at best tangentially relevant to his central enterprise. The path he had chosen was a difficult one for a scientist and he lived and worked from one small infusion of private funding to the next.

But Silicon Valleys culture of radical techno-optimism had been its own sustaining force for him, and a source of financial backing for a project that took its place within the wildly aspirational ethic of that cultural context. There were people there or thereabouts, wealthy and influential, for whom a future in which human minds might be uploaded to computers was one to be actively sought, a problem to be solved, disruptively innovated, by the application of money.

Transcendence
Brainchild of the movies: in Transcendence (2014), scientist Will Caster, played by Johnny Depp, uploads his mind to a computer program with dangerous results.

One such person was Dmitry Itskov, a 36-year-old Russian tech multimillionaire and founder of the 2045 Initiative, an organisationwhose stated aim was to create technologies enabling the transfer of an individuals personality to a more advanced nonbiological carrier, and extending life, including to the point of immortality. One of Itskovs projects was the creation of avatars artificial humanoid bodies that would be controlled through brain-computer interface, technologies that would be complementary with uploaded minds. He had funded Koenes work with Carboncopies and in 2013 they organised a conference in New York called Global Futures 2045, aimed, according to its promotional blurb, at the discussion of a new evolutionary strategy for humanity.

When we spoke, Koene was working with another tech entrepreneur named Bryan Johnson, who had sold his automated payment company to PayPal a couple of years back for $800m and who now controlled a venture capital concern called the OS Fund, which, I learned from its website, invests in entrepreneurs working towards quantum leap discoveries that promise to rewrite the operating systems of life. This language struck me as strange and unsettling in a way that revealed something crucial about the attitude toward human experience that was spreading outward from its Bay Area centre a cluster of software metaphors that had metastasised into a way of thinking about what it meant to be a human being.

And it was the sameessential metaphor that lay at the heart of Koenes project: the mind as a piece of software, an application running on the platform of flesh. When he used the term emulation, he was using it explicitly to evoke the sense in which a PCs operating system could be emulated on a Mac, as what he called platform independent code.

The relevant science for whole brain emulation is, as youd expect, hideously complicated, and its interpretation deeply ambiguous, but if I can risk a gross oversimplification here, I will say that it is possible to conceive of the idea as something like this: first, you scan the pertinent information in a persons brain the neurons, the endlessly ramifying connections between them, the information-processing activity of which consciousness is seen as a byproduct through whatever technology, or combination of technologies, becomes feasible first (nanobots, electron microscopy, etc). That scan then becomes a blueprint for the reconstruction of the subject brains neural networks, which is then converted into a computational model. Finally, you emulate all of this on a third-party non-flesh-based substrate: some kind of supercomputer or a humanoid machine designed to reproduce and extend the experience of embodiment something, perhaps, like Natasha Vita-Mores Primo Posthuman.

The whole point of substrate independence, as Koene pointed out to me whenever I asked him what it would be like to exist outside of a human body, and I asked him many times, in various ways was that it would be like no one thing, because there would be no one substrate, no one medium of being. This was the concept transhumanists referred to as morphological freedom the liberty to take any bodily form technology permits.

"You can be anything you like," as an article about uploading in Extropy magazine put it in the mid-90s. "You can be big or small; you can be lighter than air and fly; you can teleport and walk through walls. You can be a lion or an antelope, a frog or a fly, a tree, a pool, the coat of paint on a ceiling."

What really interested me about this idea was not how strange and far-fetched it seemed (though it ticked those boxes resolutely enough), but rather how fundamentally identifiable it was, how universal. When talking to Koene, I was mostly trying to get to grips with the feasibility of the project and with what it was he envisioned as a desirable outcome. But then we would part company (I would hang up the call, or I would take my leave and start walking toward the nearest station) and I would find myself feeling strangely affected by the whole project, strangely moved.

Because there was something, in the end, paradoxically and definitively human in this desire for liberation from human form. I found myself thinking often of WB Yeats's "Sailing to Byzantium", in which the ageing poet writes of his burning to be free of the weakening body, the sickening heart; to abandon the dying animal for the manmade and immortal form of a mechanical bird. "Once out of nature," he writes, "I shall never take/ My bodily form from any natural thing/ But such a form as Grecian goldsmiths make."

One evening, we were sitting outside a combination bar/laundromat/standup comedy venue on Folsom Street, a place with the fortuitous name of BrainWash, when I confessed that the idea of having my mind uploaded to some technological substrate was deeply unappealing to me, horrifying even. The effects of technology on my life, even now, were something about which I was profoundly ambivalent; for all I had gained in convenience and connectedness, I was increasingly aware of the extent to which my movements in the world were mediated and circumscribed by corporations whose only real interest was in reducing the lives of human beings to data, as a means to further reducing us to profit.

The content we consumed, the people with whom we had romantic encounters, the news we read about the outside world: all these movements were coming increasingly under the influence of unseen algorithms, the creations of these corporations, whose complicity with government, moreover, had come to seem like the great submerged narrative of our time. Given the world we were living in, where the fragile liberal ideal of the autonomous self was already receding like a half-remembered dream into the doubtful haze of history, wouldnt a radical fusion of ourselves with technology amount, in the end, to a final capitulation of the very idea of personhood?

Koene nodded again and took a sip of his beer.

"Hearing you say that," he said, "makes it clear that there's a major hurdle there for people. I'm more comfortable than you are with the idea, but that's because I've been exposed to it for so long that I've just got used to it."

Russian billionaire Dmitry Itskov wants to create technologies enabling "the transfer of an individual's personality to a more advanced nonbiological carrier". Photograph: Mary Altaffer/AP

In the weeks and months after I returned from San Francisco, I thought obsessively about the idea of whole brain emulation. One morning, I was at home in Dublin, suffering from both a head cold and a hangover. I lay there, idly considering hauling myself out of bed to join my wife and my son, who were in his bedroom next door enjoying a raucous game of Buckaroo. I realised that these conditions (head cold, hangover) had imposed upon me a regime of mild bodily estrangement. As often happens when I'm feeling under the weather, I had a sense of myself as an irreducibly biological thing, an assemblage of flesh and blood and gristle. I felt myself to be an organism with blocked nasal passages, a bacteria-ravaged throat, a sorrowful ache deep within its skull, its cephalon. I was aware of my substrate, in short, because my substrate felt like shit.

And I was gripped by a sudden curiosity as to what, precisely, that substrate consisted of, as to what I myself happened, technically speaking, to be. I reached across for the phone on my nightstand and entered into Google the words "What is the human…". The first three autocomplete suggestions offered "What is The Human Centipede about", and then: "What is the human body made of", and then: "What is the human condition".

It was the second question I wanted answered at this particular time, as perhaps a back door into the third. It turned out that I was 65% oxygen, which is to say that I was mostly air, mostly nothing. After that, I was composed of diminishing quantities of carbon and hydrogen, of calcium and sulphur and chlorine, and so on down the elemental table. I was also mildly surprised to learn that, like the iPhone I was extracting this information from, I also contained trace elements of copper and iron and silicon.
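The rough breakdown the author looked up can be tallied. Only the 65% oxygen figure appears in the text itself; the remaining mass fractions below are widely cited textbook approximations, included here to show that the list does account for essentially the whole body.

```python
# Approximate mass fractions of the human body by element.
# Only the 65% oxygen figure comes from the text; the rest are
# standard textbook values, rounded.
composition = {
    "oxygen": 0.65,
    "carbon": 0.18,
    "hydrogen": 0.10,
    "nitrogen": 0.03,
    "calcium": 0.015,
    "phosphorus": 0.011,
    "other trace elements": 0.014,  # sulphur, chlorine, copper, iron, silicon...
}

total = sum(composition.values())
print(f"accounted for: {total:.1%}")  # the fractions sum to ~100%
```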

"What a piece of work is a man," I thought, "what a quintessence of dust."

Some minutes later, my wife entered the bedroom on her hands and knees, our son on her back, gripping the collar of her shirt tight in his little fists. She was making clip-clop noises as she crawled forward; he was laughing giddily and shouting: "Don't buck! Don't buck!"

With a loud neighing sound, she arched her back and sent him tumbling gently into a row of shoes by the wall and he screamed in delighted outrage, before climbing up again. None of this, I felt, could be rendered in code. None of this, I felt, could be run on any other substrate. Their beauty was bodily, in the most profound sense, in the saddest and most wonderful sense.

I never loved my wife and our little boy more, I realised, than when I thought of them as mammals. I dragged myself, my animal body, out of bed to join them.

To Be a Machine by Mark O'Connell is published by Granta (£12.99). To order a copy for £11.04 go to bookshop.theguardian.com or call 0330 333 6846. Free UK p&p over £10, online orders only. Phone orders min p&p of £1.99.

Read more: https://www.theguardian.com/science/2017/mar/25/animal-life-is-over-machine-life-has-begun-road-to-immortality



How The Insights Of The Large Hadron Collider Are Being Made Open To Everyone

If you visit the Large Hadron Collider (LHC) exhibition, now at the Queensland Museum, you'll see the recreation of a moment when the scientist who saw the first results indicating the discovery of the Higgs boson laments that she can't yet tell anyone.

It's a transitory problem for her, lasting only as long as it takes for the result to be thoroughly cross-checked. But it illustrates a key concept in science: it's not enough to do it; it must be communicated.

That's what is behind one of the lesser-known initiatives of CERN (the European Organization for Nuclear Research): an ambitious plan to make all its research in particle physics available to everyone, through a big global collaboration inspired by the way scientists came together to make discoveries at the LHC.

This initiative is called SCOAP (the Sponsoring Consortium for Open Access Publishing in Particle Physics), and it is now about to enter its fourth year of operation. It's a worldwide collaboration of more than 3,000 libraries (including six in Australia), key funding agencies and research centres in 44 countries, together with three intergovernmental organisations.

It aims to make work previously only available to paying subscribers of academic journals freely and immediately available to everyone. In its first three years it has made more than 13,000 articles available.

Not only are these articles free for anyone to read, but because they are published under a Creative Commons attribution licence (CC-BY), they are also available for anyone to use in any way they wish, such as to illustrate a talk, pass on to a class of school children, or feed to an artificial intelligence program to extract information from. And these usage rights are enshrined forever.

Open science

The concept of sharing research is not new in physics. Open access to research is now a growing worldwide initiative, including in Australasia. CERN, which runs the LHC, is also where the world wide web was invented, in 1989, by the British computer scientist Tim Berners-Lee.

The main purpose of the web was to enable researchers contributing to CERN from all over the world to share documents, including scientific drafts, no matter what computer systems they were using.

Before the web, physicists had been sharing paper drafts by post for decades, so they were one of the first groups to really embrace the new online opportunities for sharing early research. Today, the preprint site arxiv.org hosts more than a million free article drafts covering physics, mathematics, astronomy and more.

But, with such a specialised field, do these open access papers really matter? The short answer is yes: downloads from journals participating in SCOAP have doubled.

With millions of open access articles now being downloaded across all specialities, there is enormous opportunity for new ideas and collaborations to spring from chance readership. This is an important trend: the concept of serendipity enabled by open access was explored in 2015 in an episode of ABC RN's Future Tense program.

Greater than the sum of the parts

There's also a bigger picture to SCOAP's open access model. Not long ago, the research literature was fragmented. Individual papers, and the connections between them, were only as good as the physical library, with its paper journals, that academics had access to.

Now we can do searches in much less time than we spend thinking of the search question, and the results we are presented with depend crucially on how easily available the findings themselves are. And availability is not just a function of whether an article is free, but of whether it is truly open, ie connected and reusable.

One concept is whether research is FAIR, or Findable, Accessible, Interoperable and Reusable. In short, can anyone find, read, use and reuse the work?

The principle is most advanced for data, but in Australia work is ongoing to apply it to all research outputs. This approach was also proposed at the G20 Science, Technology and Innovation Ministers' Meeting in November 2016. Research findings that are not FAIR can, effectively, be invisible. It's a huge waste of millions of taxpayer dollars to fund research that won't be seen.

There is an even bigger picture that research and research publications have to fit into: that of science in society.

Across the world we see politicians challenging accepted scientific norms. Is the fact that most academic research remains available only to those who can pay to see it contributing to an acceptance of such misinformed views?

If one role for science is to inform public debate, then restricting access to that science will necessarily hinder any informed public debate. Although no one suggests that most readers of news sites will regularly want to delve into the details of papers in high energy physics, open access papers are 47% more likely to end up being cited in Wikipedia, which is a source that many non-scientists do turn to.

Even worse, work that is not available openly now may not even be available in perpetuity, something that is being discussed by scientists in the USA.

So in the same way that CERN itself is an example of the power of international collaboration to ask some of the fundamental scientific questions of our time, SCOAP provides a way to ensure that the answers, whatever they are, are available to everyone, forever.

Virginia Barbour, Executive Director, Australasian Open Access Strategy Group, Queensland University of Technology

This article was originally published on The Conversation. Read the original article.

Read more: http://www.iflscience.com/physics/how-the-insights-of-the-large-hadron-collider-are-being-made-open-to-everyone/


How the Hitchhiker's Guide can make the world a better place | Marcus O'Dair

Douglas Adams's sci-fi classic has inspired real-life tech innovations. So what else could we rip from its pages to aid our ailing society?

The Mobile World Congress, which takes place annually in Barcelona, is usually dominated by smartphones. Grabbing headlines this year, however, is the Pilot earpiece and its promise to instantly translate languages: a real-life version of the Babel Fish from The Hitchhiker's Guide to the Galaxy.

It is not the first time that elements of science fiction from Douglas Adams's story have subsequently become science fact. The technology that allows the Hitchhiker's Guide to be operated simply by brushing it with one's fingers is now familiar from smartphones and tablets. The information the Guide stores, meanwhile, is user-generated and constantly updated: the approach adopted by Wikipedia. And the sub-etha telecommunications network? That's the internet, even if it doesn't yet extend across the entire Milky Way. Even the knife that toasts became a reality in 2015: it's called the FurzoToasto. So which of Douglas Adams's other inventions should scientists bring to life?

Crisis inducer

Though it resembles a wristwatch, this product carries out a very different function: it convinces the wearer that a crisis is imminent. The severity of the crisis can be preselected by the user, but it's always enough to get the adrenaline pumping. The ultimate cure for lethargy.

Joo Janta 200 Super-Chromatic Peril Sensitive Sunglasses

If the crisis is, on the other hand, all too real, these sunglasses offer a solution: at the first sign of danger, they turn opaque. OK, a relaxed attitude to danger might represent only a short-term solution but, for those few moments, ignorance is bliss. Could be useful in 2017.

Infinite Improbability Drive

The Infinite Improbability Drive, the key feature of the Heart of Gold spaceship, can carry out any conceivable action, providing that someone on board knows precisely how improbable that action is. It can, for instance, transform a pair of missiles into a sperm whale and a bowl of petunias, as well as facilitating interstellar travel. Just what we need in the Ministry of Defence.

Total Perspective Vortex

Though powered by a piece of fairy cake, this machine is far from innocuous: in fact, in the Hitchhiker's world, exposure to the Total Perspective Vortex is the ultimate form of torture, worse even than Vogon poetry. It achieves this by revealing to users their cosmic insignificance. Might be useful for reining in the egos of certain politicians.

Nutri-matic drinks dispenser

This vending machine won't issue a drink until it has analysed the user's taste buds, metabolism and brain. Collecting all this data is pointless, however, as the machine always ultimately dispenses the same thing: a shoddy cup of tea. A properly bespoke drinks dispenser, though, sounds appealing and, in the era of big data and artificial intelligence, it might not be too far off. Mine's a Pan Galactic Gargle Blaster.

Bistromathic Drive

Part of the appeal of Adams's story lies in its combination of sci-fi and the mundane: for all the planet-hopping, The Hitchhiker's Guide also fits neatly into a line of English comedy running from Fawlty Towers to Peep Show. The Bistromathic Drive harnesses the unfathomable mathematics of restaurants in order to power a spaceship of extraordinary powers. Next time you're trying to split a bill between a large number of diners, few of whom are paying in cash, imagine you could use those very same mathematical quirks to travel across interstellar distances.

The Restaurant at the End of the Universe

You might be getting a sense, by now, that Douglas Adams liked restaurants, but he never visited one 576 thousand million years in the future. His protagonists, however, enjoy the benefits of time travel, and so are able to visit Milliways, billed as the Restaurant at the End of the Universe. At Milliways, diners watch the whole of creation destroyed, night after night: apocalypse as background entertainment. There's no need to book (you can reserve a table retrospectively, when you return to your own time) and the meal is free, too: just deposit a single penny in your own era, and the compound interest will take care of even the most exorbitant bill. An instant solution to the cost-of-living crisis.

Point of view gun

As Stephen Fry, playing the Guide, tells us in the film version of The Hitchhiker's Guide to the Galaxy, the point of view gun does precisely what its name suggests: if you point it at someone and pull the trigger, he or she will instantly see things from your point of view. Instant empathy. Something the past 12 months have been sorely lacking.

Read more: https://www.theguardian.com/commentisfree/2017/mar/06/hitchhikers-guide-to-the-galaxy-technology-sci-fi-books


How statistics lost their power, and why we should fear what comes next | William Davies

The Long Read: The ability of statistics to accurately represent the world is declining. In its wake, a new age of big data controlled by private companies is taking over and putting democracy in peril

In theory, statistics should help settle arguments. They ought to provide stable reference points that everyone, no matter what their politics, can agree on. Yet in recent years, divergent levels of trust in statistics have become one of the key schisms that have opened up in western liberal democracies. Shortly before the November presidential election, a study in the US discovered that 68% of Trump supporters distrusted the economic data published by the federal government. In the UK, a research project by Cambridge University and YouGov looking at conspiracy theories discovered that 55% of the population believes that the government is hiding the truth about the number of immigrants living here.

Rather than defusing controversy and polarisation, it seems as if statistics are actually stoking them. Antipathy to statistics has become one of the hallmarks of the populist right, with statisticians and economists chief among the various "experts" that were ostensibly rejected by voters in 2016. Not only are statistics viewed by many as untrustworthy; there appears to be something almost insulting or arrogant about them. Reducing social and economic issues to numerical aggregates and averages seems to violate some people's sense of political decency.

Nowhere is this more vividly manifest than with immigration. The thinktank British Future has studied how best to win arguments in favour of immigration and multiculturalism. One of its main findings is that people often respond warmly to qualitative evidence, such as the stories of individual migrants and photographs of diverse communities. But statistics, especially regarding alleged benefits of migration to Britain's economy, elicit quite the opposite reaction. People assume that the numbers are manipulated and dislike the elitism of resorting to quantitative evidence. Presented with official estimates of how many immigrants are in the country illegally, a common response is to scoff. Far from increasing support for immigration, British Future found, pointing to its positive effect on GDP can actually make people more hostile to it. GDP itself has come to seem like a Trojan horse for an elitist liberal agenda. Sensing this, politicians have now largely abandoned discussing immigration in economic terms.

All of this presents a serious challenge for liberal democracy. Put bluntly, the British government (its officials, experts, advisers and many of its politicians) does believe that immigration is, on balance, good for the economy. The British government did believe that Brexit was the wrong choice. The problem is that the government is now engaged in self-censorship, for fear of provoking people further.

This is an unwelcome dilemma. Either the state continues to make claims that it believes to be valid and is accused by sceptics of propaganda, or else politicians and officials are confined to saying what feels plausible and intuitively true, but may ultimately be inaccurate. Either way, politics becomes mired in accusations of lies and cover-ups.

The declining authority of statistics, and the experts who analyse them, is at the heart of the crisis that has become known as "post-truth" politics. And in this uncertain new world, attitudes towards quantitative expertise have become increasingly divided. From one perspective, grounding politics in statistics is elitist, undemocratic and oblivious to people's emotional investments in their community and nation. It is just one more way that privileged people in London, Washington DC or Brussels seek to impose their worldview on everybody else. From the opposite perspective, statistics are quite the opposite of elitist. They enable journalists, citizens and politicians to discuss society as a whole, not on the basis of anecdote, sentiment or prejudice, but in ways that can be validated. The alternative to quantitative expertise is less likely to be democracy than an unleashing of tabloid editors and demagogues to provide their own "truth" of what is going on across society.

Is there a way out of this polarisation? Must we simply choose between a politics of facts and one of emotions, or is there another way of looking at this situation? One way is to view statistics through the lens of their history. We need to try to see them for what they are: neither unquestionable truths nor elite conspiracies, but rather tools designed to simplify the job of government, for better or worse. Viewed historically, we can see what a crucial role statistics have played in our understanding of nation states and their progress. This raises the alarming question of how, if at all, we will continue to have common ideas of society and collective progress, should statistics fall by the wayside.


In the second half of the 17th century, in the aftermath of prolonged and bloody conflicts, European rulers adopted an entirely new perspective on the task of government, focused upon demographic trends: an approach made possible by the birth of modern statistics. Since ancient times, censuses had been used to track population size, but these were costly and laborious to carry out and focused on citizens who were considered politically important (property-owning men), rather than society as a whole. Statistics offered something quite different, transforming the nature of politics in the process.

Statistics were designed to give an understanding of a population in its entirety, rather than simply to pinpoint strategically valuable sources of power and wealth. In the early days, this didn't always involve producing numbers. In Germany, for example (from where we get the term Statistik), the challenge was to map disparate customs, institutions and laws across an empire of hundreds of micro-states. What characterised this knowledge as statistical was its holistic nature: it aimed to produce a picture of the nation as a whole. Statistics would do for populations what cartography did for territory.

Equally significant was the inspiration of the natural sciences. Thanks to standardised measures and mathematical techniques, statistical knowledge could be presented as objective, in much the same way as astronomy. Pioneering English demographers such as William Petty and John Graunt adapted mathematical techniques to estimate population changes, for which they were hired by Oliver Cromwell and Charles II.

The emergence in the late 17th century of government advisers claiming scientific authority, rather than political or military acumen, represents the origins of the expert culture now so reviled by populists. These path-breaking individuals were neither pure scholars nor government officials, but hovered somewhere between the two. They were enthusiastic amateurs who offered a new way of thinking about populations that privileged aggregates and objective facts. Thanks to their mathematical prowess, they believed they could calculate what would otherwise require a vast census to discover.

There was initially only one client for this type of expertise, and the clue is in the word "statistics". Only centralised nation states had the capacity to collect data across large populations in a standardised fashion, and only states had any need for such data in the first place. Over the second half of the 18th century, European states began to collect more statistics of the sort that would appear familiar to us today. Casting an eye over national populations, states became focused upon a range of quantities: births, deaths, baptisms, marriages, harvests, imports, exports, price fluctuations. Things that would previously have been registered locally and variously at parish level became aggregated at a national level.

New techniques were developed to represent these indicators, which exploited both the vertical and horizontal dimensions of the page, laying out data in matrices and tables, just as merchants had done with the development of standardised book-keeping techniques in the late 15th century. Organising numbers into rows and columns offered a powerful new way of displaying the attributes of a given society. Large, complex issues could now be surveyed simply by scanning the data laid out geometrically across a single page.

These innovations carried extraordinary potential for governments. By simplifying diverse populations down to specific indicators, and displaying them in suitable tables, governments could circumvent the need to acquire broader detailed local and historical insight. Of course, viewed from a different perspective, this blindness to local cultural variability is precisely what makes statistics vulgar and potentially offensive. Regardless of whether a given nation had any common cultural identity, statisticians would assume some standard uniformity or, some might argue, impose that uniformity upon it.

Not every aspect of a given population can be captured by statistics. There is always an implicit choice in what is included and what is excluded, and this choice can become a political issue in its own right. The fact that GDP only captures the value of paid work, thereby excluding the work traditionally done by women in the domestic sphere, has made it a target of feminist critique since the 1960s. In France, it has been illegal to collect census data on ethnicity since 1978, on the basis that such data could be used for racist political purposes. (This has the side-effect of making systemic racism in the labour market much harder to quantify.)

Despite these criticisms, the aspiration to depict a society in its entirety, and to do so in an objective fashion, has meant that various progressive ideals have been attached to statistics. The image of statistics as a dispassionate science of society is only one part of the story. The other part is about how powerful political ideals became invested in these techniques: ideals of evidence-based policy, rationality, progress and nationhood grounded in facts, rather than in romanticised stories.


Since the high-point of the Enlightenment in the late 18th century, liberals and republicans have invested great hope that national measurement frameworks could produce a more rational politics, organised around demonstrable improvements in social and economic life. The great theorist of nationalism, Benedict Anderson, famously described nations as "imagined communities", but statistics offer the promise of anchoring this imagination in something tangible. Equally, they promise to reveal what historical path the nation is on: what kind of progress is occurring? How rapidly? For Enlightenment liberals, who saw nations as moving in a single historical direction, this question was crucial.

The potential of statistics to reveal the state of the nation was seized in post-revolutionary France. The Jacobin state set about imposing a whole new framework of national measurement and national data collection. The world's first official bureau of statistics was opened in Paris in 1800. Uniformity of data collection, overseen by a centralised cadre of highly educated experts, was an integral part of the ideal of a centrally governed republic, which sought to establish a unified, egalitarian society.

From the Enlightenment onwards, statistics played an increasingly important role in the public sphere, informing debate in the media, providing social movements with evidence they could use. Over time, the production and analysis of such data became less dominated by the state. Academic social scientists began to analyse data for their own purposes, often entirely unconnected to government policy goals. By the late 19th century, reformers such as Charles Booth in London and WEB Du Bois in Philadelphia were conducting their own surveys to understand urban poverty.

Illustration by Guardian Design

To recognise how statistics have been entangled in notions of national progress, consider the case of GDP. GDP is an estimate of the sum total of a nations consumer spending, government spending, investments and trade balance (exports minus imports), which is represented in a single number. This is fiendishly difficult to get right, and efforts to calculate this figure began, like so many mathematical techniques, as a matter of marginal, somewhat nerdish interest during the 1930s. It was only elevated to a matter of national political urgency by the second world war, when governments needed to know whether the national population was producing enough to keep up the war effort. In the decades that followed, this single indicator, though never without its critics, took on a hallowed political status, as the ultimate barometer of a governments competence. Whether GDP is rising or falling is now virtually a proxy for whether society is moving forwards or backwards.
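The expenditure definition given above reduces to a one-line formula: GDP = C + I + G + (X - M), where C is consumer spending, I investment, G government spending, and X and M exports and imports. A minimal sketch, with figures invented purely for illustration:

```python
# GDP by the expenditure approach, as defined in the paragraph above:
# consumer spending + investment + government spending + net exports.
# All figures are made up (billions of a notional currency).

def gdp(consumption: float, investment: float,
        government: float, exports: float, imports: float) -> float:
    return consumption + investment + government + (exports - imports)

total = gdp(consumption=1500.0, investment=400.0,
            government=500.0, exports=600.0, imports=650.0)
print(total)  # a single number standing in for a whole economy
```

Note how a trade deficit (imports exceeding exports) subtracts from the headline figure, which is one reason the single number is so politically charged.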

Or take the example of opinion polling, an early instance of statistical innovation occurring in the private sector. During the 1920s, statisticians developed methods for identifying a representative sample of survey respondents, so as to glean the attitudes of the public as a whole. This breakthrough, which was first seized upon by market researchers, soon led to the birth of opinion polling. This new industry immediately became the object of public and political fascination, as the media reported on what this new science told us about what women or Americans or manual labourers thought about the world.
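The statistical leverage behind representative sampling can be made concrete: for a simple random sample, the margin of error of a sample proportion depends on the sample size, not on the size of the population being polled. A standard textbook calculation, with poll numbers invented for illustration:

```python
import math

# 95% margin of error for a sample proportion p from a simple
# random sample of size n (z = 1.96 for a 95% interval).
def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Half-width of the 95% confidence interval for proportion p."""
    return z * math.sqrt(p * (1 - p) / n)

# A hypothetical poll of 1,000 respondents finding 52% support:
moe = margin_of_error(p=0.52, n=1000)
print(f"52% +/- {moe:.1%}")  # roughly +/- 3 points
```

This is why a poll of about a thousand people can speak, within a few percentage points, for a nation of millions, which is precisely the "expert calculation" the next paragraph says our sense of public opinion rests on.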

Nowadays, the flaws of polling are endlessly picked apart. But this is partly due to the tremendous hopes that have been invested in polling since its origins. It is only to the extent that we believe in mass democracy that we are so fascinated or concerned by what the public thinks. But for the most part it is thanks to statistics, and not to democratic institutions as such, that we can know what the public thinks about specific issues. We underestimate how much of our sense of the public interest is rooted in expert calculation, as opposed to democratic institutions.

As indicators of health, prosperity, equality, opinion and quality of life have come to tell us who we are collectively and whether things are getting better or worse, politicians have leaned heavily on statistics to buttress their authority. Often, they lean too heavily, stretching evidence too far, interpreting data too loosely, to serve their cause. But that is an inevitable hazard of the prevalence of numbers in public life, and need not necessarily trigger the type of wholehearted rejections of expertise that we have witnessed recently.

In many ways, the contemporary populist attack on experts is born out of the same resentment as the attack on elected representatives. In talking of society as a whole, in seeking to govern the economy as a whole, both politicians and technocrats are believed to have lost touch with how it feels to be a single citizen in particular. Both statisticians and politicians have fallen into the trap of "seeing like a state", to use a phrase from the anarchist political thinker James C Scott. Speaking scientifically about the nation, for instance in terms of macroeconomics, is an insult to those who would prefer to rely on memory and narrative for their sense of nationhood, and are sick of being told that their imagined community does not exist.

On the other hand, statistics (together with elected representatives) performed an adequate job of supporting a credible public discourse for decades if not centuries. What changed?


The crisis of statistics is not quite as sudden as it might seem. For roughly 350 years, the great achievement of statisticians has been to reduce the complexity and fluidity of national populations into manageable, comprehensible facts and figures. Yet in recent decades, the world has changed dramatically, thanks to the cultural politics that emerged in the 1960s and the reshaping of the global economy that began soon after. It is not clear that the statisticians have always kept pace with these changes. Traditional forms of statistical classification and definition are coming under strain from more fluid identities, attitudes and economic pathways. Efforts to represent demographic, social and economic changes in terms of simple, well-recognised indicators are losing legitimacy.

Consider the changing political and economic geography of nation states over the past 40 years. The statistics that dominate political debate are largely national in character: poverty levels, unemployment, GDP, net migration. But the geography of capitalism has been pulling in somewhat different directions. Plainly globalisation has not rendered geography irrelevant. In many cases it has made the location of economic activity far more important, exacerbating the inequality between successful locations (such as London or San Francisco) and less successful locations (such as north-east England or the US rust belt). The key geographic units involved are no longer nation states. Rather, it is cities, regions or individual urban neighbourhoods that are rising and falling.

The Enlightenment ideal of the nation as a single community, bound together by a common measurement framework, is harder and harder to sustain. If you live in one of the towns in the Welsh valleys that was once dependent on steel manufacturing or mining for jobs, politicians talking of how the economy is doing well are likely to breed additional resentment. From that standpoint, the term GDP fails to capture anything meaningful or credible.

When macroeconomics is used to make a political argument, this implies that the losses in one part of the country are offset by gains somewhere else. Headline-grabbing national indicators, such as GDP and inflation, conceal all sorts of localised gains and losses that are less commonly discussed by national politicians. Immigration may be good for the economy overall, but this does not mean that there are no local costs at all. So when politicians use national indicators to make their case, they implicitly assume some spirit of patriotic mutual sacrifice on the part of voters: you might be the loser on this occasion, but next time you might be the beneficiary. But what if the tables are never turned? What if the same city or region wins over and over again, while others always lose? On what principle of give and take is that justified?

In Europe, the currency union has exacerbated this problem. The indicators that matter to the European Central Bank (ECB), for example, are those representing half a billion people. The ECB is concerned with the inflation or unemployment rate across the eurozone as if it were a single homogeneous territory, at the same time as the economic fate of European citizens is splintering in different directions, depending on which region, city or neighbourhood they happen to live in. Official knowledge becomes ever more abstracted from lived experience, until that knowledge simply ceases to be relevant or credible.

The privileging of the nation as the natural scale of analysis is one of the inbuilt biases of statistics that years of economic change have eaten away at. Another inbuilt bias that is coming under increasing strain is classification. Part of the job of statisticians is to classify people by putting them into a range of boxes that the statistician has created: employed or unemployed, married or unmarried, pro-Europe or anti-Europe. So long as people can be placed into categories in this way, it becomes possible to discern how far a given classification extends across the population.

This can involve somewhat reductive choices. To count as unemployed, for example, a person has to report to a survey that they are involuntarily out of work, even if it may be more complicated than that in reality. Many people move in and out of work all the time, for reasons that might have as much to do with health and family needs as labour market conditions. But thanks to this simplification, it becomes possible to identify the rate of unemployment across the population as a whole.
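The classification step described above can be sketched as follows (hypothetical survey responses and a deliberately simplified rule; the real ILO definition of unemployment is more detailed):

```python
# Toy labour-force survey: each respondent is placed into exactly one box.
responses = [
    {"working": True,  "seeking_work": False},
    {"working": False, "seeking_work": True},   # unemployed
    {"working": True,  "seeking_work": False},
    {"working": False, "seeking_work": False},  # inactive: not in labour force
    {"working": True,  "seeking_work": False},
]

employed = sum(r["working"] for r in responses)
unemployed = sum((not r["working"]) and r["seeking_work"] for r in responses)

# Unemployment rate: the unemployed as a share of the labour force
# (employed plus unemployed); the economically inactive drop out entirely.
rate = unemployed / (employed + unemployed)
print(f"{rate:.0%}")  # 25%
```

Note how the reduction works: anyone whose situation does not fit the two boxes, such as the underemployed or the involuntarily part-time, simply vanishes from the headline figure.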

Here's a problem, though. What if many of the defining questions of our age are not answerable in terms of the extent of people encompassed, but the intensity with which people are affected? Unemployment is one example. The fact that Britain got through the Great Recession of 2008-13 without unemployment rising substantially is generally viewed as a positive achievement. But the focus on unemployment masked the rise of underemployment, that is, people not getting a sufficient amount of work or being employed at a level below that which they are qualified for. This currently accounts for around 6% of the employed labour force. Then there is the rise of the self-employed workforce, where the divide between employed and involuntarily unemployed makes little sense.

This is not a criticism of bodies such as the Office for National Statistics (ONS), which does now produce data on underemployment. But so long as politicians continue to deflect criticism by pointing to the unemployment rate, the experiences of those struggling to get enough work or to live on their wages go unrepresented in public debate. It wouldn't be all that surprising if these same people became suspicious of policy experts and the use of statistics in political debate, given the mismatch between what politicians say about the labour market and the lived reality.

The rise of identity politics since the 1960s has put additional strain on such systems of classification. Statistical data is only credible if people will accept the limited range of demographic categories that are on offer, which are selected by the expert not the respondent. But where identity becomes a political issue, people demand to define themselves on their own terms, where gender, sexuality, race or class is concerned.

Opinion polling may be suffering for similar reasons. Polls have traditionally captured people's attitudes and preferences, on the reasonable assumption that people will behave accordingly. But in an age of declining political participation, it is not enough simply to know which box someone would prefer to put an X in. One also needs to know whether they feel strongly enough about doing so to bother. And when it comes to capturing such fluctuations in emotional intensity, polling is a clumsy tool.

Statistics have faced criticism regularly over their long history. The challenges that identity politics and globalisation present to them are not new either. Why then do the events of the past year feel quite so damaging to the ideal of quantitative expertise and its role in political debate?


In recent years, a new way of quantifying and visualising populations has emerged that potentially pushes statistics to the margins, ushering in a different era altogether. Statistics, collected and compiled by technical experts, are giving way to data that accumulates by default, as a consequence of sweeping digitisation. Traditionally, statisticians have known which questions they wanted to ask regarding which population, then set out to answer them. By contrast, data is automatically produced whenever we swipe a loyalty card, comment on Facebook or search for something on Google. As our cities, cars, homes and household objects become digitally connected, the amount of data we leave in our trail will grow even greater. In this new world, data is captured first and research questions come later.

In the long term, the implications of this will probably be as profound as the invention of statistics was in the late 17th century. The rise of big data provides far greater opportunities for quantitative analysis than any amount of polling or statistical modelling. But it is not just the quantity of data that is different. It represents an entirely different type of knowledge, accompanied by a new mode of expertise.

First, there is no fixed scale of analysis (such as the nation) nor any settled categories (such as "unemployed"). These vast new data sets can be mined in search of patterns, trends, correlations and emergent moods. It becomes a way of tracking the identities that people bestow upon themselves (such as "#ImwithCorbyn" or "entrepreneur") rather than imposing classifications upon them. This is a form of aggregation suitable to a more fluid political age, in which not everything can be reliably referred back to some Enlightenment ideal of the nation state as guardian of the public interest.

Second, the majority of us are entirely oblivious to what all this data says about us, either individually or collectively. There is no equivalent of an Office for National Statistics for commercially collected big data. We live in an age in which our feelings, identities and affiliations can be tracked and analysed with unprecedented speed and sensitivity but there is nothing that anchors this new capacity in the public interest or public debate. There are data analysts who work for Google and Facebook, but they are not experts of the sort who generate statistics and who are now so widely condemned. The anonymity and secrecy of the new analysts potentially makes them far more politically powerful than any social scientist.

A company such as Facebook has the capacity to carry out quantitative social science on hundreds of millions of people, at very low cost. But it has very little incentive to reveal the results. In 2014, when Facebook researchers published results of a study of "emotional contagion" that they had carried out on their users (in which they altered news feeds to see how it affected the content that users then shared in response) there was an outcry that people were being unwittingly experimented on. So, from Facebook's point of view, why go to all the hassle of publishing? Why not just do the study and keep quiet?


What is most politically significant about this shift from a logic of statistics to one of data is how comfortably it sits with the rise of populism. Populist leaders can heap scorn upon traditional experts, such as economists and pollsters, while trusting in a different form of numerical analysis altogether. Such politicians rely on a new, less visible elite, who seek out patterns from vast data banks, but rarely make any public pronouncements, let alone publish any evidence. These data analysts are often physicists or mathematicians, whose skills are not developed for the study of society at all. This, for example, is the worldview propagated by Dominic Cummings, former adviser to Michael Gove and campaign director of Vote Leave. "Physics, mathematics and computer science are domains in which there are real experts, unlike macro-economic forecasting," Cummings has argued.

Figures close to Donald Trump, such as his chief strategist Steve Bannon and the Silicon Valley billionaire Peter Thiel, are closely acquainted with cutting-edge data analytics techniques, via companies such as Cambridge Analytica, on whose board Bannon sits. During the presidential election campaign, Cambridge Analytica drew on various data sources to develop psychological profiles of millions of Americans, which it then used to help Trump target voters with tailored messaging.

This ability to develop and refine psychological insights across large populations is one of the most innovative and controversial features of the new data analysis. As techniques of sentiment analysis, which detect the mood of large numbers of people by tracking indicators such as word usage on social media, become incorporated into political campaigns, the emotional allure of figures such as Trump will become amenable to scientific scrutiny. In a world where the political feelings of the general public are becoming this traceable, who needs pollsters?
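A minimal, lexicon-based version of the sentiment analysis described above might look like this (the word lists are tiny and invented for illustration; real systems use far larger lexicons or trained models):

```python
# Crude mood-tracking: score text by counting words from small
# positive and negative lexicons (hypothetical word lists).
POSITIVE = {"great", "win", "strong", "hope"}
NEGATIVE = {"bad", "lose", "weak", "fear"}

def sentiment_score(text):
    """Return (#positive - #negative) words, a rough mood indicator."""
    words = text.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

posts = ["Great rally tonight, strong crowd",
         "Bad news, people fear the worst"]
print([sentiment_score(p) for p in posts])  # [2, -2]
```

Aggregated over millions of posts, even a scoring rule this crude begins to trace the collective mood in something close to real time, which is precisely what makes the technique attractive to campaigns.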

Few social findings arising from this kind of data analytics ever end up in the public domain. This means that it does very little to help anchor political narrative in any shared reality. With the authority of statistics waning, and nothing stepping into the public sphere to replace it, people can live in whatever imagined community they feel most aligned to and willing to believe in. Where statistics can be used to correct faulty claims about the economy or society or population, in an age of data analytics there are few mechanisms to prevent people from giving way to their instinctive reactions or emotional prejudices. On the contrary, companies such as Cambridge Analytica treat those feelings as things to be tracked.

But even if there were an Office for Data Analytics, acting on behalf of the public and government as the ONS does, it is not clear that it would offer the kind of neutral perspective that liberals today are struggling to defend. The new apparatus of number-crunching is well suited to detecting trends, sensing the mood and spotting things as they bubble up. It serves campaign managers and marketers very well. It is less well suited to making the kinds of unambiguous, objective, potentially consensus-forming claims about society that statisticians and economists are paid for.

In this new technical and political climate, it will fall to the new digital elite to identify the facts, projections and truth amid the rushing stream of data that results. Whether indicators such as GDP and unemployment continue to carry political clout remains to be seen, but if they don't, it won't necessarily herald the end of experts, less still the end of truth. The question to be taken more seriously, now that numbers are being constantly generated behind our backs and beyond our knowledge, is where the crisis of statistics leaves representative democracy.

On the one hand, it is worth recognising the capacity of long-standing political institutions to fight back. Just as sharing economy platforms such as Uber and Airbnb have recently been thwarted by legal rulings (Uber being compelled to recognise drivers as employees, Airbnb being banned altogether by some municipal authorities), privacy and human rights law represents a potential obstacle to the extension of data analytics. What is less clear is how the benefits of digital analytics might ever be offered to the public, in the way that many statistical data sets are. Bodies such as the Open Data Institute, co-founded by Tim Berners-Lee, campaign to make data publicly available, but have little leverage over the corporations where so much of our data now accumulates. Statistics began life as a tool through which the state could view society, but gradually developed into something that academics, civic reformers and businesses had a stake in. But for many data analytics firms, secrecy surrounding methods and sources of data is a competitive advantage that they will not give up voluntarily.

A post-statistical society is a potentially frightening proposition, not because it would lack any forms of truth or expertise altogether, but because it would drastically privatise them. Statistics are one of many pillars of liberalism, indeed of Enlightenment. The experts who produce and use them have become painted as arrogant and oblivious to the emotional and local dimensions of politics. No doubt there are ways in which data collection could be adapted to reflect lived experiences better. But the battle that will need to be waged in the long term is not between an elite-led politics of facts versus a populist politics of feeling. It is between those still committed to public knowledge and public argument and those who profit from the ongoing disintegration of those things.

Follow the Long Read on Twitter at @gdnlongread, or sign up to the long read weekly email here.

Read more: https://www.theguardian.com/politics/2017/jan/19/crisis-of-statistics-big-data-democracy


How The Insights Of The Large Hadron Collider Are Being Made Open To Everyone

If you visit the Large Hadron Collider (LHC) exhibition, now at the Queensland Museum, you'll see the recreation of a moment when the scientist who saw the first results indicating discovery of the Higgs boson laments she can't yet tell anyone.

It's a transitory problem for her, lasting as long as it takes for the result to be thoroughly cross-checked. But it illustrates a key concept in science: it's not enough to do it; it must be communicated.

That's what is behind one of the lesser known initiatives of CERN (European Organization for Nuclear Research): an ambitious plan to make all its research in particle physics available to everyone, with a big global collaboration inspired by the way scientists came together to make discoveries at the LHC.

This initiative is called SCOAP3, the Sponsoring Consortium for Open Access Publishing in Particle Physics, and is now about to enter its fourth year of operation. It's a worldwide collaboration of more than 3,000 libraries (including six in Australia), key funding agencies and research centres in 44 countries, together with three intergovernmental organisations.

It aims to make work previously only available to paying subscribers of academic journals freely and immediately available to everyone. In its first three years it has made more than 13,000 articles available.

Not only are these articles free for anyone to read, but because they are published under a Creative Commons attribution license (CC-BY), they are also available for anyone to use in any way they wish, such as to illustrate a talk, pass on to a class of school children, or feed to an artificial intelligence program to extract information from. And these usage rights are enshrined forever.

Open science

The concept of sharing research is not new in physics. Open access to research is now a growing worldwide initiative, including in Australasia. CERN, which runs the LHC, is also where the world wide web was invented in 1989 by the British computer scientist Tim Berners-Lee.

The main purpose of the web was to enable researchers contributing to CERN from all over the world to share documents, including scientific drafts, no matter what computer systems they were using.

Before the web, physicists had been sharing paper drafts by post for decades, so they were one of the first groups to really embrace the new online opportunities for sharing early research. Today, the pre-press site arxiv.org has more than a million free article drafts covering physics, mathematics, astronomy and more.

But, with such a specialised field, do these open access papers really matter? The short answer is yes. Downloads from journals participating in SCOAP3 have doubled.

With millions of open access articles now being downloaded across all specialities, there is enormous opportunity for new ideas and collaborations to spring from chance readership. This is an important trend: the concept of serendipity enabled by open access was explored in 2015 in an episode of ABC RN's Future Tense program.

Greater than the sum of the parts

There's also a bigger picture to SCOAP3's open access model. Not long ago, the research literature was fragmented. Individual papers and the connections between them were only as good as the physical library, with its paper journals, that academics had access to.

Now we can do searches in much less time than we spend thinking of the search question, and the results we are presented with are crucially dependent on how easily available the findings themselves are. And availability is not just a function of whether an article is free or not but whether it is truly open, i.e. connected and reusable.

One concept is whether research is FAIR, or Findable, Accessible, Interoperable and Reusable. In short, can anyone find, read, use and reuse the work?

The principle is most advanced for data, but in Australia work is ongoing to apply it to all research outputs. This approach was also proposed at the November 2016 G20 Science, Technology and Innovation Ministers Meeting. Research findings that are not FAIR can, effectively, be invisible. It's a huge waste of millions of taxpayer dollars to fund research that won't be seen.

There is an even bigger picture that research and research publications have to fit into: that of science in society.

Across the world we see politicians challenging accepted scientific norms. Is the fact that most academic research remains available only to those who can pay to see it contributing to an acceptance of such misinformed views?

If one role for science is to inform public debate, then restricting access to that science will necessarily hinder any informed public debate. Although no one suggests that most readers of news sites will regularly want to delve into the details of papers in high energy physics, open access papers are 47% more likely to end up being cited in Wikipedia, which is a source that many non-scientists do turn to.

Even worse, work that is not available openly now may not even be available in perpetuity, something that is being discussed by scientists in the USA.

So in the same way that CERN itself is an example of the power of international collaboration to ask some of the fundamental scientific questions of our time, SCOAP3 provides a way to ensure that the answers, whatever they are, are available to everyone, forever.

Virginia Barbour, Executive Director, Australasian Open Access Strategy Group, Queensland University of Technology

This article was originally published on The Conversation. Read the original article.

Read more: http://www.iflscience.com/physics/how-the-insights-of-the-large-hadron-collider-are-being-made-open-to-everyone/
