Willerslev's wild trashing of the logo - Bureaubiz

Bureaubiz

By Thomas Sehested, Managing Director

The new director of Nationalmuseet, the National Museum of Denmark, Rane Willerslev, does not care for the museum's logo: "It is quite simply boring as hell." Thomas Sehested, director at Stereo Associates, was on the team that originally developed the logo and the visual identity. Here is his response to the criticism.

The anthropologist Rane Willerslev has been director of Nationalmuseet since 1 July 2017, and with his usual energy and commitment he has thrown himself into making the collections relevant and engaging for this millennium's visitors. But the logo stands in the way of the new: it is "quite simply boring as hell", he is quoted as saying.

Is he right? Is he wrong? And if something needs to change in the conglomerate that is Nationalmuseet, does the change process really begin with a logo?

Rane Willerslev and his twin brother Eske Willerslev are everywhere. With TV series, books, radio appearances and a string of interviews, we have been given a new kind of role model when it comes to research and public outreach. It is delightful, it is fresh, because it is not often that deep expertise and a willingness to engage with the present help set the agenda in this way. So far, so good.

But has Rane Willerslev got off to a good start? Has he done his homework? Willerslev is not just a researcher and public debater; he is now also your and my director of the country's foremost cultural-history institution. He now speaks with all the authority that comes with the position, so he must be taken seriously.


Rane (right) and Eske (left) Willerslev are identical twins and elite researchers. Scanpix: Ditte Valente.

A new broom, a new era
We all know the story. A new boss also means a new broom that has to sweep the old dirt out of the corners. The new has to be legitimised. So too with Willerslev. The logo has to go, because it signals the public sector in the wrong way.

But does it really? And what meaning does he read into the mark? And if something is wrong with Nationalmuseet, is the logo really the right place to start? Shouldn't a new logo be precisely the outcome of the deliberations about a new Nationalmuseet?

What do the theories of change management say? Do you start with the exterior, that is, with the logo, or shouldn't the logo (and everything that follows from it in the way of visual guidelines) be precisely the result of a strategy and of further thinking about what the whole thing is supposed to achieve?

Integrity and antihero
Willerslev imagines that a new logo for Prinsens Palæ might, for example, feature Solvognen (the Sun Chariot) as its symbol. According to the article in Berlingske, he also sees Prinsens Palæ as a kind of rallying point for the organisation: it is the Death Star with Willerslev as Darth Vader. Yes, he actually said that.

I have always seen Willerslev more as Han Solo from the early Star Wars films (not the later ones, where he becomes a grumpy man). Han Solo was originally the kind of character who often said the right thing, though perhaps not always politely or correctly. An outsider hero who could afford to step a little out of line, because we knew his integrity was intact.

Perhaps Willerslev should practise being a bit more Master Yoda-like? Darth Vader sits poorly with his other statements, and Mr. Vader was hardly the figure the Minister of Culture had in mind when she hired Willerslev.

A signal to the organisation
Even if there are surely organisational consultants who could get a lot out of the Vader idea, is it a good signal to send to the employees of such a sprawling organisation?

And is it really so different and new from how things were before Willerslev took office, when the three-layered N logo was used across every part of the institution precisely as a kind of guarantee of, and marker for, quality, knowledge and perhaps experience?

The sloop Ruth and the others
Willerslev's thinking about Nationalmuseet's monolithic brand strategy, with the Death Star as an epicentre of Nationalmuseet, also contradicts what he says about giving the various units greater autonomy and independence from corporate HQ at Frederiksholms Kanal.

There is, after all, a big difference between, say, Hangar 46 in Værløse and the sloop Ruth, both of which are part of the organisation. That much is clear.

Mærsk or P&G
If we take Willerslev at his word, we are supposed to understand Nationalmuseet's brand both as a kind of Mærsk, which with the same logo, colour and typography can accommodate divisions such as Oil, Line and Drilling, and on the other hand as something comparable to Procter & Gamble, which owns products that are stronger than the parent brand.

Who knows, for instance, that Braun, Libresse and Lacoste are owned by none other than P&G? The design agency tasked with devising Nationalmuseet's new visual identity will have its work cut out.

Museum and logo
If we look a little beyond the national-museum context, the tendency for many years has been that the visual identity, including the logo, does not describe what is actually in the museum.

It may sound odd, but there is a reason why the British Museum does not use the Rosetta Stone as its symbol, why the Louvre does not flaunt the Mona Lisa as its icon, and why the Glyptotek uses neither mummies nor Impressionists as its visual marker.

Trends in museum branding
Large, wide-ranging museums are so much more than their individual objects. In fact, museums also stand for more than the sum of their collections. Museums have a position they need to communicate; they have a particular approach, history, method or angle that should permeate the thinking, and the design.

Beyond that, I doubt that Solvognen could carry the weight of being used as a unifying symbol. Does it work as the overarching mark for every exhibition, including those about the Vikings or the white buses?

Is Solvognen a good, unifying expression of the visions Willerslev has for communicating Denmark's culture and history?

Why not the hash stall from Christiania (very apropos of his appearance on Radio24syv, where a joint was smoked), or a Papuan penis gourd from the ethnographic collection, which helps make Prinsens Palæ special?

Forward, not backward
Thinking in objects as symbolism is backward-looking. And yes, it is true that there are museums that use objects as icons. Take, for example, the Museum of the Second World War in Gdansk, which uses stylised bombs, or the Imperial War Museum, whose logo evokes falling bombs and destruction. But these are museums with clearly delimited thematic collections.


The logo of the Museum of the Second World War in Gdansk.

Objects or history?
In a slightly broader perspective, the objects also obscure the opportunity to tell a larger story about Nationalmuseet, because it is something other and more than collections of things.

Visitors know perfectly well that there are things in a museum. What you want instead is for the public to make the effort to put the objects into play, into perspective and into context. The pleasure of Solvognen is rather limited if we do not hear about the culture and the people who created it: How did they live? What did they eat? What were they thinking about? How often did they have sex?

All of this is something I imagine Willerslev would actually sign up to, but his thinking about visual identity is plainly not in sync with it. I believe he wants one thing, but he is actually saying another.

What is good museum design?
Look, for example, at Statens Museum for Kunst and its logo. The underlying idea is that the frame does not symbolise a specific painting, but that SMK is a place that sets the frame. After all, SMK holds many other art forms and genres besides painting. The logo (and the frame) therefore says that SMK curates, selects, narrates and stages, when it comes to art.

 

SMK has set out to make us wiser about art. The logo tells us how the museum works and what I can expect as a guest.

Look also at M/S Museet for Søfart, the Danish maritime museum. A fine drawing of a schooner with some flags as garnish would have been an obvious choice for a logo if it had been made 20 years ago. But M/S also embraces the present and the future, and that is why the identity is built with inspiration from modern shipping.

Guests know very well that they will be told about the old days, but they may not know that the present has been invited in; it is, so to speak, also an industry museum, and a successful one at that, where design and architecture play well together.

Before, now and never
The idea behind Nationalmuseet's current logo is precisely to communicate what Nationalmuseet's role in society is, in a simple, even simplified, way.

When it was created in 2013, it was important to the management to bring order to the Nationalmuseet family: there had to be recognisability across the 20 units, but at the same time a great deal of local freedom, as long as the typography and the N logo were used.

The idea behind the N is that the three layers, in the narrowest sense, symbolise past, present and future. In that context, Nationalmuseet's noblest role is to work with the past for the benefit of the present, with an eye to the future. Put another way: Nationalmuseet must present the past to the present so that we can be wiser about the future. Nothing less.

Tomorrow becomes today
Of these three time perspectives, the past and the future are the only constants; one has already been, and the other will arrive at some point. That is how it will always be. It is the present that relentlessly moves on, and in that light it is a core premise of the disciplines that study the past that we always see what was through the eyes of the present: tomorrow, today will be yesterday.

What we in 2017 think is important to know about Stone Age people may well turn out to be something else in ten years, because our society has changed, or because the technology is different, or something else entirely. That is something his brother Eske Willerslev can speak to: his research has rewritten world history. We look at Aboriginal Australians differently today than we did ten years ago.

The competition is fierce
It is also a matter of Nationalmuseet holding its own against the other attractions the city tempts us with. The nearest competitors are of course other museums, but in a broader sense they are also Tivoli, the cinema, the canal tour and the sofa at home.

The clearer and stronger the story you can tell about Nationalmuseet, the better and clearer the signal about the museum's role and significance for guests and for society. That role is one that neither Solvognen nor the hash stall, whatever their historical and iconographic value, can carry.

The nightmare scenario
The worst thing that could happen now is that the design agency tasked with the new logo falls into the same trap as they did in Stockholm. There, Historiska Museet has a logo with a set of icons depicting objects from the collections.


Historiska Museet's logo.

That is an even worse idea than Willerslev's Solvognen. A set of icons merely tells us that the museum has, surprise, lots of objects. It signals cabinet of curiosities in the really outdated way. For the initiated, it is pure Museum Wormianum: a collection without aim or direction.

It’s the implementation, stupid
So why is Willerslev so cross with the logo? Tastes differ, of course, and that is perfectly fine. But if I am right in my assumption that the very idea behind the logo and identity is in line with Willerslev's own thinking about communication, what is it that bothers the director? Why is he cross?

I believe it is because Nationalmuseet as an organisation has been poor at using the energy and thinking behind the identity. Nobody, nobody, buys a litre of milk because there is an Arla logo on it, or invests in a Kia because of the badge on the grille. And nobody visits Nationalmuseet because of a new logo.

It is about what the brand, whether milk, car or museum, has on its mind, what it communicates, expresses and offers in the way of experiences. And this is where the people at the museum have failed to activate the full, combined visual identity, which consists of so much more than the logo.

Implementation, implementation, implementation
The visual identity has not been used communicatively, the way SMK, for example, has been so good at. The three Ns can be used exactly as wildly and strikingly as you like. They come leopard-spotted, in pink, in flames and what have you. But the point is that Nationalmuseet has never really managed to put that into practice.


e-Types is behind Nationalmuseet's logo. See the full e-Types logo design here.

The logo has been used as a rubber stamp rather than as an activation of the big story of why the state of Denmark has a national museum with the will and the ability to set the stage for the story of how Denmark became Denmark.

For example, three years on the domain is still natmus.dk, which is more for internal consumption than aimed at visitors. It should obviously be n.dk. Look, too, at the Facebook and Twitter profiles, which use, you guessed it, a cropped detail of Solvognen.

It is simply not true that the identity cannot be bursting with life, but of course that takes an effort. That the museum this year also failed to deliver the basic goods, an exhibition marking the centenary of Denmark's sale of its West Indian colonies, is, I think, a case in point, though admittedly one that belongs on a slightly different shelf.

Best wishes
So, dear Rane Willerslev. We would love to hear how you envisage the Nationalmuseet of the future. We love your energy, your wit and your appetite for getting involved. But if you want to change Nationalmuseet, and there is plenty to get to grips with, the logo is not where you should start.

You are no doubt getting good brand advice from the agency, and one thing they will certainly tell you is that all good design depends on good execution. They know that in boardrooms everywhere, so set aside more money for your communications department, so that the next director does not end up in the same situation as you. Solvognen or hash stall, new brooms or not.


2011 : WHAT SCIENTIFIC CONCEPT WOULD IMPROVE EVERYBODY'S COGNITIVE TOOLKIT?


Daniel Kahneman
Recipient, Nobel Prize in Economics, 2002; Eugene Higgins Professor of Psychology Emeritus, Princeton; Author, Thinking, Fast and Slow

Focusing Illusion

"Nothing In Life Is As Important As You Think It Is, While You Are Thinking About It"

Education is an important determinant of income — one of the most important — but it is less important than most people think. If everyone had the same education, the inequality of income would be reduced by less than 10%. When you focus on education you neglect the myriad other factors that determine income. The differences of income among people who have the same education are huge.

Income is an important determinant of people's satisfaction with their lives, but it is far less important than most people think. If everyone had the same income, the differences among people in life satisfaction would be reduced by less than 5%.

Income is even less important as a determinant of emotional happiness. Winning the lottery is a happy event, but the elation does not last. On average, individuals with high income are in a better mood than people with lower income, but the difference is about 1/3 as large as most people expect. When you think of rich and poor people, your thoughts are inevitably focused on circumstances in which their income is important. But happiness depends on other factors more than it depends on income.

Paraplegics are often unhappy, but they are not unhappy all the time because they spend most of the time experiencing and thinking about other things than their disability. When we think of what it is like to be a paraplegic, or blind, or a lottery winner, or a resident of California we focus on the distinctive aspects of each of these conditions. The mismatch in the allocation of attention between thinking about a life condition and actually living it is the cause of the focusing illusion.

Marketers exploit the focusing illusion. When people are induced to believe that they "must have" a good, they greatly exaggerate the difference that the good will make to the quality of their life. The focusing illusion is greater for some goods than for others, depending on the extent to which the goods attract continued attention over time. The focusing illusion is likely to be more significant for leather car seats than for books on tape.

Politicians are almost as good as marketers in causing people to exaggerate the importance of issues on which their attention is focused. People can be made to believe that school uniforms will significantly improve educational outcomes, or that health care reform will hugely change the quality of life in the United States — either for the better or for the worse. Health care reform will make a difference, but the difference will be smaller than it appears when you focus on it.

Richard Dawkins
Evolutionary Biologist; Emeritus Professor of the Public Understanding of Science, Oxford; Co-Author, with Yan Wong, The Ancestor's Tale (Second Edition); Author, The Selfish Gene; The God Delusion; An Appetite For Wonder

The Double-Blind Control Experiment

Not all concepts wielded by professional scientists would improve everybody's cognitive toolkit. We are here not looking for tools with which research scientists might benefit their science. We are looking for tools to help non-scientists understand science better, and equip them to make better judgments throughout their lives.

Why do half of all Americans believe in ghosts, three quarters believe in angels, a third believe in astrology, three quarters believe in Hell? Why do a quarter of all Americans believe that the President of the United States was born outside the country and is therefore ineligible to be President? Why do more than 40 percent of Americans think the universe began after the domestication of the dog?

Let's not give the defeatist answer and blame it all on stupidity. That's probably part of the story, but let's be optimistic and concentrate on something remediable: lack of training in how to think critically, and how to discount personal opinion, prejudice and anecdote, in favour of evidence. I believe that the double-blind control experiment does double duty. It is more than just an excellent research tool. It also has educational, didactic value in teaching people how to think critically. My thesis is that you needn't actually do double-blind control experiments in order to experience an improvement in your cognitive toolkit. You only need to understand the principle, grasp why it is necessary, and revel in its elegance.

If all schools taught their pupils how to do a double-blind control experiment, our cognitive toolkits would be improved in the following ways:

1. We would learn not to generalise from anecdotes.

2. We would learn how to assess the likelihood that an apparently important effect might have happened by chance alone (a short simulation sketch after this list illustrates the idea).

3. We would learn how extremely difficult it is to eliminate subjective bias, and that subjective bias does not imply dishonesty or venality of any kind. This lesson goes deeper. It has the salutary effect of undermining respect for authority, and respect for personal opinion.

4. We would learn not to be seduced by homeopaths and other quacks and charlatans, who would consequently be put out of business.

5. We would learn critical and sceptical habits of thought more generally, which not only would improve our cognitive toolkit but might save the world.
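
Point 2 lends itself to a concrete demonstration. Below is a minimal Python sketch (my own illustration with made-up numbers, not something from the essay) of a permutation test: it repeatedly shuffles the group labels and counts how often chance alone produces a difference in means at least as large as the one observed.

```python
# Hypothetical illustration (not from the essay): a permutation test.
# We ask how often random relabelling of the two groups produces a
# difference in means at least as large as the one we observed.
import random

def permutation_p_value(treated, control, n_permutations=10_000, seed=0):
    """Two-sided p-value for the observed difference in group means."""
    rng = random.Random(seed)
    observed = abs(sum(treated) / len(treated) - sum(control) / len(control))
    pooled = list(treated) + list(control)
    n_treated = len(treated)
    at_least_as_extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)                      # relabel the groups at random
        t, c = pooled[:n_treated], pooled[n_treated:]
        diff = abs(sum(t) / len(t) - sum(c) / len(c))
        if diff >= observed:
            at_least_as_extreme += 1
    return at_least_as_extreme / n_permutations

# Made-up scores for a "treated" and a "control" group.
print(permutation_p_value([7, 5, 6, 8, 6], [5, 6, 5, 7, 5]))
```

The returned proportion is the p-value: the larger it is, the weaker the grounds for believing the apparent effect is anything other than chance.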

Vilayanur Ramachandran
Neuroscientist; Professor & Director, Center for Brain and Cognition, UC, San Diego; Author, The Tell-Tale Brain

Chunks With Handles

Do you need language, including words, for sophisticated thinking, or does it merely facilitate thought? This question goes back to a debate between two Victorian scientists, Max Mueller and Francis Galton.

A word that has made it into the common vocabulary of both science and pop culture is "paradigm" (and the converse "anomaly"), having been introduced by the historian of science Thomas Kuhn. It is now widely used and misused, both in science and in other disciplines, almost to the point where the original meaning is starting to be diluted. (This often happens to "memes" of human language and culture, which don't enjoy the lawful, particulate transmission of genes.) The word "paradigm" is now often used inappropriately, especially in the US, to mean any experimental procedure, such as "the Stroop paradigm", "a reaction time paradigm" or "the fMR paradigm".

However, its appropriate use has shaped our culture in significant ways, even influencing the way scientists work and think. A more prevalent associated word is "skepticism", originating from the name of a Greek school of philosophy. This is used even more frequently and loosely than "anomaly" and "paradigm shift".

One can speak of reigning paradigms, what Kuhn calls normal science and what I cynically refer to as "a mutual admiration club trapped in a cul-de-sac of specialization". The club usually has its Pope(s), hierarchical priesthood, acolytes and a set of guiding assumptions and accepted norms that are zealously guarded almost with religious fervor. (They also fund each other and review each other's papers and grants and give each other awards.)

This isn't entirely useless; it's called "normal science", which grows by progressive accretion, employing the bricklayers rather than the architects of science. If a new experimental observation (e.g. bacterial transformation, or ulcers cured by antibiotics) threatens to topple the edifice, it's called an anomaly, and the typical reaction of those who practice normal science is to ignore it or brush it under the carpet, a form of psychological denial surprisingly common among my colleagues.

This is not an unhealthy reaction, since most anomalies turn out to be false alarms; the baseline probability of their survival as "real" anomalies is small, and whole careers have been wasted pursuing them (think "polywater" or "cold fusion"). Yet even such false anomalies serve the useful purpose of jolting scientists from their slumber by calling into question the basic axioms that drive their particular area of science. Conformist science feels cozy given the gregarious nature of humans, and anomalies force periodic reality checks, even if the anomaly turns out to be flawed.

More important, though, are genuine anomalies that emerge every now and then, legitimately challenging the status quo, forcing paradigm shifts and leading to scientific revolutions. Conversely, premature skepticism toward anomalies can lead to stagnation of science. One needs to be skeptical of anomalies but equally skeptical of the status quo if science is to progress.

I see an analogy between the process of science and of evolution by natural selection. For evolution, too, is characterized by periods of stasis (= normal science) punctuated by brief periods of accelerated change (= paradigm shifts) based on mutations (= anomalies) most of which are lethal (false theories) but some lead to the budding off of new species and phylogenetic trends (=paradigm shifts).

Since most anomalies are false alarms (spoon bending, telepathy, homeopathy) one can waste a lifetime pursuing them. So how does one decide which anomalies to invest in? Obviously one can do so by trial and error but that can be tedious and time consuming.

Let's take four well-known examples: (1) continental drift; (2) bacterial transformation; (3) cold fusion; (4) telepathy. All of these were anomalies when first discovered, because they didn't fit the big picture of normal science at that time. The evidence that all the continents broke off and drifted away from a giant super-continent was staring people in the face, as Wegener noted in the early 20th century. (The coastlines coincided almost perfectly; certain fossils found on the east coast of Brazil were exactly the same as the ones on the west coast of Africa, etc.) Yet it took fifty years for the idea to be accepted by the skeptics.

The second anomaly, observed a decade before DNA and the genetic code, was that if you incubate one species of bacterium (pneumococcus A) with another species in a test tube (pneumococcus B), then bacterium A becomes transformed into B! (Even the DNA-rich juice from B will suffice, which led Avery to suspect that heredity might have a chemical basis.) Others replicated this. It was almost like saying that you put a pig and a donkey into a room and two pigs emerge; yet the discovery was largely ignored for a dozen years, until Watson and Crick pointed out the mechanism of transformation. The fourth anomaly, telepathy, is almost certainly a false alarm.

You will see a general rule of thumb emerging here. Anomalies (1) and (2) were not ignored because of lack of empirical evidence. Even a school child can see the fit between continental coastlines or the similarity of fossils. Continental drift was ignored solely because it didn't fit the big picture, the notion of terra firma, a solid, immovable earth, and because there was no conceivable mechanism that would allow continents to drift (until plate tectonics was discovered). Likewise, (2) was repeatedly confirmed but ignored because it challenged the fundamental doctrine of biology, the stability of species. But notice that the fourth (telepathy) was rejected for two reasons: first, it didn't fit the big picture, and second, it was hard to replicate.

This gives us the recipe we are looking for: focus on anomalies that have survived repeated attempts to disprove them experimentally but are ignored by the establishment solely because you can't think of a mechanism. But don't waste time on ones that have not been empirically confirmed despite repeated attempts (or where the effect becomes smaller with each attempt, a red flag!).

"Paradigm" and "paradigm shift" have now migrated from science into pop culture (not always with good results) and I suspect many other words and phrases will follow suit — thereby enriching our intellectual and conceptual vocabulary and day-to-day thinking.

Indeed, words themselves are paradigms or stable "species" of sorts that evolve gradually with progressively accumulating penumbras of meaning, or sometimes mutate into new words to denote new concepts. These can then consolidate into chunks with "handles" (names) for juggling ideas around generating novel combinations. As a behavioral neurologist I am tempted to suggest that such crystallization of words and juggling them is unique to humans and it occurs in brain areas in and near the left TPO (temporal-parietal-occipital junction). But that's pure speculation.

Richard H. Thaler
Father of Behavioral Economics; Recipient, 2017 Nobel Memorial Prize in Economic Science; Director, Center for Decision Research, University of Chicago Graduate School of Business; Author, Misbehaving

Aether

I recently posted a question in this space asking people to name their favorite example of a wrong scientific belief. One of my favorite answers came from Clay Shirky. Here is an excerpt:

The existence of ether, the medium through which light (was thought to) travel. It was believed to be true by analogy — waves propagate through water, and sound waves propagate through air, so light must propagate through X, and the name of this particular X was ether.

It's also my favorite because it illustrates how hard it is to accumulate evidence for deciding something doesn't exist. Ether was both required by 19th century theories and undetectable by 19th century apparatus, so it accumulated a raft of negative characteristics: it was odorless, colorless, inert, and so on.

Several other entries (such as the "force of gravity") shared the primary function of ether: they were convenient fictions that were able to "explain" some otherwise ornery facts. Consider this quote from Max Pettenkofer, the German chemist and physician, disputing the role of bacteria as a cause of cholera: "Germs are of no account in cholera! The important thing is the disposition of the individual."

So in answer to the current question I am proposing that we now change the usage of the word Aether, using the old spelling, since there is no need for a term that refers to something that does not exist. Instead, I suggest we use that term to describe the role of any free parameter used in a similar way: that is, Aether is the thing that makes my theory work. Replace the word disposition with Aether in Pettenkofer's sentence above to see how it works.

Often Aetherists (theorists who rely on an Aether variable) think that their use of the Aether concept renders the theory untestable. This belief is often justified during their lifetimes, but then along come clever empiricists such as Michelson and Morley, and last year's tautology becomes this year's example of a wrong theory.

Aether variables are extremely common in my own field of economics. Utility is the thing you must be maximizing in order to render your choice rational.

Both risk and risk aversion are concepts that were once well defined, but are now in danger of becoming Aetherized. Stocks that earn surprisingly high returns are labeled as risky, because in the theory, excess returns must be accompanied by higher risk. If, inconveniently, the traditional measures of risk such as variance or covariance with the market are not high, then the Aetherists tell us there must be some other risk; we just don't know what it is.

Similarly, traditionally the concept of risk aversion was taken to be a primitive; each person had a parameter, gamma, that measured her degree of risk aversion. Now risk aversion is allowed to be time varying, and Aetherists can say with a straight face that the market crashes of 2001 and 2008 were caused by sudden increases in risk aversion. (Note the direction of the causation. Stocks fell because risk aversion spiked, not vice versa.)

So, the next time you are confronted with such a theory, I suggest substituting the word Aether for the offending concept. Personally, I am planning to refer to the time-varying variety of risk aversion as Aether aversion.

Brian Eno
Artist; Composer; Recording Producer: U2, Coldplay, Talking Heads, Paul Simon; Recording Artist

Ecology

That idea, or bundle of ideas, seems to me the most important revolution in general thinking in the last 150 years. It has given us a whole new sense of who we are, where we fit, and how things work. It has made commonplace and intuitive a type of perception that used to be the province of mystics — the sense of wholeness and interconnectedness.

Beginning with Copernicus, our picture of a semi-divine humankind perfectly located at the centre of The Universe began to falter: we discovered that we live on a small planet circling a medium sized star at the edge of an average galaxy. And then, following Darwin, we stopped being able to locate ourselves at the centre of life. Darwin gave us a matrix upon which we could locate life in all its forms: and the shocking news was that we weren't at the centre of that either — just another species in the innumerable panoply of species, inseparably woven into the whole fabric (and not an indispensable part of it either). We have been cut down to size, but at the same time we have discovered ourselves to be part of the most unimaginably vast and beautiful drama called Life.

Before ''ecology'' we understood the world in the metaphor of a pyramid: a hierarchy with God at the top, Man a close second and, sharply separated, a vast mass of life and matter beneath. In that model, information and intelligence flowed in one direction only, from the intelligent top to the ''base'' bottom, and, as masters of the universe, we felt no misgivings about exploiting the lower reaches of the pyramid.

The ecological vision has changed that: we now increasingly view life as a profoundly complex weblike system, with information running in all directions, and instead of a single hierarchy we see an infinity of nested-together and co-dependent hierarchies, and the complexity of all this is such as to be in and of itself creative. We no longer need the idea of a superior intelligence outside of the system; the dense field of intersecting intelligences is fertile enough to account for all the incredible beauty of ''creation''.

The ''ecological'' view isn't confined to the organic world. Along with it comes a new understanding of how intelligence itself comes into being. The classical picture saw Great Men with Great Ideas...but now we tend to think more in terms of fertile circumstances where uncountable numbers of minds contribute to a river of innovation. It doesn't mean we cease to admire the most conspicuous of these — but that we understand them as effects as much as causes. This has ramifications for the way we think about societal design, about crime and conflict, education, culture and science.

That in turn leads to a re-evaluation of the various actors in the human drama. When we realise that the cleaners and the bus drivers and the primary school teachers are as much a part of the story as the professors and the celebrities, we will start to accord them the respect they deserve.

J. Craig Venter
A leading scientist of the 21st century for Genomic Sciences; Co-Founder, Chairman, Synthetic Genomics, Inc.; Founder, J. Craig Venter Institute; Author, A Life Decoded

We Are Not Alone In The Universe

I cannot imagine any single discovery that would have more impact on humanity than the discovery of life outside of our solar system. There is a human-centric, Earth-centric view of life that permeates most cultural and societal thinking. Finding that there are multiple, perhaps millions of origins of life and that life is ubiquitous throughout the universe will profoundly affect every human.

We live on a microbial planet. There are a million microbial cells in every cubic centimeter of water in our oceans, lakes and rivers, and microbes thrive deep within the Earth's crust and throughout our atmosphere. We have more than 100 trillion microbes on and in each of us. The Earth's diversity of life would have seemed like science fiction to our ancestors. We have microbes that can withstand millions of rads of ionizing radiation, or acid and base strong enough to dissolve our skin; microbes that grow in ice; and microbes that grow and thrive at temperatures exceeding 100 degrees C. We have life that lives on carbon dioxide, on methane, on sulfur, or on sugar. We have sent trillions of bacteria into space over the last few billion years, and we have exchanged material with Mars on a constant basis, so it would be very surprising if we do not find evidence of microbial life in our solar system, particularly on Mars.

The recent discoveries by Dimitar Sasselov and colleagues of numerous Earth and super-Earth-like planets outside our solar system, including water worlds, greatly increases the probability of finding life. Sasselov estimates approximately 100,000 Earth and super-Earths within our own galaxy. The universe is young so wherever we find microbial life there will be intelligent life in the future.

Expanding our scientific reach further into the skies will change us forever.

Martin Rees
Former President, The Royal Society; Emeritus Professor of Cosmology & Astrophysics, University of Cambridge; Fellow, Trinity College; Author, From Here to Infinity

Deep Time And The Far Future

We need to extend our time-horizons. Especially, we need deeper and wider awareness that far more time lies ahead than has elapsed up till now.

Our present biosphere is the outcome of more than four billion years of evolution, and we can trace cosmic history right back to a "big bang" that happened about 13.7 billion years ago. The stupendous time-spans of the evolutionary past are now part of common culture and understanding, even though the concept may not yet have percolated to all parts of Kansas and Alaska.

But the immense time-horizons that stretch ahead — though familiar to every astronomer — haven't permeated our culture to the same extent. Our Sun is less than half way through its life. It formed 4.5 billion years ago, but it's got 6 billion more before the fuel runs out. It will then flare up, engulfing the inner planets and vaporising any life that might then remain on Earth. But even after the Sun's demise, the expanding universe will continue — perhaps for ever — destined to become ever colder, ever emptier. That, at least, is the best long range forecast that cosmologists can offer, though few would lay firm odds on what may happen beyond a few tens of billions of years.

Awareness of the "deep time" lying ahead is still not pervasive. Indeed, most people (and not only those for whom this view is enshrined in religious beliefs) envisage humans as in some sense the culmination of evolution. But no astronomer could believe this; on the contrary, it would be equally plausible to surmise that we are not even at the halfway stage. There is abundant time for posthuman evolution, here on Earth or far beyond, organic or inorganic, to give rise to far more diversity, and even greater qualitative changes, than those that have led from single-celled organisms to humans. Indeed, this conclusion is strengthened when we realise that future evolution will proceed not on the million-year timescale characteristic of Darwinian selection, but at the much accelerated rate allowed by genetic modification and the advance of machine intelligence (and forced by the drastic environmental pressures that would confront any humans who were to construct habitats beyond the Earth).

Darwin himself realised that "No living species will preserve its unaltered likeness into a distant futurity". We now know that "futurity" extends far further, and that alterations can occur far faster, than Darwin envisioned. And we know that the cosmos, through which life could spread, is far more extensive and varied than he envisaged. So humans are surely not the terminal branch of an evolutionary tree, but a species that emerged early in cosmic history, with special promise for diverse evolution. But this is not to diminish their status. We humans are entitled to feel uniquely important as the first known species with the power to mould its evolutionary legacy.

Mahzarin Banaji
Psychologist; Richard Clarke Cabot Professor of Social Ethics, Department of Psychology, Harvard University; Co-author, Blind Spot: Hidden Biases of Good People

A Solution for Collapsed Thinking: Signal Detection Theory

We perceive the world through our senses. The brain-mediated data we receive in this way form the basis of our understanding of the world. From this become possible the ordinary and exceptional mental activities of attending, perceiving, remembering, feeling, and reasoning. Via these mental processes we understand and act on the material and social world.

In the town of Pondicherry in South India, where I sit as I write this, many do not share this assessment. There are those, including some close to me, who believe there are extrasensory paths to knowing the world that transcend the five senses, and that untested "natural" foods and methods of acquiring information are superior to those based in evidence. On this trip, for example, I learned that they believe a man has been able to stay alive without any caloric intake for months (although his weight falls, but only when he is under scientific observation).

Pondicherry is an Indian Union Territory that was controlled by the French for 300 years (staving off the British in many a battle right outside my window) and was held on to until a few years after Indian independence. It has, in addition to numerous other points of attraction, become a center for those who yearn for spiritual experience, attracting many (both whites and natives) to give up their worldly lives to pursue the advancement of the spirit, to undertake bodily healing, and to invest in good works on behalf of a larger community.

Yesterday, I met a brilliant young man who had worked as a lawyer for eight years and who now lives in the ashram and works in their book sales division. Sure, you retort, the profession of the law would turn any good person toward spirituality, but I assure you that the folks here have given up wealth and professional lives of a wide variety of sorts to pursue this manner of life. The point is that seemingly intelligent people seem to crave non-rational modes of thinking, and the Edge question this year forced me to think not only about the toolkit of the scientist but about that of every person.

I do not mean to pick on any one city, and certainly not this unusual one in which so much good effort is put towards the arts and culture and on social upliftment of the sort we would admire. But this is a town that also attracts a particular type of European, American, and Indian — those whose minds seem more naturally prepared to believe that unprocessed "natural" herbs do cure cancer and that standard medical care is to be avoided (until one desperately needs chemo), that Tuesdays are inauspicious for starting new projects, that particular points in the big toe control the digestive system, that the position of the stars at the time of their birth led them to Pondicherry through an inexplicable process emanating from a higher authority and through a vision from "the mother", a deceased French woman, who dominates the ashram and surrounding area in death more than many successful politicians ever do in their entire lives.

These types of beliefs may seem extreme, but they are not considered as such in most of the world. Change the content and the underlying false manner of thinking is readily observed just about anywhere: the 22 inches of fresh snow that has fallen where I live in the United States while I'm away will no doubt bring forth beliefs of a god angered by crazy scientists touting global warming.

As I contemplate the single most powerful tool that could be put into the heads of every growing child and every adult seeking a rational path, scientists included, it is the simple and powerful concept of "signal detection". In fact, the Edge question this year happens to be one I've contemplated for a while; should anybody ever ask such a question, I have known the answer would be an easy one. I use Green and Swets' Signal Detection Theory and Psychophysics as the prototype, although the idea has its origins in earlier work among scientists concerned with the fluctuations of photons and their influence on visual detection, and with sound waves and their influence on audition.

The idea underlying the power of signal detection theory is simple: The world gives noisy data, never pure. Auditory data, for instance, are degraded for a variety of reasons having to do with the physical properties of the communication of sound. The observing organism has properties that further affect how those data will be experienced and interpreted, such as ability (e.g., a person's auditory acuity), the circumstances under which the information is being processed (e.g., during a thunderstorm), and motivation (e.g., disinterest). Signal detection theory allows us to put both aspects of the stimulus and the respondent together to understand the quality of the decision that will result given the uncertain conditions under which data are transmitted, both physically and psychologically.

To understand the crux of signal detection theory, each event of any data impinging on the receiver (human or other) is coded into four categories, providing a language to describe the decision:

                               Did the event occur?
                               Yes              No
Did the receiver    Yes        Hit              False Alarm
detect it?          No         Miss             Correct Rejection

Hit: A signal is present and the signal is detected (correct response)

False Alarm: No signal is present but a signal is detected (incorrect response)

Miss: A signal is present but no signal is detected (incorrect response)

Correct Rejection: No signal is present and no signal is detected (correct response)

If the signal is clear, like a bright light against a dark background, and the decision maker has good visual acuity and is motivated to watch for the signal, we should see a large number of Hits and Correct Rejections and very few False Alarms and Misses. As these properties change, so does the quality of the decision. Whether the stimulus is a physical one like a light or sound, or a piece of information requiring an assessment of its truth, the information almost always deviates from goodness.

It is under such ordinary conditions of uncertainty that signal detection theory yields a powerful way to assess the stimulus and respondent qualities, including the respondent's idiosyncratic criterion (or cutting score, "c") for decision-making. The criterion is the point along the distribution at which the respondent switches from saying "no" to saying "yes".
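
To make the four cells and the criterion concrete, here is a minimal Python sketch (my own illustration under the usual equal-variance Gaussian assumption, not something from the essay) that turns the counts of Hits, Misses, False Alarms and Correct Rejections into the standard sensitivity measure d' and the criterion c.

```python
# Minimal sketch (illustration only): converting the four signal-detection
# counts into sensitivity (d') and criterion (c), assuming the standard
# equal-variance Gaussian model of signal and noise.
from statistics import NormalDist

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF

    # Hit and false-alarm rates, with a small correction so that
    # proportions of exactly 0 or 1 do not send z() to infinity.
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    hit_rate = (hits + 0.5) / (n_signal + 1)
    fa_rate = (false_alarms + 0.5) / (n_noise + 1)

    d_prime = z(hit_rate) - z(fa_rate)               # sensitivity
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))    # response bias

    return d_prime, criterion

# Example: a fairly sensitive but somewhat conservative observer.
print(sdt_measures(hits=80, misses=20, false_alarms=10, correct_rejections=90))
```

A positive criterion describes a conservative respondent who needs relatively strong evidence before saying "yes"; a negative one describes a liberal respondent.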

The applications of signal detection theory have been in areas as diverse as locating objects by sonar, the quality of remembering, the comprehension of language, visual perception, consumer marketing, jury decisions, price predictions in financial markets, and medical diagnoses.

The reason signal detection theory should be in the toolkit of every scientist is that it provides a mathematically rigorous framework for understanding the nature of decision processes. The reason its logic should be in the toolkit of every thinking person is that it forces a completion of the four cells when analyzing the quality of any statement such as "Good management positions await Sagittarius this week".

Stewart Brand
Founder, The Whole Earth Catalog; Co-founder, The Well; Co-Founder, The Long Now Foundation; Author, Whole Earth Discipline

Microbes Run the World

That opening sentence of The New Science of Metagenomics sounds reveille for a new way of understanding biology, and maybe of understanding society as well.

The breakthrough was shotgun sequencing of DNA, the same technology that gave us the human genome years ahead of schedule. Starting in 2003, Craig Venter and others began sequencing large populations of bacteria. The thousands of new genes they found (double the total previously discovered) showed what proteins the genes would generate and therefore what function they had, and that began to reveal what the teeming bacteria were really up to. This "meta"-genomics revolutionized microbiology, and that revolution will reverberate through the rest of biology for decades.

Microbes make up 80 percent of all biomass, says Carl Woese. In one fifth of a teaspoon of seawater there are a million bacteria (and 10 million viruses), Craig Venter says, adding, "If you don't like bacteria, you're on the wrong planet. This is the planet of the bacteria." That means most of the planet's living metabolism is microbial. When James Lovelock was trying to figure out where the gases come from that make the Earth's atmosphere such an artifact of life (the Gaia Hypothesis), it was microbiologist Lynn Margulis who had the answer for him. Microbes run our atmosphere. They also run much of our body. The human microbiome in our gut, mouth, skin, and elsewhere harbors 3,000 kinds of bacteria with 3 million distinct genes. (Our own cells struggle by on only 18,000 genes or so.) New research is showing that our microbes-on-board drive our immune systems and important portions of our digestion.

Microbial evolution, which has been going on for over 3.6 billion years, is profoundly different from what we think of as standard Darwinian evolution, where genes have to pass down generations to work slowly through the selection filter. Bacteria swap genes promiscuously within generations. They have three different mechanisms for this "horizontal gene transfer" among wildly different kinds of bacteria, and thus they evolve constantly and rapidly. Since they pass on the opportunistically acquired genes to their offspring, what they do on an hourly basis looks suspiciously Lamarckian — the inheritance of acquired characteristics.

Such routinely transgenic microbes show that there's nothing new, special, or dangerous about engineered GM crops. Field biologists are realizing that the biosphere is looking like what some are calling a pangenome: an interconnected network of continuously circulated genes that is a superset of all the genes in all the strains that form a species. Bioengineers in the new field of synthetic biology are working directly with the conveniently fungible genes of microbes.

This biotech century will be microbe enhanced and maybe microbe inspired. "Social Darwinism" turned out to be a bankrupt idea. The term "cultural evolution" never meant much, because the fluidity of memes and influences in society bears no relation to the turgid conservatism of standard Darwinian evolution. But "social microbialism" might mean something as we continue to explore the fluidity of traits and vast ingenuity of mechanisms among microbes — quorum sensing, biofilms, metabolic bucket brigades, "lifestyle genes," and the like.

Confronting a difficult problem we might fruitfully ask, "What would a microbe do?"

Stefano Boeri
Architect; Professor, Politecnico of Milan; Visiting Professor at Harvard GSD; Editor-in-Chief, Abitare magazine

Proxemic of Urban Sexuality

In every room, in every house, in every street, in every city, movements, relations and spaces are also defined by the logics of attraction and repulsion between the sexualities of individuals.

Even the most insurmountable ethnic or religious barriers can suddenly disappear in the furor of intercourse; even the warmest and most cohesive community can rapidly dissolve in the absence of erotic tension.

To understand how our cosmopolitan and multi-gendered cities work, today we need a Proxemic of Urban Sexuality.

Nigel Goldenfeld
Physicist, University of Illinois at Urbana-Champaign

Because

When you are facing in the wrong direction, progress means walking backwards. History suggests that our world view undergoes disruptive change not so much when science adds new concepts to our cognitive toolkit, but when it takes away old ones. The sets of intuitions that have been with us since birth define our scientific prejudices, and not only are they poorly suited to the realms of the very large and very small, they also fail to describe everyday phenomena. If we are to identify where the next transformation of our world view will come from, we need to take a fresh look at our deep intuitions. In the two minutes that it takes you to read this essay, I am going to try to rewire your basic thinking about causality.

Causality is usually understood as meaning that there is a single, preceding cause for an event. For example, in classical physics, a ball may be flying through the air because it has been hit by a tennis racket. My 16-year-old car always revs much too fast, because the temperature sensor wrongly indicates that the engine temperature is cold, as if the car were in start-up mode. We are so familiar with causality as an underlying feature of reality that we hard-wire it into the laws of physics. It might seem that this would be unnecessary, but it turns out that the laws of physics do not distinguish between time going backwards and time going forwards. And so we make a choice about which sort of physical law we would like to have.

However, complex systems, such as financial markets or the Earth's biosphere, do not seem to obey causality. For every event that occurs, there are a multitude of possible causes, and the extent to which each contributes to the event is not clear, not even after the fact! One might say that there is a web of causation. For example, on a typical day, the stock market might go up or down by some fraction of a percentage point. The Wall Street Journal might blithely report that the stock market move was due to "traders taking profits" or perhaps "bargain-hunting by investors". The following day, the move might be in the opposite direction, and a different, perhaps contradictory, cause will be invoked. However, for each transaction, there is both a buyer and a seller, and their world views must be opposite for the transaction to occur. Markets work only because there is a plurality of views. To assign a single or dominant cause to most market moves is to ignore the multitude of market outlooks and to fail to recognize the nature and dynamics of the temporary imbalances between the numbers of traders who hold these differing views.

Similar misconceptions abound elsewhere in public debate and the sciences. For example, are there single causes for diseases? In some cases, such as Huntington's disease, the cause can be traced to a unique factor, in this case extra repetitions of a particular nucleotide sequence at a particular location in an individual's DNA, coding for the amino acid glutamine. However, even in this case, the age of onset and the severity of the condition are also known to be controlled by environmental factors and interactions with other genes. The web of causation has been for many decades a well-worked metaphor in epidemiology, but there is still little quantitative understanding of how the web functions or forms. As Krieger poignantly asked in a celebrated 1994 essay, "Has anyone seen the spider?"

The search for causal structure is nowhere more futile than in the debate over the origin of organismal complexity: intelligent design vs. evolution. Fueling the debate is a fundamental notion of causality, that there is a beginning to life, and that such a beginning must have had a single cause. On the other hand, if there is instead a web of causation driving the origin and evolution of life, a skeptic might ask: has anyone seen the spider?

It turns out that there is no spider. Webs of causation can form spontaneously through the concatenation of associations between the agents or active elements in the system. For example, consider the Internet. Although a unified protocol for communication (TCP/IP etc) exists, the topology and structure of the Internet emerged during a frenzied build-out, as Internet service providers staked out territory in a gold-rush of unprecedented scale. Remarkably, once the dust began to settle, it became apparent that the statistical properties of the resulting Internet were quite special: the time delays for packet transmission, the network topology, and even the information transmitted exhibit fractal properties.

However you look at the Internet, locally or globally, on short time scales or long, it looks exactly the same. Although the discovery of this fractal structure around 1995 was an unwelcome surprise, because the standard traffic control algorithms used by routers were designed on the assumption that all properties of the network dynamics would be random, the fractality is also broadly characteristic of biological networks. Without a master blueprint, the evolution of an Internet is subject to the same underlying statistical laws that govern biological evolution, and structure emerges spontaneously without the need for a controlling entity. Moreover, the resultant network can come to life in strange and unpredictable ways, obeying new laws whose origin cannot be traced to any one part of the network. The network behaves as a collective, not just the sum of its parts, and to talk about causality is meaningless because the behavior is distributed in space and in time.

Between 2:42pm and 2:50pm on May 6, 2010, the Dow Jones Industrial Average experienced a rapid decline and subsequent rebound of nearly 600 points, an event of unprecedented magnitude and brevity. This disruption was part of a tumultuous event on that day now known as the Flash Crash, which affected numerous market indices and individual stocks, even causing some stocks to be priced at unbelievable levels (e.g., Accenture was at one point priced at 1 cent).

With tick-by-tick data available for every trade, we can watch the crash unfold in slow motion, a film of a financial calamity. But the cause of the crash itself remains a mystery. The US Securities and Exchange Commission report on the Flash Crash was able to identify the trigger event (a $4 billion sale by a mutual fund), but it could provide no detailed understanding of why this event caused the crash. The conditions that precipitated the crash were already embedded in the market's web of causation, a self-organized, rapidly evolving structure created by the interplay of high-frequency trading algorithms. The Flash Crash was the birth cry of a network coming to life, eerily reminiscent of Arthur C. Clarke's science fiction story "Dial F for Frankenstein", which begins: "At 0150 GMT on December 1, 1975, every telephone in the world started to ring." I'm excited by the scientific challenge of understanding all this in detail, because … well, never mind. I guess I don't really know.

Dimitar D. Sasselov, Professor of Astronomy, Harvard University; Director, Harvard Origins of Life Initiative; Author, The Life of Super-Earths

The Other

The concept of 'otherness' or 'the Other' concerns how a conscious human being perceives their own identity: "Who am I, and how do I relate to others?" It is part of what defines the self and a constituent of self-consciousness. It is a philosophical concept widely used in psychology and social science. Recent advances in the life and physical sciences have opened the possibility of new and even unexpected expansions of this concept.

From the map of the human genome, to the diploid genomes of individuals, to the mapping of humans' geographic spread, and back in time to the Neanderthal genome, these are new tools to address the age-old problem of human unity and human diversity. Reading the 'life code' of DNA does not stop there – it places humans in the vast and colorful mosaic of Earth life. 'Otherness' is placed in a totally new light. Our microbiomes, the trillions of microbes on and in each of us that are essential to our physiology, become part of our self.

Astronomy and space science are intensifying the search for life on other planets – from Mars and the outer reaches of the Solar System to Earth-like planets and super-Earths orbiting other stars. The chances of success may hinge on our understanding of the possible diversity of the chemical basis of life itself. 'Otherness', then, not among DNA-encoded species, but among life forms using different molecules to encode traits: our 4-billion-year-old heritage of molecular innovation and design versus 'theirs'. This is a cosmic first encounter that we might experience in our labs first. Last year's glimpse of JCVI-syn1.0, the first bacterium controlled completely by a synthetic genome, is a prelude to this brave new field.

It is probably timely to ponder 'otherness' and its wider meaning yet again, as we embark on a new age of exploration. And as T.S. Eliot once predicted, we might arrive where we started and know our self for the first time.

Gary Marcus, Professor of Psychology and Director, NYU Center for Language and Music; Author, Guitar Zero

Cognitive Humility

Hamlet may have said that human beings are noble in reason and infinite in faculty, but in reality — as four decades of experiments in cognitive psychology have shown — our minds are very finite, and far from noble. Knowing the limits of our minds can help us become better reasoners.

Almost all of those limits start with a peculiar fact about human memory: although we are pretty good at storing information in our brains, we are pretty poor at retrieving it. We can recognize photos from our high school yearbooks decades later — yet find it impossible to remember what we had for breakfast yesterday. Faulty memories have been known to lead to erroneous eyewitness testimony (and false imprisonment), to marital friction (in the form of overlooked anniversaries), and even to death (skydivers, for example, have been known to forget to pull their ripcords — accounting, by one estimate, for approximately 6% of skydiving fatalities).

Computer memory is much better than human memory because early computer scientists discovered a trick that evolution never did: organizing information by means of a master map, in which each bit of information to be stored is assigned a specific, uniquely identifiable location in the computer's memory vaults. Human beings, in contrast, appear to lack such master memory maps and instead retrieve information in a far more haphazard fashion, using clues (or cues) to what they're looking for rather than knowing in advance where in the brain a given memory lies.

In consequence, our memories cannot be searched as systematically or as reliably as those of a computer (or an internet database). Instead, human memories are deeply subject to context. Scuba divers, for example, are better at remembering the words they study underwater when they are tested underwater (relative to when they are tested on land), even if the words have nothing to do with the sea.
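
The contrast between a master memory map and cue-driven retrieval can be made concrete with a small, purely illustrative Python sketch (my own construction, not the author's model; the addresses, cue sets, and stored items are invented). A dictionary lookup behaves like location-addressed computer memory, returning exactly the item stored at a known address, while a best-overlap match over encoding contexts mimics the context-sensitive recall seen in the scuba-diver experiments.

```python
# Location-addressed memory: every item lives at a unique, known address,
# so retrieval is exact and independent of context.
ram = {0x01: "breakfast: oatmeal", 0x02: "anniversary: June 4"}
assert ram[0x02] == "anniversary: June 4"

# Cue-driven memory (toy model): items are stored together with the context
# present at encoding, and recall is a fuzzy match against the cues available now.
episodes = [
    ({"underwater", "wetsuit", "cold"}, "word list studied while diving"),
    ({"classroom", "desk", "chalk"}, "word list studied on land"),
]

def recall(current_cues):
    """Return the stored item whose encoding context overlaps most with the
    current cues; the same memory may or may not surface, depending on context."""
    return max(episodes, key=lambda item: len(item[0] & current_cues))[1]

print(recall({"underwater", "cold"}))  # -> the list studied while diving
print(recall({"classroom"}))           # -> the list studied on land
```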

Sometimes this sensitivity to context is useful. We are better able to remember what we know about cooking when we are in the kitchen than when we are skiing, and vice versa.

But it also comes at a cost: when we need to remember something in a situation other than the one in which it was stored, it's often hard to retrieve it. One of the biggest challenges in education, for example, is to get children to take what they learn in school and apply it to real world situations, in part because context-driven memory means that what is learned in school tends to stay in school.

Perhaps the most dire consequence is that human beings tend almost invariably to be better at remembering evidence that is consistent with their beliefs than evidence that might disconfirm them. When two people disagree, it is often because their prior beliefs lead them to remember (or focus on) different bits of evidence. To consider something well, of course, is to evaluate both sides of an argument, but unless we also go the extra mile of deliberately forcing ourselves to consider alternatives—not something that comes naturally—we are more prone to recalling evidence consistent with a proposition than inconsistent with it.

Overcoming this mental weakness, known as confirmation bias, is a lifelong struggle; recognizing that we all suffer from it is an important first step. To the extent that we can be aware of this limitation in our brains, we can try to work around it, compensating for our inborn tendencies toward self-serving and biased recollections by disciplining ourselves to consider not just the data that might fit with our own beliefs, but also the data that might lead other people to have beliefs that differ from our own.

Eric R. Weinstein, Mathematician and Economist; Managing Director of Thiel Capital

Kayfabe

The sophisticated "scientific concept" with the greatest potential to enhance human understanding may be argued to come not from the halls of academe, but rather from the unlikely research environment of professional wrestling.

Evolutionary biologists Richard Alexander and Robert Trivers have recently emphasized that it is deception rather than information that often plays the decisive role in systems of selective pressures. Yet most of our thinking continues to treat deception as something of a perturbation on the exchange of pure information, leaving us unprepared to contemplate a world in which fakery may reliably crowd out the genuine. In particular, humanity's future selective pressures appear likely to remain tied to economic theory which currently uses as its central construct a market model based on assumptions of perfect information.

If we are to take selection more seriously within humans, we may fairly ask what rigorous system would be capable of tying together an altered reality of layered falsehoods in which absolutely nothing can be assumed to be as it appears. Such a system, in continuous development for more than a century, is known to exist and now supports an intricate multi-billion dollar business empire of pure hokum. It is known to wrestling's insiders as "Kayfabe".

Because professional wrestling is a simulated sport, all competitors who face each other in the ring are actually close collaborators who must form a closed system (called "a promotion") sealed against outsiders. With external competitors generally excluded, antagonists are chosen from within the promotion, and their ritualized battles are largely negotiated, choreographed, and rehearsed at a significantly decreased risk of injury or death. With outcomes predetermined under Kayfabe, betrayal in wrestling comes not from engaging in unsportsmanlike conduct, but from the surprise appearance of actual sporting behavior. Such unwelcome sportsmanship, which "breaks Kayfabe", is called "shooting", to distinguish it from the expected scripted deception, called "working".

Were Kayfabe to become part of our toolkit for the twenty-first century, we would undoubtedly have an easier time understanding a world in which investigative journalism seems to have vanished and bitter corporate rivals cooperate on everything from joint ventures to lobbying efforts. Perhaps the confusing battles between "freshwater" Chicago macroeconomists and "saltwater" Ivy League theorists could best be understood as happening within a single "orthodox promotion", given that both groups suffered no injury from failing (equally) to predict the recent financial crisis. The decades-old battle in theoretical physics over bragging rights between the "string" and "loop" camps would seem to be an even more significant example within the hard sciences of a collaborative intra-promotion rivalry, given the apparent failure of both groups to produce a quantum theory of gravity.

What makes Kayfabe remarkable is that it gives us potentially the most complete example of the general process by which a wide class of important endeavors transition from failed reality to successful fakery. While most modern sports enthusiasts are aware of wrestling's status as a pseudo sport, what few alive today remember is that it evolved out of a failed real sport (known as "catch" wrestling) which held its last honest title match early in the 20th century. Typical matches could last hours with no satisfying action, or end suddenly with crippling injuries to a promising athlete in whom much had been invested. This highlighted the close relationship between two paradoxical risks which define the category of activity which wrestling shares with other human spheres:

• A) Occasional but Extreme Peril for the participants.

• B) General Monotony for both audience and participants.

Kayfabrication (the process of transition from reality towards Kayfabe) arises out of attempts to deliver a dependably engaging product for a mass audience while removing the unpredictable upheavals that imperil participants. As such, Kayfabrication is a dependable feature of many of our most important systems that share the above two characteristics, such as war, finance, love, politics, and science.

Importantly, Kayfabe also seems to have discovered the limits of how much disbelief the human mind is capable of successfully suspending before fantasy and reality become fully conflated. Wrestling's system of lies has recently become so intricate that wrestlers have occasionally found themselves engaging in real-life adultery immediately following the introduction of a fictitious adulterous plot twist in a Kayfabe back-story. Eventually, even Kayfabe itself became a victim of its own success, as it grew to a level of deceit that could not be maintained when the wrestling world collided with outside regulators exercising oversight over major sporting events.

At the point Kayfabe was forced to own up to the fact that professional wrestling contained no sport whatsoever, it did more than avoid being regulated and taxed into oblivion. Wrestling discovered the unthinkable: its audience did not seem to require even a thin veneer of realism. Professional wrestling had come full circle to its honest origins by at last moving the responsibility for deception off of the shoulders of the performers and into the willing minds of the audience.

Kayfabe, it appears, is a dish best served client-side.

Neri Oxman, Architect, Researcher, MIT; Founder, Materialecology

It Ain't Necessarily So

Preceding the scientific method is a way of being in the world that defies the concept of a solid, immutable reality. Challenging this apparent reality in a scientific manner can potentially unveil a revolutionary shift in its representation and thus recreate reality itself. Such suspension of belief implies the temporary forfeiting of some explanatory power of old concepts and the adoption of a new set of assumptions in their place.

Reality is the state of things as they actually exist, rather than the state in which they may appear or are thought to be — a rather ambiguous definition, given our known limits to the observation and comprehension of concepts and methods. This ambiguity, captured by the aphorism that things are not what they seem, and again, with swing, in Sportin' Life's song It Ain't Necessarily So, is a thread that appears consistently throughout the history of science and the evolution of the natural world. In fact, ideas that have challenged accepted doctrines and created new realities have prevailed in fields ranging from warfare to flight technology, from physics to medicinal discoveries.

Recall the battle between David and Goliath mentioned in Gershwin's song. The giant warrior, evidently unbeatable by every measure of reality, is defeated at once by a lyre-playing underdog who challenges that apparent reality by devising an unconventional, almost scientific combat strategy.

The postulation that mighty opponents have feeble spots also holds true for the war against ostensibly incurable diseases. Edward Jenner's inoculation experiments with the cowpox virus to build immunity against the deadly scourge of smallpox gave rise to vaccination, which later helped prevent diseases such as polio and which underpins ongoing efforts against malaria and HIV. The very idea that an enemy — a disease — is to be overcome exclusively by brute force was defied by the counter-intuitive hypothesis that the disease itself — or a mild version of its toxins — might be internally memorized by the human immune system as a preventive measure.

Da Vinci's flying machine is another case in point. Challenging the myth of Icarus and its moral that humans should not attempt flight, Leonardo designed a hang glider inspired by his studies of the structure-function relationships of bird wings. It is one of the earliest flying machines known, on the basis of which our entire aviation industry has evolved.

Challenging what was assumed to be the nature of reality, conveniently supported by religious authorities, Copernicus disputed the Ptolemaic model of the heavens, which placed the Earth at the center of the universe, by proposing the heliocentric model, with the Sun at the center of our solar system. The Scientific Revolution of the 16th century followed, laying the foundations for modern science.