A short video that explains the modern physics of time travel. It’s a good topic to discuss, but what are the realities of time travel? I’m not talking about hoax ‘time travellers’ and the like, but the real physics behind it as we know it today. Enjoy the video – after all, it’s short and, most of all, free!
Is evolutionary science due for a major overhaul – or is talk of ‘revolution’ misguided?
When researchers at Emory University in Atlanta trained mice to fear the smell of almonds (by pairing it with electric shocks), they found, to their consternation, that both the children and grandchildren of these mice were spontaneously afraid of the same smell. That is not supposed to happen. Generations of schoolchildren have been taught that the inheritance of acquired characteristics is impossible. A mouse should not be born with something its parents have learned during their lifetimes, any more than a mouse that loses its tail in an accident should give birth to tailless mice.
If you are not a biologist, you’d be forgiven for being confused about the state of evolutionary science. Modern evolutionary biology dates back to a synthesis that emerged around the 1940s-60s, which married Charles Darwin’s mechanism of natural selection with Gregor Mendel’s discoveries of how genes are inherited. The traditional, and still dominant, view is that adaptations – from the human brain to the peacock’s tail – are fully and satisfactorily explained by natural selection (and subsequent inheritance). Yet as novel ideas flood in from genomics, epigenetics and developmental biology, most evolutionists agree that their field is in flux. Much of the data implies that evolution is more complex than we once assumed.
Some evolutionary biologists, myself included, are calling for a broader characterisation of evolutionary theory, known as the extended evolutionary synthesis (EES). A central issue is whether what happens to organisms during their lifetime – their development – can play important and previously unanticipated roles in evolution. The orthodox view has been that developmental processes are largely irrelevant to evolution, but the EES views them as pivotal. Protagonists with authoritative credentials square up on both sides of this debate, with big-shot professors at Ivy League universities and members of national academies going head-to-head over the mechanisms of evolution. Some people are even starting to wonder if a revolution is on the cards.
In his book On Human Nature (1978), the evolutionary biologist Edward O Wilson claimed that human culture is held on a genetic leash. The metaphor was contentious for two reasons. First, as we’ll see, it’s no less true that culture holds genes on a leash. Second, while there must be a genetic propensity for cultural learning, few cultural differences can be explained by underlying genetic differences.
Nonetheless, the phrase has explanatory potential. Imagine a dog-walker (the genes) struggling to retain control of a brawny mastiff (human culture). The pair’s trajectory (the pathway of evolution) reflects the outcome of the struggle. Now imagine the same dog-walker struggling with multiple dogs, on leashes of varied lengths, with each dog tugging in different directions. All these tugs represent the influence of developmental factors, including epigenetics, antibodies and hormones passed on by parents, as well as the ecological legacies and culture they bequeath.
The struggling dog-walker is a good metaphor for how EES views the adaptive process. Does this require a revolution in evolution? Before we can answer this question, we need to examine how science works. The best authorities here are not biologists but philosophers and historians of science. Thomas Kuhn’s book The Structure of Scientific Revolutions (1962) popularised the idea that sciences change through revolutions in understanding. These ‘paradigm shifts’ were thought to follow a crisis of confidence in the old theory that arose through the accumulation of conflicting data.
Then there’s Karl Popper, and his conjecture that scientific theories can’t be proven but can be falsified. Consider the hypothesis: ‘All sheep are white.’ Popper maintained that no amount of positive findings consistent with this hypothesis could prove it to be correct, since one could never rule out the chance that a conflicting data-point might arise in the future; conversely, the observation of a single black sheep would decisively prove the hypothesis to be false. He maintained that scientists should strive to carry out critical experiments that could potentially falsify their theories.
Everything from diet to air pollution to parental behaviour can influence gene expression
While Kuhn and Popper’s ideas are well-known, they remain disputed and contentious in the eyes of philosophers and historians. Contemporary thinking in these fields is better captured by the Hungarian philosopher Imre Lakatos in The Methodology of Scientific Research Programmes (1978):
The history of science refutes both Popper and Kuhn: on close inspection both Popperian crucial experiments and Kuhnian revolutions turn out to be myths.
Popper’s arguments might make logical sense, but they don’t quite map on to how science works in the real world. Scientific observations are susceptible to errors of measurement; scientists are human beings and get attached to their theories; and scientific ideas can be fiendishly complex – all of which makes evaluating scientific hypotheses a messy business. Rather than accepting that our hypotheses might be wrong, we challenge the methodology (‘That sheep’s not black – your instruments are faulty’), dispute the interpretation (‘The sheep’s just dirty’), or come up with tweaks to our hypotheses (‘I meant domesticated breeds, not wild mouflon’). Lakatos called such fixes and fudges ‘auxiliary hypotheses’; scientists propose them to ‘protect’ their core ideas, so that they need not be rejected.
This sort of behaviour is clearly manifest in scientific debates over evolution. Take the idea that new features acquired by an organism during its life can be passed on to the next generation. This hypothesis was brought to prominence in the early 1800s by the French biologist Jean-Baptiste Lamarck, who used it to explain how species evolved. However, it has long been regarded as discredited by experiment – to the point that the term ‘Lamarckian’ has a derogatory connotation in evolutionary circles, and any researchers expressing sympathy for the idea effectively brand themselves ‘eccentric’. The received wisdom is that parental experiences can’t affect the characters of their offspring.
Except they do. The way that genes are expressed to produce an organism’s phenotype – the actual characteristics it ends up with – is affected by chemicals that attach to them. Everything from diet to air pollution to parental behaviour can influence the addition or removal of these chemical marks, which switches genes on or off. Usually these so-called ‘epigenetic’ attachments are removed during the production of sperm and egg cells, but it turns out that some escape the resetting process and are passed on to the next generation, along with the genes. This is known as ‘epigenetic inheritance’, and more and more studies are confirming that it really happens.
Let’s return to the almond-fearing mice. The inheritance of an epigenetic mark transmitted in the sperm is what led the mice’s offspring to inherit the fear. In 2011, another extraordinary study reported that worms responded to exposure to a nasty virus by producing virus-silencing factors – chemicals that shut down the virus – but, remarkably, subsequent generations epigenetically inherited these chemicals via regulatory molecules (known as ‘small RNAs’). There are now hundreds of such studies, many published in the most prominent and prestigious journals. Biologists dispute whether epigenetic inheritance is truly Lamarckian or only superficially resembles it, but there is no getting away from the fact that the inheritance of acquired characteristics really does happen.
By Popper’s reasoning, a single experimental demonstration of epigenetic inheritance – like a single black sheep – should suffice to convince evolutionary biologists that it’s possible. Yet, by and large, evolutionary biologists have not rushed to change their theories. Rather, as Lakatos anticipated, we have come up with auxiliary hypotheses that allow us to retain our long-held beliefs (ie, that inheritance is pretty much explained by the transmission of genes across generations). These include the ideas that epigenetic inheritance is rare, that it does not affect functionally important traits, that it is under genetic control, and that it is too unstable to underpin the spread of traits through selection.
Unfortunately for the traditionalists, none of these attempts to bracket epigenetic inheritance look credible. It is now known to be widespread in nature, with more and more examples appearing every day. It affects functionally important features such as fruit size, flowering time and root growth in plants – and while only a fraction of epigenetic variants are adaptive, that’s no less true of genetic variation, so it’s hardly grounds for dismissal. In some systems where rates of epigenetic change have been measured carefully, such as the plant Arabidopsis thaliana, the pace has been found to be low enough to be selected and lead to cumulative evolution. Mathematical models have shown that systems with epigenetic inheritance evolve differently from those solely reliant on genetic inheritance – for instance, selection on epigenetic marks can cause changes in gene frequencies. There’s no longer any doubt that epigenetic inheritance pushes us to think about evolution in a different way.
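The dynamics those mathematical models describe can be sketched with a toy deterministic simulation (my own illustrative construction, not any published model): an epiallele E confers a fitness advantage s, but is ‘reset’ back to the baseline state e with probability mu at each transmission. When resetting is slow relative to selection, the mark spreads and persists; when resetting is fast, selection cannot hold on to it.

```python
def epiallele_trajectory(p0, s, mu, generations):
    """Deterministic frequency recursion for an epiallele E.

    E carries relative fitness 1 + s, but reverts (is 'reset') to the
    baseline state e with probability mu at each transmission.
    """
    p = p0
    history = [p]
    for _ in range(generations):
        w_bar = p * (1 + s) + (1 - p)       # mean fitness of the population
        p = p * (1 + s) * (1 - mu) / w_bar  # select, then reset a fraction mu
        history.append(p)
    return history

# A stable mark (slow resetting) spreads under selection...
stable = epiallele_trajectory(p0=0.1, s=0.1, mu=0.01, generations=200)
# ...while an unstable mark (fast resetting) is lost despite the same advantage.
unstable = epiallele_trajectory(p0=0.1, s=0.1, mu=0.5, generations=200)
print(round(stable[-1], 3), round(unstable[-1], 6))
```

In this toy recursion the stable mark settles near an equilibrium frequency of 1 − mu(1 + s)/s, while the unstable one goes extinct: exactly the point that careful measurements in Arabidopsis speak to, since what matters is whether the reset rate is low enough for selection to act cumulatively.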
Epigenetics is only part of the story. Through culture and society, all of us inherit knowledge and skills acquired by our parents. Evolutionary biologists have accepted this for at least a century, but until recently it was considered to be restricted to humans. That’s no longer tenable: creatures across the animal kingdom learn socially about diet, feeding techniques, predator avoidance, communication, migration, and mate and breeding-site choices. Hundreds of experimental studies have demonstrated social learning in mammals, birds, fish and insects.
In a single mating season, ‘fads’ can develop in the qualities that individuals find attractive in their partners
Among the most compelling data are studies that cross-fostered great tits and blue tits. When raised by the other species, these birds shifted numerous aspects of their behaviour towards the behaviour of their foster parent (including the height in trees at which they foraged, their choice of prey, foraging method, calls and songs, and even their choice of mate). Everyone had assumed that the behavioural differences between these two species were genetic, but it turns out that many are cultural traditions.
Animal cultures can be sustained for surprisingly long periods. Archaeological remains show that chimpanzees have used stone tools to crack open nuts for at least 4,300 years. However, as for epigenetic inheritance, it would be a mistake to assume that animal culture must exhibit gene-like stability to be evolutionarily important. In the course of a single mating season, ‘fads’ can develop in the qualities that individuals find attractive in their partners; the process has been experimentally demonstrated in fruit flies, fish, birds and mammals, and mathematical models show that such ‘mate-choice copying’ can strongly affect sexual selection.
Another illustration comes from studies of birdsong. When young male birds learn their songs (usually from nearby adult males), they modify the natural-selection pressures of genes that affect how songs are acquired (in males) and which songs are preferred (in females). The cultural transmission of song is known to promote the evolution of brood parasitism – where birds, such as cuckoos, don’t make nests but lay eggs in other birds’ nests – as some brood parasites rely on cultural learning to figure out whom to mate with. It also facilitates speciation, since preferences for particular birdsong ‘dialects’ help to maintain genetic differences between populations.
Likewise, the diverse, culturally learned foraging traditions of orcas – where different groups specialise in particular types of fish, seals or dolphins – are thought to be driving them to split into several species. Of course, culture reaches its zenith in our own species, where it is now well-established that our cultural habits have been a major source of natural selection on our genes. Dairy farming and milk consumption generated selection for a genetic variant that increased lactase (the enzyme that metabolises dairy products), while starchy agricultural diets favoured increased amylase (the corresponding enzyme that breaks down starch).
All this complexity can’t be reconciled with a strictly genetic currency for adaptive evolution, as many biologists now acknowledge. Rather, it points to an evolutionary process in which genomes (over hundreds to thousands of generations), epigenetic modifications and inherited cultural factors (over several, perhaps tens or hundreds of generations), and parental effects (over single-generation timespans) collectively inform how organisms adapt. These extra-genetic kinds of inheritance give organisms the flexibility to make rapid adjustments to environmental challenges, dragging genetic change in their wake – much like a rowdy pack of dogs.
Despite the excitement of all the new data, it’s unlikely to trigger an evolution revolution for the simple reason that science doesn’t work that way – at least, not evolutionary science. Kuhnian paradigm shifts, like Popper’s critical experiments, are closer to myths than reality. Look back at the history of evolutionary biology, and you will see nothing that resembles a revolution. Even Charles Darwin’s theory of evolution through natural selection took approximately 70 years to become widely accepted by the scientific community, and at the turn of the 20th century was viewed with considerable scepticism. Over the following decades, new ideas appeared, they were critically evaluated by the scientific community, and gradually became integrated with pre-existing knowledge. By and large, evolutionary biology was updated without experiencing great periods of ‘crisis’.
The same holds for the present. Epigenetic inheritance does not disprove genetic inheritance, but shows it to be just one of several mechanisms through which traits are inherited. I know of no biologist who wants to rip up the textbooks, or throw out natural selection. The debate in evolutionary biology concerns whether we want to extend our understanding of the causes of evolution, and whether that changes how we think about the process as a whole. In this respect, what is going on is ‘normal science’.
Why, then, are traditionally minded evolutionary biologists complaining about the misguided evolutionary radicals that lobby for paradigm shift? Why are journalists writing articles about scientists calling for a ‘revolution’ in evolutionary biology? If nobody actually wants a revolution, and scientific revolutions rarely happen anyway, what’s all the fuss about? The answer to these questions provides a fascinating insight into the sociology of evolutionary biology.
Revolution in evolution is a misattribution – a myth propagated by an unlikely alliance of conservative-minded evolutionists, creationists and the press. I don’t doubt that there are a small number of genuine, revolutionarily minded evolutionary radicals out there, but the vast majority of researchers working towards an extended evolutionary synthesis are simply ordinary, hardworking evolutionary biologists.
We all know that sensationalism sells newspapers, and articles that portend a major upheaval make for better copy. Creationists and advocates of ‘intelligent design’ also feed this impression, with propaganda that exaggerates differences of opinion among evolutionists and gives a false impression that the field of evolutionary biology is in turmoil. What’s more surprising is how commonly conservative-minded biologists play the ‘We’re under attack!’ card against their fellow evolutionists. Portraying intellectual opponents as extremist, and telling people that they are being attacked, are age-old rhetorical tricks to win debate or allegiance.
I had always associated such games with politics, not science, but now realise I was naive. Some of the behind-the-scenes shenanigans I have witnessed, seemingly designed to prevent new ideas from spreading by fair means or foul, have truly shocked me, and are out of kilter with practice in other fields that I know. Scientists, too, have careers and legacies at stake, as well as struggles for funding, power and influence. I worry that the traditionalists’ rhetoric is backfiring, creating confusion and inadvertently fuelling creationism by exaggerating division. Too many reputable scientists feel the need for change in evolutionary biology for all to be credibly dismissed as fringe elements.
If the extended evolutionary synthesis is not a call for revolution in evolution, then what is it, and why do we need it? To answer these questions, we need to recognise what Kuhn got right – namely, that every scientific field possesses shared ways of thinking, or ‘conceptual frameworks’. Evolutionary biology is no different, and our shared values and assumptions influence what data is collected, how that data is interpreted, and what factors are built into explanations for how evolution works.
That is why pluralism in science is healthy. Lakatos stressed that alternative conceptual frameworks – what he called different ‘research programmes’ – can be valuable to the extent that they encourage new hypotheses to be generated and tested, or lead to novel insights. That is the primary function of the EES: to nurture, or even open up, new lines of enquiry, and new productive ways of thinking.
What if some ways of building a fish are just more probable than others?
A good example concerns what’s known as ‘developmental bias’. Consider the intriguing cichlid fishes of East Africa. For tens, perhaps hundreds, of the cichlid species in Lake Malawi, there exists an independently evolved, ‘duplicate’ species in Lake Tanganyika, with a strikingly similar body shape and way of feeding. Such likenesses are usually explained through convergent evolution: random genetic variation has bubbled up as usual, but similar environmental conditions have selected the genes to produce equivalent results. The way that organisms grow and develop might limit which traits arise, but the variation itself is assumed to be essentially random.
However, the extraordinary level of parallel evolution seen in these two lakes suggests that something else might be going on. What if some ways of building a fish are just more probable than others? What if trait variation skews towards certain solutions? Selection would still be part of the explanation, but parallel evolution would be much more likely.
Cheek teeth (molars) in mammals provide some of the most convincing data for bias. Studies show that it’s possible to use a mathematical model, based on laboratory mice, to predict the size and number of teeth in a sample of 29 other rodent species. Rather than being free to make any shape or number of teeth, it appears that natural selection is pushing species along a highly specific pathway created by the mechanisms of development. The existence of exceptions – rodents such as voles with different ratios of teeth – demonstrates that the old way of thinking (that developmental ‘constraints’ restrict selection) isn’t quite right. The effect of development is both more subtle and more interesting: developmental mechanisms bias the landscape for selection, and help to determine which features evolve.
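The predictive logic can be illustrated with a heavily simplified sketch, loosely inspired by the inhibitory-cascade models fitted to mouse molars (the linear rule and the parameter below are illustrative assumptions, not the published model, which is fitted to developmental data): a single activator/inhibitor balance r sets the relative sizes of all three molars at once, so the whole tooth row is confined to a one-parameter family of shapes.

```python
def molar_proportions(r):
    """Toy linear cascade: relative areas of molars m1..m3 as a function of
    a single activator/inhibitor balance r (r = 1 gives equal-sized teeth).
    A size clipped at zero means the tooth fails to form at all.
    """
    return [max(0.0, 1 + (n - 1) * (r - 1)) for n in (1, 2, 3)]

print(molar_proportions(1.0))  # equal molars
print(molar_proportions(0.5))  # shrinking row: the third molar is lost
print(molar_proportions(1.5))  # growing row: back molars enlarged
```

While all three teeth are present, the middle molar is always the average of its neighbours, so many conceivable tooth rows simply cannot be built under this rule. That is the sense in which developmental mechanisms ‘bias the landscape’ rather than merely constraining it.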
Such studies are exciting as they help to make evolutionary biology a more predictive science. Why, then, have these ideas received comparatively little attention until recently? We come back to conceptual frameworks. Historically, evolutionary biologists have treated bias in phenotypic variation as a ‘constraint’ – an explanation for why evolution or adaptation has not occurred. The way that organisms grow restricts what sorts of features it is possible or adaptive to possess. Traditionally minded evolutionists have been far more reluctant to embrace a positive role for development as a cause of evolutionary direction and change.
It took a different perspective (in this instance, that of evolutionary developmental biology, so-called ‘evo devo’) to motivate this kind of experimentation. From the evo-devo perspective, bias partly explains what evolution and adaptation have occurred. Rodents’ teeth and fishes’ bodies look the way they do because the way that creatures grow makes those characteristics more likely to arise. Bias thus becomes a much more significant concept in evolutionary explanation. By bringing the phenomenon to the fore, the EES hopes it will be investigated.
The EES, at least as my collaborators and I frame it, is best viewed as an alternative research programme for evolutionary biology. Inspired by recent findings emerging within evolutionary biology and adjacent fields, the EES starts from the assumption that developmental processes play important roles as causes of novel (and potentially beneficial) phenotypic variation, causes of differences in fitness of those variants, and causes of inheritance. In contrast to how evolution has traditionally been conceived, in the EES the burden of creativity in evolution does not rest on natural selection alone. This alternative way of thinking is being used to generate fresh hypotheses and establish new research agendas. It’s early days, but there are already signs that this research is starting to yield dividends.
If evolution is not to be explained solely in terms of changes in gene frequencies; if previously rejected mechanisms such as the inheritance of acquired characteristics turn out to be important after all; and if organisms are acknowledged to bias evolution through development, learning and other forms of plasticity – does all this mean a radically different and profoundly richer account of evolution is emerging? No one knows: but from the perspective of our adapting dog-walker, evolution is looking less like a gentle genetic stroll, and more like a frantic struggle by genes to keep up with strident developmental processes.
First human frozen by cryogenics could be brought back to life ‘in just TEN years’, claims expert
Hundreds worldwide have had their corpses frozen in a cryogenic chamber.
They are preserved after death in the hope they can be revived in the future.
An expert has claimed scientists could reanimate one of these corpses within the next ten years.
Human corpses frozen by cryogenics could be brought back to life in the next decade, an expert has claimed.
Around 350 people worldwide have had their corpse preserved at low temperatures immediately after death in the hope it can be revived in the future.
Dennis Kowalski, president of the Michigan-based Cryonics Institute – an organisation fronting the human freezing process – has now claimed scientists could reanimate one of these corpses within the next ten years.
Speaking to the Daily Star, Mr Kowalski, 49, said: ‘If you take something like CPR, that would have seemed unbelievable 100 years ago. Now we take that technology for granted.
‘Cryonically bringing someone back to life should definitely be doable in 100 years, but it could be as soon as ten.’
Mr Kowalski’s Cryonics Institute has almost 2,000 people signed up to be frozen after they die.
The firm already has 160 patients frozen in specialised tanks of liquid nitrogen at its headquarters.
Mr Kowalski said that when the first patients are reanimated depends on the rate at which modern medicine improves.
‘It depends on how much technology like stem-cells advances,’ he said.
Cryonics, also known as cryogenics and cryopreservation, is the art of freezing a dead body or body parts in order to preserve them.
CRYONICS: THE FACTS
WHAT IS CRYONICS?
The deep freezing of a body to -196°C (-321°F).
Anti-freeze compounds are injected into the corpse to stop cells being damaged.
The hope is that medical science will advance enough to bring the patient back to life.
Two main organisations carry out cryonics in the US: Alcor, in Arizona, and the Cryonics Institute, in Michigan.
Russian firm KrioRus is one of two facilities outside the US to offer the service, alongside Alcor’s European laboratory in Portugal.
HOW IS IT MEANT TO WORK?
The process can only take place once the body has been declared legally dead.
Ideally, it begins within two minutes of the heart stopping, and no more than 15 minutes after.
The body must be packed in ice and injected with chemicals to reduce blood clotting.
At the cryonics facility, it is cooled to just above 0°C and the blood is replaced with a solution to preserve organs.
The body is injected with another solution to stop ice crystals forming in organs and tissues, then cooled to -130°C.
The final step is to place the body into a container which is lowered into a tank of liquid nitrogen at -196°C.
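The temperatures quoted in this box are consistent across the two scales; a one-line conversion confirms them:

```python
def c_to_f(c):
    """Convert a temperature from Celsius to Fahrenheit."""
    return c * 9 / 5 + 32

print(round(c_to_f(-196)))  # liquid-nitrogen storage temperature → -321
print(round(c_to_f(-130)))  # intermediate cooling stage → -202
```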
WHAT’S THE CHANCE OF SUCCESS?
Many experts say there is none.
Organs such as the heart and kidneys have never been successfully frozen and thawed.
It is even less likely that a whole body, and the brain, could be frozen and thawed without irreversible damage.
HOW MUCH DOES IT COST?
Charges at the Cryonics Institute start at around £28,000 ($35,000) to ‘members’ for whole-body cryopreservation.
Rival group Alcor charges £161,000 ($200,000) while KrioRus’ procedure will set you back £29,200 ($37,600).
HOW LONG BEFORE PEOPLE CAN BE BROUGHT BACK TO LIFE?
Cryonics organisations claim it could be decades or even centuries.
However, medical experts say once cells are damaged during freezing and turned to ‘mush’ they cannot be converted back to living tissue, any more than you can turn a scrambled egg back into a raw egg.
Advocates see it as a miracle procedure to cheat death, with the hope that they will be revived once medical science has progressed far enough to cure whatever killed them.
Currently, it is only legal to freeze someone when they have just been declared dead.
The freezing process must begin as soon as the patient dies in order to prevent brain damage, with facilities currently available in Russia, the US and Portugal.
In the procedure, the body is cooled in an ice bath to gradually reduce its temperature.
Experts then drain the blood and replace it with an anti-freeze fluid to stop harmful ice crystals forming in the body.
Does Life In 2018 Live Up To What We Predicted A Century Ago?
People in the early 20th century were hopeful about the future innovation might bring. The technology that came out of World War I, and the growing potential brought by electricity (half of all U.S. homes had electric power by 1925) had many looking ahead to the coming century. Futurists of the early 1900s predicted an incredible boom in technology that would transform human lives for the better.
In fact, many of those predictions for the future in which we live weren’t far off, from the proliferation of automobiles and airplanes to the widespread transmission of information. Of course, the specifics of how those devices would work sometimes fell wide of the mark. Yet these predictions show us just how much our technology has progressed in just a century — and just how much further more innovation could take us.
Calling the Future
On a cool February day in 1917, storied inventor Alexander Graham Bell gave the graduating class of McKinley Manual Training School a rousing speech that would later sound a bit like prophecy.
“Now, it is very interesting and instructive to look back over the various changes that have occurred and trace the evolution of the present from the past,” Bell said, after recalling the incredible transformation wrought by electricity and automobiles alone. “By projecting these lines of advance into the future, you can forecast the future, to a certain extent, and recognize some of the fields of usefulness that are opening up for you.”
In 1876, Bell himself had patented the device known as the telephone, which used wires to transmit the sound of human speech. As this device spread, its capabilities allowed voices to cross enormous distances. In 1915, one such “wireless telephony” system had allowed a Virginia man to speak to another in Paris while a man in Honolulu listened in — a distance of 4,900 miles (about 7,886 kilometers), setting the record for the longest distance communication at that time.
Bell marveled at this achievement and the change it had already created, predicting that “this achievement surely foreshadows the time when we may be able to talk with a man in any part of the world by telephone and without wires.” At the time of Bell’s speech, the U.S. had an estimated 11.7 million working telephones; by the year 2000, that number had risen to nearly 103 million.
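Those two figures imply a remarkably steady long-run expansion; a quick calculation (using the article’s numbers, and treating 1917 and 2000 as the endpoints) gives the implied average annual growth rate:

```python
# Compound annual growth rate implied by 11.7 million working telephones
# in 1917 and roughly 103 million in 2000 (figures as quoted above).
start, end = 11.7e6, 103e6
years = 2000 - 1917
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 2.7% per year, sustained for over eight decades
```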
Extrapolating forward, Bell predicted a future in which this technology allowed people to do pretty much anything remotely: “We shall probably be able to perform at a distance by wireless almost any mechanical operation that can be done at hand,” he said. And he wasn’t wrong.
Transportation of the Future
People a century ago were obsessed with the travel of the future. By 1914, the Ford Motor Company had developed the first moving assembly line, allowing the company to produce 300,000 cars in a single year. With transit beginning to transform society, futurists began imagining a world in which every person from Miami to Moscow could own their very own automobile. In that regard, they weren’t too far off — 95 percent of American households own cars, according to a 2016 government report. But those imagined automobiles looked a bit different from the ones we know today.
On January 6, 1918, the headline of an article in The Washington Times announced that the “Automobile of Tomorrow Will Be Constructed Like a Moving Drawing Room.” The author was writing about a prediction in Scientific American that described the car of the future. It would be water-tight and weather-proof, with sides made entirely of glass, and seats that could be moved anywhere in the vehicle. It would be decked out with power steering, brakes, heating, and a small control board for navigation. A finger lever would replace the steering wheel. Other designs imagined that cars would roll around on just three wheels, or on air-filled spheres to remove the need for shocks.
Future-forecasters of the early 1900s were enthralled by the idea that our everyday travel would not be confined to land. Take, for example, the series of postcards produced between 1899 and 1910 by French artist Jean-Marc Côté and his collaborators, who seemed confident that by the year 2000, we would have already colonized both sky and sea — and recruited some of their residents for our transit purposes.
Air travel was foremost in people’s minds: The Wright brothers made their first successful flight of a powered airplane in 1903, spurring other inventors and engineers to test innumerable aircraft designs before World War I. As such, it’s not surprising that Côté’s minute works imagined that, by the year 2000, nearly every form of transportation would be via air. Aerial taxi services, floating dirigible battleships, a flying postman, and air-based public transportation all appear in the whimsical depictions of our predicted current day.
Some craft, like an aerial rescue service or planes outfitted for warfare, are now an everyday part of military forces (though we don’t yet have the “French invisible aeroplane” that Scientific American promised was forthcoming in 1915).
Indeed, personal flying machines are a prominent feature of the 21st century as envisioned from the 19th and 20th centuries, particularly the notion that personal flying cars would become commonplace. Forward-looking Victorians, such as the artist Albert Robida in 1882, assumed the skies would be thick with flying cars by 2018.
In the May 1923 issue of Science and Invention, science fiction writer Hugo Gernsback described his vision for these flying cars, which he dubbed the “helicar,” as a solution to the automobile traffic he already saw jamming the streets of New York City:
The only practical solution is to combine the automobile with an airplane and this no doubt will happen during the next few decades. The Helicopter Automobile or, for short, the helicar, will not take up very much more room than the present large 7-passenger automobile, nor will it weigh much more than our present-day car, but instead of rolling down the avenue, you will go straight up in the air, and follow the air traffic lines, then descend at any place you wish.
We might not yet have a flying machine parked in every garage, but organizations including Uber, NASA, the Russian defense company Kalashnikov, and Toyota (which hoped to showcase one at the 2020 Olympics), along with numerous smaller companies, are developing personal flying cars, so this, too, may not be far off.
Alexander Graham Bell addressed the possibility of transportation by air, noting that travel by boat was cheaper than travel by rail because no tracks had to be laid. Bell suggested that a “possible solution of the problem over land may lie in the development of aerial locomotion.” He continued: “However much money we may invest in the construction of huge aerial machines carrying many passengers, we don’t have to build a road,” a sentiment echoed by one of his fictional successors.
Technology Gets Personal
In 1900, Smithsonian curator and writer John Elfrith Watkins, Jr., penned an article titled “What May Happen in the Next Hundred Years” for The Ladies’ Home Journal. Looking ahead to the new century, Watkins imagined a world in which technology wasn’t left in the hands of industry or the military; instead, it would be redirected to entertain and convenience everyday people.
Though he didn’t foresee television in its current form, Watkins predicted that technology would one day bring distant concerts and operas to private homes, sounding “as harmonious as though enjoyed from a theatre box,” and that “persons and things of all kinds will be brought within focus of cameras connected electrically with screens at opposite ends of circuits, thousands of miles at a span.” He also predicted that color photographs would one day be quickly transmitted around the world, and that “if there be a battle in China a hundred years hence snapshots of its most striking events will be published in the newspapers an hour later.” One can only guess what he would have thought of the selfie.
Watkins imagined that technology would transform our homes and diets. Though the mechanically-cooled refrigerator wasn’t invented until 1925, and wouldn’t become widely used until the 1940s, Watkins correctly predicted that “refrigerators will keep great quantities of food fresh for long intervals,” and that “fast-flying refrigerators on land and sea” would deliver fruits and vegetables from around the world to provide produce out-of-season. He even called the development of fast-food delivery, anticipating “ready-cooked meals… served hot or cold to private houses.” He believed these meal deliveries would replace home-cooking entirely (for some city-dwellers with Seamless accounts, that’s not too far off), and might arrive by pneumatic tubes as well as by “automobile wagons.”
Some of Watkins’ predictions might have been close to reality, but he was pretty far off about other aspects of life in the 21st century. He thought that man would have exterminated pests like roaches, mice, and mosquitoes, as well as all wild animals, which would “exist only in menageries.” This prediction was surprisingly common in the early 1900s, and might have been a reaction to then-recent extinctions like that of the quagga (1883), the passenger pigeon (1914), and the thylacine (1936). Though we are now going through another global extinction caused by human activity, we can be grateful that we haven’t quite reached the level of extinction most Victorian futurists expected.
Watkins also thought that we would have eliminated the letters C, X, and Q from the everyday alphabet as “unnecessary,” and that humans would essentially make ourselves into a super-species, with physical education starting in the nursery, until “a man or woman unable to walk ten miles at a stretch will be regarded as a weakling.” Unfortunately, our global obesity problem shows the reality is, in fact, quite the opposite.
Thematically, though, these predictions are sound: As the use of electricity spread, and technology like automobiles and telephones became more affordable to use, Watkins could envision an age in which technology was entirely integrated into our lives. To futurists of the early 1900s, it seemed obvious that robots and automation would be essential to 21st century people, serving as our chauffeurs, cleaning the house, scheduling the laundry, and even electrically transmitting handshakes.
Alexander Graham Bell also predicted this trend, and he thought it heralded something particularly promising for the McKinley graduates he addressed in 1918. Foreseeing the rise of an industry centered around technology and an exploding need for scientists and engineers, he told them: “It is safe to say that scientific men and technical experts are destined in the future to occupy distinguished and honorable positions in all the countries of the world. Your future is assured.”
A Future of Clean Energy
Perhaps the most surprising predictions from the past century regard fossil fuels and the environment. Yes, today some people still resist transitioning away from fossil fuels and ignore the scientific consensus on climate change. But bright minds of the early 20th century were already theorizing that we would one day have to quit our fossil fuel habit.
As early as 1896, scientist Svante Arrhenius calculated that doubling the concentration of carbon dioxide in the atmosphere would raise Earth’s temperature between 8 and 9 degrees Celsius. Arrhenius was inspired by the startling discovery of his friend Arvid Högbom, who realized that human activities were releasing carbon dioxide at roughly the same rate as natural processes. Because of the rate at which industrial countries burned coal in 1896, Arrhenius believed human-caused warming wouldn’t reach problematic levels for thousands of years. But by the time he published his 1908 book Worlds in the Making, an attempt to explain the evolution of the universe to a popular audience, that rate had increased so much that Arrhenius was convinced that the amount of carbon dioxide in the atmosphere could double within a few centuries.
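Arrhenius’s key insight, still used in simplified climate models today, is that warming scales with the logarithm of CO2 concentration: each doubling of CO2 adds roughly the same temperature increment. A minimal sketch of that relation (the 3 °C-per-doubling sensitivity used here is a modern round figure, not Arrhenius’s own, which was considerably higher):

```python
import math

def warming_from_co2(c_ratio, per_doubling=3.0):
    """Estimated surface warming (deg C) for a CO2 concentration ratio
    C/C0, using the logarithmic relation Arrhenius identified:
    temperature rises arithmetically as CO2 rises geometrically.
    per_doubling is the climate sensitivity per doubling of CO2;
    3.0 deg C is a modern round figure for illustration only."""
    return per_doubling * math.log2(c_ratio)

# A doubling of CO2 (ratio 2.0) yields exactly the sensitivity value:
print(warming_from_co2(2.0))                    # 3.0
# Today's ~420 ppm against a preindustrial ~280 ppm baseline:
print(round(warming_from_co2(420 / 280), 2))    # 1.75
```

The logarithmic form also explains Arrhenius’s timeline revisions: as coal burning accelerated, the projected date for a doubling (and hence for the full per-doubling warming) moved dramatically closer.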
Scientists as a whole wouldn’t come around to Arrhenius’ ideas, or recognize that burning carbon-based fuels had an adverse effect on our planet, for more than half a century. Yet even before scientists understood the climate effects of fossil fuels, futurists were predicting that we would have to drop our use of coal and oil before long. “Coal and oil are going up [in usage] and are strictly limited in quantity,” Alexander Graham Bell said in his February 1917 speech. He continued:
We can take coal out of a mine, but we can never put it back. We can draw oil from subterranean reservoirs, but we can never refill them again. We are spendthrifts in the matter of fuel and are using our capital for our running expenses. In relation to coal and oil, the world’s annual consumption has become so enormous that we are now actually within measurable distance of the end of the supply. What shall we do when we have no more coal or oil!
He went on to note that hydropower was, at the time, limited, and implied that one day it might be possible to generate energy from the tides or waves, or “the employment of the sun’s rays directly as a source of power.”
Bell wasn’t the only one who was sure we would have to find a new source of energy in the next century. In 1917, when a severe coal shortage in the U.S. caused people to call for the resource’s conservation, one writer for the Chicago News asserted that stockpiling coal would ultimately be foolish. He insisted that worrying about the supply of coal would soon be like fretting over the supply of tallow candles: pointless.
“These gifted lunatics who are worrying about the coal supply are in the same class,” the Chicago News writer insisted. “It doesn’t occur to them that in a hundred years people will be saying, ‘Our grandfathers, the poor boobs, actually used coal for heating purposes!’”
We’re not laughing quite yet. According to the U.S. Energy Information Administration (EIA), the U.S. still gets 17 percent of its energy from coal. Another 28 percent comes from petroleum products, and 33 percent from natural gas; only 12 percent comes from the renewable sources that the Chicago News writer, who was sure we’d find a way “to put the sun’s energy in storage, and pump it into people’s houses thru pipes,” expected to dominate by now. Globally, coal accounts for about 27 percent of the world’s energy production, and renewable energy about 24 percent.
The good news is that this distribution is changing as renewable energy becomes cheaper than fossil fuels, edging us ever closer to the bright future that 20th-century minds thought we’d be living in. Fingers crossed the whale-bus will be next.