
Eclipse...


Topics: Astronomy, Astrophysics, Philosophy, Planetary Science, Space Exploration

There will be a partial eclipse here in Greensboro. I purchased these glasses (six pairs) in 2017 for ANOTHER partial eclipse that I missed due to working in the lab during my first year in grad school. Nano took precedence over Astro. According to Time and Date dot com, the current show starts around 1:56 p.m. and ends around 4:28 p.m.

I will look particularly at my Texas box turtle, "Speedy," to see how she reacts when the show starts. Animals tend to head for their shelters during an eclipse because it looks like night; she has a faux log she likes to go under. She's far more accurate than Punxsutawney Phil. I've never fully understood the legend and lore, but like many practices that make the world scratch its head, this is a "thing" in America.

Oh, and it's not a sign of the Apocalypse. Solar and lunar eclipses are natural occurrences that superstition has, unfortunately, promoted to disastrous results. It comes with the territory of having a moon. Venus, an oven that would make Hell blush, has no moon as far as we know; Mars, should we ever colonize it, has two.


(Image: (c) Alan Dyer/VW Pics/UIG Getty Image)

On April 8, 2024, a total solar eclipse will be visible across North America. 

Our total eclipse 2024 guide tells you everything you need to know about the phenomenon, from where to see it to why it's so special. If you can't catch the eclipse in person, you can watch the total solar eclipse live here on Space.com.

During a total eclipse, the moon appears almost exactly the same size as the sun and blocks the entire disk for a few minutes — known as totality. 

The 115-mile-wide (185 kilometers) path of totality will cross three states in Mexico, 15 U.S. states and four states in southeast Canada.

A total solar eclipse is coming to North America. Daisy Dobrijevic, Contributions from Brett Tingley, Space.com

Read more…

Boltwood Estimate...


Credit: Public Domain

Topics: Applied Physics, Education, History, Materials Science, Philosophy, Radiation, Research

We take for granted that Earth is very old, almost incomprehensibly so. But for much of human history, estimates of Earth’s age were scattershot at best. In February 1907, a chemist named Bertram Boltwood published a paper in the American Journal of Science detailing a novel method of dating rocks that would radically change these estimates. In mineral samples gathered from around the globe, he compared lead and uranium levels to determine the minerals’ ages. One was a bombshell: A sample of the mineral thorianite from Sri Lanka (known in Boltwood’s day as Ceylon) yielded an age of 2.2 billion years, suggesting that Earth must be at least that old as well. While Boltwood was off by more than 2 billion years (Earth is now estimated to be about 4.5 billion years old), his method undergirds one of today’s best-known radiometric dating techniques.
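The modern uranium-lead method that Boltwood's work anticipated can be sketched in a few lines. Under first-order decay, a mineral's age follows from its radiogenic lead-to-uranium atomic ratio; the decay constant below is for U-238, and the example ratio is hypothetical, chosen only to land near Boltwood's 2.2-billion-year figure (this is an editorial illustration, not Boltwood's original calculation, which predated knowledge of decay constants and isotopes):

```python
import math

# Decay constant of U-238, derived from its ~4.468-billion-year half-life.
HALF_LIFE_U238_YEARS = 4.468e9
LAMBDA_U238 = math.log(2) / HALF_LIFE_U238_YEARS

def lead_uranium_age(pb_per_u: float) -> float:
    """Age in years from the radiogenic Pb/U atomic ratio, assuming all
    lead in the sample is radiogenic and none was gained or lost."""
    return math.log(1.0 + pb_per_u) / LAMBDA_U238

# A hypothetical ratio of ~0.4 lead atoms per uranium atom dates a
# mineral to roughly 2.2 billion years.
print(round(lead_uranium_age(0.4) / 1e9, 2))  # 2.17
```

The logic is the same as Boltwood's insight: the more lead relative to uranium, the older the rock.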

In the Christian world, Biblical cosmology placed Earth’s age at around 6,000 years, but fossil and geology discoveries began to upend this idea in the 1700s. In 1862, physicist William Thomson, better known as Lord Kelvin, used Earth’s supposed rate of cooling and the assumption that it had started out hot and molten to estimate that it had formed between 20 and 400 million years ago. He later whittled that down to 20-40 million years, an estimate that rankled Charles Darwin and other “natural philosophers” who believed life’s evolutionary history must be much longer. “Many philosophers are not yet willing to admit that we know enough of the constitution of the universe and of the interior of our globe to speculate with safety on its past duration,” Darwin wrote. Geologists also saw this timeframe as much too short to have shaped Earth’s many layers.

Lord Kelvin and other physicists continued studies of Earth’s heat, but a new concept — radioactivity — was about to topple these pursuits. In the 1890s, Henri Becquerel discovered radioactivity, and the Curies discovered the radioactive elements radium and polonium. Still, wrote physicist Alois F. Kovarik in a 1929 biographical sketch of Boltwood, “Radioactivity at that time was not a science as yet, but merely represented a collection of new facts which showed only little connection with each other.”

February 1907: Bertram Boltwood Estimates Earth is at Least 2.2 Billion Years Old, Tess Joosse, American Physical Society

Read more…

Brookhaven and Fake News...


Climate of fear: Anti-science protestors led to the closure of the High Flux Beam Reactor at Brookhaven National Laboratory in the US 25 years ago, using tactics that are widespread today. (Courtesy: iStock/DanielVilleneuve)

Topics: Biology, Cancer, Carl Sagan, Civilization, Climate Change, Philosophy, Physics

I typically don't comment on articles, but this one resonated with my memories of Carl Sagan desperately trying to raise the critical thinking skills of an entire nation with "The Demon-Haunted World: Science as a Candle in the Dark." The host of Cosmos would succumb to pneumonia as a consequence of bone marrow disease. This year I will be 62, the age Carl was when he passed away, but not as accomplished as he was in the six decades we all had access to him.

The framework of our current duress was already here in the form of celebrity worship, gossip columns, and talk shows where sensationalism equaled eyeballs, just as the Internet rouses the primitive lizard portion of our brains to be afraid, get angry, and "buy-purchase-consume" products (a friend who's a sound engineer likes to say that a lot).

Underhand tactics by environmental activists led to the closure of a famous physics facility 25 years ago. We can still learn much from the incident, says Robert P Crease.

Fake facts, conspiracy theories, nuclear fear, science denial, baseless charges of corruption, and the shouting down of reputable health officials: all these things happened 25 years ago, long before the days of social media, in a bipartisan, celebrity-driven episode of science denial. Yet the story offers valuable lessons about what works and what does not (mostly the latter) for anyone wanting to head off such incidents.

The episode in question concerned one of the more valuable scientific facilities in the US, the High Flux Beam Reactor (HFBR) at the Brookhaven National Laboratory. As I mentioned in a previous column and in my book The Leak, the HFBR was a successful research instrument that was used to make medical isotopes and study everything from superconductors to proteins and metals. “Experimentalists saw the reactor as the place to go,” recalls the physicist William Magwood IV, then at the US Department of Energy.

But in 1997, lab scientists discovered a leak of water from a pool located in the same building as the reactor, where its spent fuel was stored. The leak contained tritium, a radioactive isotope of hydrogen that decays with a half-life of about 12 years, releasing low-energy electrons that can be stopped by a few sheets of paper. The total amount of tritium in the leak was about that in typical self-illuminating “EXIT” signs.
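For scale, tritium's short half-life means the leaked activity fades quickly on human timescales. A minimal sketch of the first-order decay law, using the commonly cited 12.3-year half-life (the 25-year figure below is simply the time elapsed since the closure, not a number from the article):

```python
def tritium_fraction_remaining(years: float, half_life_years: float = 12.3) -> float:
    """Fraction of an initial tritium quantity left after `years`,
    from the first-order decay law N(t) = N0 * 2**(-t / T_half)."""
    return 2.0 ** (-years / half_life_years)

# After one half-life, half remains; after the ~25 years since the
# HFBR's closure, roughly a quarter of the leaked tritium is left.
print(round(tritium_fraction_remaining(12.3), 2))  # 0.5
print(round(tritium_fraction_remaining(25.0), 2))  # 0.24
```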

The protestors’ tactics are a familiar part of today’s political environment: tell people they are in danger and insist that anyone who says otherwise is lying.

The article goes on to recount the actor Alec Baldwin using his celebrity to put a ten-year-old child on The Montel Williams Show to claim that the tritium and the research facility caused his cancer. It wasn't true, but it was LOUD, drowning out experts who are used to spirited peer review and erudite discussions of research, not tears and gnashing of teeth.

Montel Williams ended his talk show after announcing that he had multiple sclerosis. Alec Baldwin, though I enjoyed his SNL skits, has other pressing issues.

I have a physicist friend who's using tritium in his research with optical tweezers, separating isotopes to detect and treat cancers, among other applications. I am opting not to give his website as those same elements described in the article about Brookhaven National Labs have metastasized into our current societal mass psychosis. If his research leads to your cancer cure, you can thank him later.

Twenty-five years ago, we weren't as far along in climate disruption as we are now. Twenty-five years ago, CNN was 19 years old, and its clones, Fox and MSNBC, were 3 years old. Five years after the Y2K scare (exquisitely setting us up for election 2000 and 9/11), humanity further siloed itself into warring tribes, first posting on Internet bulletin boards and MySpace, then progressing to Facebook, Twitter (now X), and their myriad progeny.

A side note: CERN would go on to discover the Higgs boson because we, in the spirit of fiscal stewardship, canceled the Superconducting Super Collider in Waxahachie, Texas, 48 kilometers south of Dallas. Peter Higgs and François Englert owe their 2013 Nobel Prize in Physics to Switzerland. U-S-A. U-S-A.

How much further along in cancer research, and in nuclear energy as an alternative to fossil fuels, would we be if, before Facebook and the former Twitter, we had exercised a little critical thinking and common sense? I'm not talking about tritium but fission reactors, which we know how to build (fusion, though cleaner and less radioactive, is still far off). Yet environmental activists have scared anyone out of building newer and safer facilities that might have had some positive impact on our warming climate. To paraphrase a famous saying, "Don't let the perfect be the enemy of the good." Our air quality improved during the pandemic, so the logic leads to upgrading public transportation to something matching other countries that rely on it more than we do, or, within our borders, the subway systems in New York, New Jersey, Philadelphia, or Washington, DC. Hold out for the perfect, and you end up doing nothing of any importance. We could replace the fission reactors one by one as fusion comes online.

That is what enrages and disappoints me.

The American reactor that was closed by fake news, Robert P Crease, Physics World

Read more…

The Wine of Consciousness...


Credit: Fanatic Studio/Gary Waters/Getty Images

Topics: Education, Existentialism, Philosophy, Physics

Physicists and philosophers recently met to debate a theory of consciousness called panpsychism.

More than 400 years ago, Galileo showed that many everyday phenomena—such as a ball rolling down an incline or a chandelier gently swinging from a church ceiling—obey precise mathematical laws. For this insight, he is often hailed as the founder of modern science. But, Galileo recognized that not everything was amenable to a quantitative approach. Such things as colors, tastes, and smells “are no more than mere names,” Galileo declared, for “they reside only in consciousness.” These qualities aren’t really out there in the world, he asserted, but exist only in the minds of creatures that perceive them. “Hence, if the living creature were removed,” he wrote, “all these qualities would be wiped away and annihilated.”

Since Galileo’s time, the physical sciences have leaped forward, explaining the workings of the tiniest quarks to the largest galaxy clusters. But explaining things that reside “only in consciousness”—the red of a sunset, say, or the bitter taste of a lemon—has proven far more difficult. Neuroscientists have identified a number of neural correlates of consciousness—brain states associated with specific mental states—but have not explained how matter forms minds in the first place. As philosopher Colin McGinn put it in a 1989 paper, “Somehow, we feel, the water of the physical brain is turned into the wine of consciousness.” Philosopher David Chalmers famously dubbed this quandary the “hard problem” of consciousness.

Scholars recently gathered to debate the problem at Marist College in Poughkeepsie, N.Y., during a two-day workshop focused on an idea known as panpsychism. The concept proposes that consciousness is a fundamental aspect of reality, like mass or electrical charge. The idea goes back to antiquity—Plato took it seriously—and has had some prominent supporters over the years, including psychologist William James and philosopher and mathematician Bertrand Russell. Lately, it is seeing renewed interest, especially following the 2019 publication of philosopher Philip Goff’s book Galileo’s Error, which argues forcefully for the idea.

Is Consciousness Part of the Fabric of the Universe? Dan Falk, Scientific American

Read more…

The Decline of Disruptive Science…


The proportion of disruptive scientific papers, such as the 1953 description of DNA’s double-helix structure, has fallen since the mid-1940s. Credit: Lawrence Lawry/SPL

Topics: DNA, Education, Philosophy, Research, Science, STEM

The number of science and technology research papers published has skyrocketed over the past few decades — but the ‘disruptiveness’ of those papers has dropped, according to an analysis of how radically papers depart from the previous literature.

Data from millions of manuscripts show that, compared with the mid-twentieth century, research done in the 2000s was much more likely to incrementally push science forward than to veer off in a new direction and render previous work obsolete. Analysis of patents from 1976 to 2010 showed the same trend.

“The data suggest something is changing,” says Russell Funk, a sociologist at the University of Minnesota in Minneapolis and a co-author of the analysis published on 4 January in Nature. “You don’t have quite the same intensity of breakthrough discoveries you once had.”

Telltale citations

The authors reasoned that if a study were highly disruptive, subsequent research would be less likely to cite its references and instead cite the study itself. Using citation data from 45 million manuscripts and 3.9 million patents, the researchers calculated a measure of disruptiveness called the ‘CD index,’ whose values range from –1 for the least disruptive work to 1 for the most disruptive.
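The counting behind that reasoning can be sketched simply. In a simplified reading of the CD index, a later paper that cites the focal work without citing its references scores +1 (the focal work has eclipsed its sources), a paper citing both scores -1 (consolidating), and a paper citing only the references scores 0; the index is the average. The counts in the example are hypothetical:

```python
def cd_index(cites_focal_only: int, cites_both: int, cites_refs_only: int) -> float:
    """Simplified CD index of a focal paper: +1 per later paper citing
    only the focal work, -1 per paper citing both it and its references,
    0 per paper citing only the references, averaged over all of them."""
    total = cites_focal_only + cites_both + cites_refs_only
    if total == 0:
        raise ValueError("no forward citations to score")
    return (cites_focal_only - cites_both) / total

print(cd_index(10, 0, 0))  # 1.0  (maximally disruptive)
print(cd_index(0, 10, 0))  # -1.0 (maximally consolidating)
```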

The average CD index declined by more than 90% between 1945 and 2010 for research manuscripts (see ‘Disruptive science dwindles’) and more than 78% from 1980 to 2010 for patents. Disruptiveness declined in all analyzed research fields and patent types, even when factoring in potential differences in factors such as citation practices.

‘Disruptive’ science has declined — and no one knows why, Max Kozlov, Nature.

Read more…

Martians and Vulcans...


(Credit: ktsdesign/Shutterstock)

Topics: Astrobiology, Astrophysics, Civilization, Existentialism, Philosophy, Special Relativity

The Cold War was a genesis of angst about the future, due to the Soviet Union's detonation of an atomic bomb in Kazakhstan in 1949. After WWII (WWI was originally called "the war to END all wars," until the sequel), the existential nervousness is understandable. Extraterrestrials, or musings about them, let humans off the hook: if the Earth is rendered dystopic and uninhabitable (the "War of the Worlds" Martians), perhaps some more advanced species will come to save us from our screw-ups (Star Trek Vulcans). Trek aliens that aren't that hospitable are the Gorn and the Klingons, neither of which I'd prefer to see on first contact. However, the vast distances between stars, relativistic speeds, and the drag of mass on reaching even a fraction of the speed of light make that possibility remote.

*****

In September 1961, Barney and Betty Hill were driving late at night in the mountains of New Hampshire when they saw a flying object whizzing in the sky. Barney thought it was a plane until he saw it swiftly switch directions.

According to The Interrupted Journey, the couple nervously continued driving until a spacecraft confronted them. They remembered seeing “humanoid-like” creatures and hearing pinging sounds reverberating off their car trunk. And then, they found themselves 35 miles further along on the highway with almost no memory of what had just transpired. They believed they had been abducted.

Scholars mark 1947 as the start of the UFO fascination. A pilot flying in the Cascade Mountains in Washington state reported seeing disc-shaped objects. In the next decade, aliens were primarily seen as benevolent, intelligent beings who came to Earth to offer advice or warnings.

In 1961, the Hills reported their abduction, and stories about aliens became more sinister. Social scientists, like famed psychologist Carl Jung, analyzed the UFO obsession and found it fit neatly with humans’ long fascination with heavenly ascents. Whereas past societies looked for angels, saints, or Gods to descend from the heavens, modern Americans were looking for “technological angels.”

Starting in the 1960s, aliens were both benign angels and menacing demons, which prompted some religious scholars to see UFO fixation as a modern religious movement.

Our Fascination With Aliens and When it All Started, Emilie Le Beau Lucchesi, Discover Magazine

Read more…

Moments and Metaphors...


Credit: Pete Saloutos/Getty Images

Topics: Astronomy, Astrophysics, Comets, Philosophy, Science Fiction

On a recent morning, in Lower Manhattan, 20 scientists, including me, gathered for a private screening of the new film Don’t Look Up, followed by lunch with the film’s director, Adam McKay.

The film’s plot is simple. An astronomy graduate student, Kate Dibiasky (Jennifer Lawrence), and her professor, Randall Mindy (Leonardo DiCaprio), discover a new comet and realize that it will strike the Earth in six months. It is about nine kilometers across, like the one that wiped out the dinosaurs 66 million years ago. The astronomers try to alert the president, played by Meryl Streep, to their impending doom.

“Let’s just sit tight and assess,” she says, and an outrageous, but believable comedy ensues, in which the astronomers wrangle an article in a major newspaper and are mocked on morning TV, with one giddy host asking about aliens and hoping that the comet will kill his ex-spouse.

At last, mainstream Hollywood is taking on the gargantuan task of combatting the rampant denial of scientific research and facts. Funny, yet deadly serious, Don’t Look Up is one of the most important recent contributions to popularizing science. It has the appeal, through an all-star cast and wicked comedy, to reach audiences that have different or fewer experiences with science.

Don’t Look Up isn’t a movie about climate change, but one about planetary defense from errant rocks in space. It handles that real and serious issue effectively and accurately. The true power of this film, though, is in its ferocious, unrelenting lampooning of science deniers.

After the screening, in that basement theater in SoHo, McKay said: “This film is for you, the scientists. We want you to know that some of us do hear you and do want to help fight science denialism.”

Hollywood Can Take On Science Denial: Don’t Look Up Is a Great Example, Rebecca Oppenheimer, curator, and professor of astrophysics at the American Museum of Natural History/Scientific American

Read more…

Life As We Don't Know It...


The depiction of tentacled extraterrestrials (above) in the recent science-fiction film "Arrival" indicates a divergence from the aliens reported in supposed eyewitness accounts. Paramount. Source: Wrinkles, tentacles and oval eyes: How depictions of aliens have evolved, CNN Style

Topics: Astrobiology, Philosophy, SETI, Space Exploration

In my freshman seminar at Harvard last semester, I mentioned that the nearest star to the sun, Proxima Centauri, emits mostly infrared radiation and has a planet, Proxima b, in the habitable zone around it. As a challenge to the students, I asked: “Suppose there are creatures crawling on the surface of Proxima b? What would their infrared-sensitive eyes look like?” The brightest student in class responded within seconds with an image of the mantis shrimp, which possesses infrared vision. The shrimp’s eyes look like two ping-pong balls connected with cords to its head. “It looks like an alien,” she whispered.

When trying to imagine something we’ve never seen, we often default to something we have seen. For that reason, in our search for extraterrestrial life, we are usually looking for life as we know it. But is there a path for expanding our imagination to life as we don’t know it?

In physics, an analogous path was already established a century ago and turned out to be successful in many contexts. It involves conducting laboratory experiments that reveal the underlying laws of physics, which in turn apply to the entire universe. For example, around the same time when the neutron was discovered in the laboratory of James Chadwick in 1932, Lev Landau suggested that there might be stars made of neutrons. Astronomers realized subsequently that there are, in fact, some 100 million neutron stars in our Milky Way galaxy alone—and a billion times more in the observable universe. Recently, the LIGO experiment detected gravitational wave signals from collisions between neutron stars at cosmological distances. It is now thought that such collisions produce the precious gold that is forged into wedding bands. The moral of this story is that physicists were able to imagine something new in the universe at large and search for it in the sky by following insights gained from laboratory experiments on Earth.

How to Search for Life as We Don't Know It, Avi Loeb, Scientific American

Read more…

More Alike Than Different...

 

Topics: Astrophysics, Atomic Physics, Cosmology, Philosophy

We are more alike than different. The atoms in our bodies are the same ones forged in distant stars; Carl Sagan said we are "made of star stuff."

Then: we evolve under ultraviolet light at different inclinations on the globe, thereby changing the prominence of melanin in our epidermis. Due to wars and conquests, we craft a narrative of what is godly, who is "divine," and who is deviant. Good and evil take on a hue, light and darkness. And thus we sow the seeds of our own self-destruction from ignorance, hubris, racism, snobbery, and xenophobia.

Star stuff should be better behaved.

Read more…

Current Time...

The Drake Equation from the SETI institute.

 

Topics: African Americans, Drake Equation, Existentialism, Extinction, Nanotechnology, Philosophy

Where:

N = The number of civilizations in the Milky Way Galaxy whose electromagnetic emissions are detectable.
R* = The rate of formation of stars suitable for the development of intelligent life.
fp = The fraction of those stars with planetary systems.
ne = The number of planets, per solar system, with an environment suitable for life.
fl = The fraction of suitable planets on which life actually appears.
fi = The fraction of life bearing planets on which intelligent life emerges.
fc = The fraction of civilizations that develop a technology that releases detectable signs of their existence into space.
L = The length of time such civilizations release detectable signals into space.
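The equation itself is just the product of those factors. A minimal sketch, with hypothetical placeholder inputs chosen for illustration rather than any endorsed estimates:

```python
def drake_n(r_star: float, f_p: float, n_e: float, f_l: float,
            f_i: float, f_c: float, l_years: float) -> float:
    """N = R* * fp * ne * fl * fi * fc * L, the expected number of
    civilizations in the Milky Way with detectable emissions."""
    return r_star * f_p * n_e * f_l * f_i * f_c * l_years

# Hypothetical inputs: one suitable star formed per year, half with
# planets, two habitable planets each, pessimistic fractions thereafter,
# and civilizations that broadcast for 10,000 years.
n = drake_n(r_star=1.0, f_p=0.5, n_e=2.0, f_l=1.0,
            f_i=0.01, f_c=0.01, l_years=10_000.0)
print(n)
```

Note how the product is dominated by the most uncertain terms; halving L halves N, which is why L carries so much philosophical weight below.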

*****

Note: This milestone will be one month old Sunday. We shaved 20 seconds.

Closer than ever:
It is 100 seconds to midnight
2020 Doomsday Clock Statement

Science and Security Board
Bulletin of the Atomic Scientists

Editor, John Mecklin


Editor’s note: Founded in 1945 by University of Chicago scientists who had helped develop the first atomic weapons in the Manhattan Project, the Bulletin of the Atomic Scientists created the Doomsday Clock two years later, using the imagery of apocalypse (midnight) and the contemporary idiom of nuclear explosion (countdown to zero) to convey threats to humanity and the planet. The decision to move (or to leave in place) the minute hand of the Doomsday Clock is made every year by the Bulletin’s Science and Security Board in consultation with its Board of Sponsors, which includes 13 Nobel laureates. The Clock has become a universally recognized indicator of the world’s vulnerability to catastrophe from nuclear weapons, climate change, and disruptive technologies in other domains.

 

To: Leaders and citizens of the world
Re: Closer than ever: It is 100 seconds to midnight
Date: January 23, 2020


Humanity continues to face two simultaneous existential dangers—nuclear war and climate change—that are compounded by a threat multiplier, cyber-enabled information warfare, that undercuts society’s ability to respond. The international security situation is dire, not just because these threats exist, but because world leaders have allowed the international political infrastructure for managing them to erode.

In the nuclear realm, national leaders have ended or undermined several major arms control treaties and negotiations during the last year, creating an environment conducive to a renewed nuclear arms race, to the proliferation of nuclear weapons, and to lowered barriers to nuclear war. Political conflicts regarding nuclear programs in Iran and North Korea remain unresolved and are, if anything, worsening. US-Russia cooperation on arms control and disarmament is all but nonexistent.

Public awareness of the climate crisis grew over the course of 2019, largely because of mass protests by young people around the world. Just the same, governmental action on climate change still falls far short of meeting the challenge at hand. At UN climate meetings last year, national delegates made fine speeches but put forward few concrete plans to further limit the carbon dioxide emissions that are disrupting Earth’s climate. This limited political response came during a year when the effects of man-made climate change were manifested by one of the warmest years on record, extensive wildfires, and quicker-than-expected melting of glacial ice.

Continued corruption of the information ecosphere on which democracy and public decision making depend has heightened the nuclear and climate threats. In the last year, many governments used cyber-enabled disinformation campaigns to sow distrust in institutions and among nations, undermining domestic and international efforts to foster peace and protect the planet.

This situation—two major threats to human civilization, amplified by sophisticated, technology-propelled propaganda—would be serious enough if leaders around the world were focused on managing the danger and reducing the risk of catastrophe. Instead, over the last two years, we have seen influential leaders denigrate and discard the most effective methods for addressing complex threats—international agreements with strong verification regimes—in favor of their own narrow interests and domestic political gain. By undermining cooperative, science- and law-based approaches to managing the most urgent threats to humanity, these leaders have helped to create a situation that will, if unaddressed, lead to catastrophe, sooner rather than later.

 

*****


The full PDF version of the above is here. Facebook has finally released limited data for social scientists to research the effect of its platform on democracy, just as our Senate blocks bills meant to protect the voting franchise. State legislatures in Florida and Georgia make it difficult for ex-felons or people of color to vote - who needs Russians when shortsighted Republicans will do? The confluence of avarice and racist hegemony may well spell the epitaph of our republic, our species, and life on this planet. The 2020 elections may slow the Doomsday Clock, or speed us seconds closer.

In the Drake Equation, which even Dr. Frank Drake hedges his bets on, L, the length of time such civilizations release detectable signals into space, along with fi, the fraction of planets on which intelligent life emerges (I'm dubious about ours), are the most important variables, from a philosophical point of view.

It means to me: no more Ginai Seabron graduates; no nanoscience, nanoengineering, or nanotechnology. No fretting about how to make the discipline inclusive, as the surviving cavemen and women will have other, more pressing concerns. There can be no advancement after such an aggressive act of mutually assured destruction (M.A.D.). There are no "winners" or losers on such a destructive path, only unburied corpses.

It means to me: if we survive our own avarice and hubris, my granddaughter can have a future not decided by "the color of her skin, but by the content of her character," and she could literally reach for the stars. Or we could all be baited to Armageddon by a tweet. You can apparently get reduced sentences for your friends, despite DOJ guidelines. A Banana Republic in 140 characters. "Stop and frisk" and non-disclosure agreements for sexual harassment from the so-called benign (actual) billionaire candidate don't give me much hope. For my granddaughter's future, I'd like to have some.

We would then, theoretically and literally, all be equalized to ashes. The universe would be indifferent to which pile of ash was a billionaire or a pauper; so-called white, black, or other; or a grandfather making his granddaughter laugh with a silly song about "little feet." Our self-induced inequality problems would be solved - for eternity.

The search for extraterrestrial intelligence would be over on our end, as earthbound intelligence, post-Apocalypse, would then have been found...bereft.
Read more…

Adaptation and Extinction...

Source: Internet Movie Database

 

Topics: Biology, Climate Change, Existentialism, Philosophy, Politics


Though the movie poster is an attempt at dark humor, I do agree with the science. We're at a time in our history when science is being subordinated to political and economic considerations, just when we literally need it for survival.


From a biological perspective, there is no such thing as devolution. All changes in the gene frequencies of populations--and quite often in the traits those genes influence--are by definition evolutionary changes. The notion that humans might regress or "devolve" presumes that there is a preferred hierarchy of structure and function--say, that legs with feet are better than legs with hooves or that breathing with lungs is better than breathing with gills. But for the organisms possessing those structures, each is a useful adaptation.

Chief among these misconceptions is that species evolve or change because they need to change to adapt to shifting environmental demands; biologists refer to this fallacy as teleology. In fact, more than 99 percent of all species that ever lived are extinct, so clearly there is no requirement that species always adapt successfully. As the fossil record demonstrates, extinction is a perfectly natural--and indeed quite common--response to changing environmental conditions. When species do evolve, it is not out of need but rather because their populations contain organisms with variants of traits that offer a reproductive advantage in a changing environment.

 

Is the human race evolving or devolving? July 20, 1998, Scientific American

Read more…