Archive for May, 2011

Post by Minna Krejci

A few weeks ago, Discover’s Science Not Fiction blog explored the hidden message in Pixar’s films:

“The message hidden inside Pixar’s magnificent films is this: humanity does not have a monopoly on personhood. In whatever form non- or super-human intelligence takes, it will need brave souls on both sides to defend what is right. If we can live up to this burden, humanity and the world we live in will be better for it.” -Kyle Munkittrick on Science Not Fiction

One of the Pixar examples given as evidence was The Incredibles, which shows how enhancement beyond the human norm can provoke reactions of revulsion and alienation.  The lesson, according to Munkittrick: “…human enhancement does not make you inhuman – the choices you make and the way you treat others determines how human you really are.”

We’ve always been interested in ways to improve our minds, bodies, or abilities.  But what happens as new technologies increasingly allow us to push our abilities beyond what is “normal” for our species?  Do we limit human enhancement for fear of “enhanced” individuals acquiring an unfair advantage (in work, school, politics, athletics, etc.)?  Do we avoid regulation to retain our personal freedoms and rights to improve our own minds, bodies, and lives?

In a report funded by the National Science Foundation, the Human Enhancement Ethics Group discussed these kinds of issues in the form of 25 questions and answers regarding the ethics of human enhancement.  I recommend taking a look — it’s an interesting and relevant read, considering that we are already seeing these kinds of debates with respect to cognitive-enhancing and performance-enhancing drugs.  Is it ok for students diagnosed with ADHD to take stimulants to correct the “attention deficit,” but not ok for otherwise-normal students to take stimulants to help them focus better when studying for exams?  Where do you draw the line between what supplements/drugs athletes can and can’t take to improve their performance?

It sounds like we’ve got a lot of “why does he get one and I don’t” and “why can’t I use it just because she doesn’t have one” to look forward to…

Read Full Post »

Post by Henderson

Engineering mixes many disciplines, from mathematics to art to economics, to respond to the needs of growing societies.  As populations grow, as infrastructure needs change, and as new ideas bring forth new challenges, engineers work in large or small teams to find solutions to these problems.

The idea of human engineering starts with the idea that the body is a machine: a machine that can be understood, repaired, and, if need be, have its parts replaced.  This is not a new idea; it has existed in one form or another since René Descartes wrote about mind-body duality and described the body as a kind of automaton.  But what Descartes’ contemporaries did not have were the tools needed to understand how the body works.

The human body is an efficient collection of complex systems.  As an example, something as simple as taking a walk requires the coordination of, at least, the skeletal, muscular, nervous, and cardiovascular systems.

When something goes awry in those systems, when the body is not responding the way it should given normal conditions, changes can be made to improve the way it works.  Sometimes this improvement can be made by physical training or changes to the diet of the individual.  But failing these, more invasive methods are employed to correct the problem.

Today’s engineers are faced with more than the idea that the body is a machine.  They are faced with a growing body of knowledge that gives them the tools to transplant hearts, implant electrodes into the brain, and even manipulate the genome to create favorable outcomes.

One of the biggest stories to hit the news in the last few years is surprisingly small.

The J. Craig Venter Institute announced last year the creation of a synthetic, self-replicating bacterial cell.  The synthetic cell, called Mycoplasma mycoides JCVI-syn1.0, is a proof of principle that a genome can be designed on a computer, chemically synthesized in the laboratory, and transplanted into a recipient cell to produce a new self-replicating cell controlled only by the synthetic genome.

In this example, Venter’s lab has shown that we can mechanize the process of problem-solving on a cellular level.  Their work adds to the public sphere a body of knowledge that will deepen our understanding of basic chemical and biological concepts and be integral to the production of new vaccines and medicines, among other things.

If there is a promise that could come from advances such as this, it is that we can treat the body and its most basic properties in a machine-like way, ultimately making it possible to pursue basic research that enhances the human experience.

Read Full Post »

Posted by Henderson

Here’s a riddle:

It’s present everywhere, but occupies no space.

We can measure it, but we can’t see it, touch it, get rid of it, or put it in a container.

Everyone knows what it is and uses it every day, but no one has been able to define it.

We can spend it, save it, waste it, or kill it, but we can’t destroy it or even change it, and there’s never any more or less of it.

What is it?  Time.

Today, most of us depend less on the cycles of the moon than we do on clocks.  Digital displays help us keep track of everything from how long a turkey has been in the oven to what time to be at work.  But with all of the advances in the study of time over the last century, we still don’t know much more about it than did our early ancestors.

But we are getting a little closer to that understanding.

Research published by the National Institute of Standards and Technology demonstrates the link between time and gravity.  In a test of Einstein’s theory of relativity, researchers placed two super-accurate optical clocks one above the other and showed that the clock on the bottom ticked slower than the one above it, by an amount that adds up to mere billionths of a second over a human lifetime.

The clocks are based on the oscillations of a single aluminum ion that vibrates between two energy levels a million billion times per second. One clock is accurate to within one second in about 3.7 billion years, and the other is almost as accurate, NIST says.

The clock on the bottom ticked slower, basically, because it is closer to the Earth’s gravitational field and thus “felt” more gravity.
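
For a rough sense of the size of that effect, here is a back-of-the-envelope sketch in Python using the standard weak-field approximation, where the fractional rate difference between two clocks separated by a height h is roughly g*h/c^2.  The height difference and the 79-year lifespan below are illustrative assumptions, not figures taken from the NIST paper.

# Back-of-the-envelope gravitational time dilation between two clocks at
# different heights near Earth's surface (weak-field approximation).
# The height and lifetime values are illustrative assumptions.
g = 9.81           # m/s^2, Earth's surface gravity
c = 299_792_458.0  # m/s, speed of light
h = 0.33           # m, assumed height difference between the clocks (about a foot)

fractional_shift = g * h / c**2
print(f"fractional rate difference: {fractional_shift:.1e}")  # ~3.6e-17

lifetime_seconds = 79 * 365.25 * 24 * 3600
print(f"accumulated over 79 years: {fractional_shift * lifetime_seconds:.1e} s")  # ~9e-8 s

A few parts in ten million billion is a tiny difference, which is exactly why clocks accurate to one second in billions of years were needed to detect it.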

But what are some applications of this knowledge?  Any ideas?

Read Full Post »

Post by Ruthanna Gordon

Perhaps the most notable thing about time is that we notice it.  This may seem unimpressive at first, but think about it.  We don’t notice most of what happens in the universe.  Infrared, ultraviolet, and microwave radiation pass our eyes without a blip.  Magnetic fields pulse and fade.  But time is something that we perceive—without even having a sense set aside for it.

The most studied area in the psychology of time is our perception of the past.  Human memory is complex and surprisingly fickle.  If you’ve ever gotten into a He Said/She Said argument, you know that people recall events in ways that make them look good, even at the cost of accuracy.  And when we’re very stressed—for example, just after witnessing a crime—we unthinkingly fill in memory gaps with whatever other people claim to have seen.

Even with their flaws, our memories do a lot for us.  Memory helps you keep track of who and what you are.  It helps you keep track of other people’s personalities, so that you learn who and when to trust.  It lets you improve your handling of a situation the second, third, and three thousandth time you encounter it.

The psychology of the future seems like it should be a fuzzier thing.  When we plan and daydream, we imagine many different possibilities.  We consider different ways of asking for a raise tomorrow, or fantasize about remote chances years away.  But it turns out that memory and planning are, in many ways, really the same thing.

Over the last several years, psychologists have begun to examine future thinking in detail.  What they’ve learned is that it draws heavily on memory.  Want to know what the future will be like?  In many cases, it will be like the past.  When it’s not, past patterns may still be useful in guessing at the differences.  Planning your next party lights up the same brain areas that spark when you reminisce about your last one.  These abilities, more and more frequently, are grouped together as “mental time travel.”

In the mind, the past and the future intertwine more closely than anyone guessed.  People with amnesia often have trouble creating a detailed image of the future.  People who give vague answers (“I always do badly at math”) when asked about specific events (“Tell me about the last time you took a test”) are prone to equally vague fears about the future, and to depression.  And some skills—a vivid imagination, social modeling, and mental mapping—extend our reach beyond the present and allow it to flourish in both directions.

Read Full Post »

Post by Minna Krejci

Quiz time: What do Donnie Darko, Star Trek, A Wrinkle in Time, Contact, and Stargate have in common?

I’ll give you a hint:

"Wormhole" painting by the artist Andrew Leipzig (http://blog.onlineclock.net/wormholes-as-time-machines/).

…WORMHOLES!

The concept of the wormhole is popular in science fiction — wormholes make it possible to travel across the universe within the span of a human lifetime.  (To put things into perspective: if we traveled at the fastest speed recorded by a manmade object — set by the Helios 2 spacecraft at 150,000 miles/hr — it would take us 19,000 years to get to the closest star to our solar system, Proxima Centauri, 4.22 light years from Earth.)
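
If you want to check that arithmetic, here is a quick sketch in Python using the round figures quoted above (150,000 miles per hour and 4.22 light years); the exact record speed varies a bit by source.

# Quick check of the travel-time estimate to Proxima Centauri.
speed_of_light_mph = 670_616_629   # speed of light in miles per hour
probe_speed_mph = 150_000          # Helios 2's record speed, as quoted above
distance_ly = 4.22                 # distance to Proxima Centauri in light years

# A trip of d light years at a fraction v/c of light speed takes d / (v/c) years.
travel_time_years = distance_ly / (probe_speed_mph / speed_of_light_mph)
print(round(travel_time_years))    # ~18,867, i.e. roughly 19,000 years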

Wormholes provide a shortcut through space and time.  While wormholes have never actually been observed outside of science fiction, the theory of general relativity allows for the existence of these kinds of structures.  We can visualize “spacetime” as a 2D sheet bent back on itself, with a wormhole “bridge” connecting two distant regions.

Recent calculations suggest that advanced civilizations might be able to make wormholes work by using something called “exotic matter,” which has a negative energy, to prevent a wormhole from collapsing on itself.  If such “traversable” wormholes exist, then they hypothetically could allow for time travel.  (Just hypothetically, at least for now.)

Read Full Post »

Famed professor Brian Cox explains time…

Read Full Post »

Post by Ruthanna Gordon

On-line and interactive is an interesting way to learn about science. But it can also be a useful—if uncomfortable—way to do science.

When most people think about science, they think about lab work: setting up equipment, running experiments, and gathering data. But science is also what happens afterward. Once data is collected and written up, the research usually undergoes peer review. This is not a complicated process. Journal editors send an article out to other researchers (usually 3), and ask them to comment on its merits. Reviewers can ask for further explanation, or additional studies to rule out competing explanations for the findings. Sometimes they decide that the study wasn’t sufficiently well-done to share with the wider public at all; more often they demand changes that make for stronger, if somewhat delayed, published work.

The Peer Review Process

Many researchers have questioned the peer review process. Reviewers may be biased, positively or negatively. They may miss problems because they are caught up in the excitement of an interesting finding, because they are distracted by their own studies, or because they are fitting the review into 37 free minutes at 2 AM. As Winston Churchill said of democratic government, it’s the worst possible system, except for all the others we’ve tried. But the collaborative hothouse of the internet opens up new possibilities.

These possibilities were highlighted late last year, when NASA-funded scientist Felisa Wolfe-Simon announced her discovery of arsenic-based life in a California lake. This work had undergone peer review and been published in one of the world’s most prestigious journals, and was then brought to public attention amid intense hype. Wolfe-Simon and her colleagues were somewhat startled to find their work subjected to an informal, and often snide, supplement to the original review—but with dozens of well-informed reviewers rather than a handful.

Arsenic-Rich Mono Lake, Source of the Controversial Bacteria

There is an excellent overview here, but in brief: several scientists criticized Wolfe-Simon’s methods and measurements, questioning her conclusions. She, and NASA, responded by suggesting that the peer review process was not only important, but the only legitimate venue for scientific critique. They dismissed the input from blogs and Twitter—on the basis that it occurred on blogs and Twitter. And they used the journal’s prestige as a defense against the very real questions of fact raised by their critics.

There’s always some tension between science and scientists. Science works by constantly seeking evidence against claims, accepting only those which are supported by the observed state of the world. Technically speaking, every experiment should be a whole-hearted attempt to prove that one is wrong—because one can only be sure of being right when that disproof fails. Scientists, however, depend on rightness for their livelihood and reputation. If you successfully disprove all your hypotheses for several years, universities and grant-givers consider you a failure. Furthermore, scientists are human, and we like to be right.

The peer review process is intended as a counter to these human tendencies. It is not the only one possible: any expansion of informed debate and criticism is good for science. That the criticism is informed—that it comes from people who understand and are involved with the field in question—is important. That it come from a traditional venue is not. Online forums provide a rich environment for discussion, facilitating a more collaborative and extended critique than was previously possible.

Some scientists take deliberate advantage of 2.0 review. A few journals print any paper that appears to have valid methods, with review as an ongoing and public process. Other sites are devoted to “post-publication peer review.” Although these methods have their weak points, they have the potential to fill some of the gaps in the more traditional system. And as these innovations become more familiar, one hopes that more researchers will welcome them—and that their research will become stronger as a result.

Read Full Post »
