Friday, 31 August 2012

First Direct Observations of Quantum Effects in an Optomechanical System

A long-time staple of science fiction is the tractor beam, a technology in which light is used to move massive objects -- recall the tractor beam in the movie Star Wars that captured the Millennium Falcon and pulled it into the Death Star. While tractor beams of this sort remain science fiction, beams of light today are being used to mechanically manipulate atoms or tiny glass beads, with rapid progress being made to control increasingly larger objects. Those who see major roles for optomechanical systems in a host of future technologies will take heart in the latest results from a first-of-its-kind experiment.

Scientists with the U.S. Department of Energy's Lawrence Berkeley National Laboratory (Berkeley Lab) and the University of California (UC) Berkeley, using a unique optical trapping system that provides ensembles of ultracold atoms, have recorded the first direct observations of distinctly quantum optical effects -- amplification and squeezing -- in an optomechanical system. Their findings point the way toward low-power quantum optical devices and enhanced detection of gravitational waves, among other possibilities. "We've shown for the first time that the quantum fluctuations in a light field are responsible for driving the motions of objects much larger than an electron and could in principle drive the motion of really large objects," says Daniel Brooks, a scientist with Berkeley Lab's Materials Sciences Division and UC Berkeley's Physics Department.

Brooks, a member of Dan Stamper-Kurn's research group, is the corresponding author of a paper in the journal Nature describing this research. The paper is titled "Nonclassical light generated by quantum-noise-driven cavity optomechanics." Co-authors were Thierry Botter, Sydney Schreppler, Thomas Purdy, Nathan Brahms and Stamper-Kurn.

Light will build up inside an optical cavity at specific resonant frequencies, similar to how a held-down guitar string only vibrates to produce specific tones. Positioning a mechanical resonator inside the cavity changes the resonance frequency for light passing through, much as sliding one's fingers up and down a guitar string changes its vibrational tones. Meanwhile, as light passes through the optical cavity, it acts like a tiny tractor beam, pushing and pulling on the mechanical resonator.
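The guitar-string analogy can be made concrete with a toy calculation: an idealized empty Fabry-Perot cavity of length L resonates at frequencies f_n = n·c/(2L), and a small change δL in its effective optical length (for example, because something inside the cavity moved) shifts each resonance by roughly -f_n·δL/L. A minimal sketch; all numbers here are illustrative placeholders, not the Berkeley experiment's actual parameters:

```python
# Toy model: longitudinal resonances of an idealized Fabry-Perot cavity
# and the frequency shift caused by a tiny change in effective length.
# Illustrative numbers only -- not the experiment's actual parameters.

C = 299_792_458.0  # speed of light, m/s

def resonance_hz(n, length_m):
    """n-th longitudinal resonance of an empty cavity of the given length."""
    return n * C / (2.0 * length_m)

def shift_hz(n, length_m, delta_length_m):
    """Resonance shift when the effective optical length changes slightly,
    e.g. because a mechanical element inside the cavity has moved."""
    return resonance_hz(n, length_m + delta_length_m) - resonance_hz(n, length_m)

L0 = 0.25e-3                 # a 250-micrometer cavity (illustrative)
n = round(2 * L0 / 850e-9)   # mode index nearest an 850 nm resonance
f0 = resonance_hz(n, L0)
df = shift_hz(n, L0, 1e-12)  # a picometer-scale length change

print(f"mode n = {n}, f0 = {f0/1e12:.1f} THz, shift = {df/1e6:.3f} MHz")
```

Even a picometer-scale displacement produces a megahertz-scale resonance shift, which is why cavity readout is so sensitive to mechanical motion.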

If an optical cavity is of ultrahigh quality and the mechanical resonator element within is atomic-sized and chilled to nearly absolute zero, the resulting cavity optomechanical system can be used to detect even the slightest mechanical motion. Likewise, even the tiniest fluctuations in the light/vacuum can cause the atoms to wiggle, and changes to the light can provide control over that atomic motion. This not only opens the door to fundamental studies of quantum mechanics that could tell us more about the "classical" world we humans inhabit, but also to quantum information processing, ultrasensitive force sensors, and other technologies that might seem like science fiction today. "There have been proposals to use optomechanical devices as transducers, for example coupling motion to both microwaves and optical frequency light, where one could convert photons from one frequency range to the other," Brooks says. "There have also been proposals for slowing or storing light in the mechanical degrees of freedom, the equivalent of electromagnetically induced transparency or EIT, where a photon is stored within the internal degrees of freedom."

Already cavity optomechanics has led to applications such as the cooling of objects to their motional ground state, and detections of force and motion on the attometer scale. However, in studying interactions between light and mechanical motion, it has been a major challenge to distinguish those effects that are distinctly quantum from those that are classical -- a distinction critical to the future exploitation of optomechanics. Brooks, Stamper-Kurn and their colleagues were able to meet the challenge with their microfabricated atom-chip system, which provides a magnetic trap for capturing a gas made up of thousands of ultracold atoms. This ensemble of ultracold atoms is then transferred into an optical cavity (Fabry-Perot), where it is trapped in a one-dimensional optical lattice formed by near-infrared (850 nanometer wavelength) light that resonates with the cavity. A second beam of light is used for the pump/probe.

"Integrating trapped ensembles of ultracold atoms and high-finesse cavities with an atom chip allowed us to study and control the classical and quantum interactions between photons and the internal/external degrees of freedom of the atom ensemble," Brooks says. "In contrast to typical solid-state mechanical systems, our optically levitated ensemble of ultracold atoms is isolated from its environment, causing its motion to be driven predominantly by quantum radiation-pressure fluctuations."

The Berkeley research team first applied classical light modulation to a low-powered pump/probe beam (36 picowatts) entering their optical cavity to demonstrate that their system behaves as a high-gain parametric optomechanical amplifier. They then extinguished the classical drive and mapped the response to the fluctuations of the vacuum. This enabled them to observe light being squeezed by its interaction with the vibrating ensemble and the atomic motion driven by the light's quantum fluctuations. Amplification and this squeezing interaction, known as "ponderomotive squeezing," have been long-sought goals of optomechanics research.

"Parametric amplification typically requires a lot of power in the optical pump but the small mass of our ensemble required very few photons to turn the interactions on/off," Brooks says. "The ponderomotive squeezing we saw, while narrow in frequency, was a natural consequence of having radiation-pressure shot noise dominate in our system."
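"Squeezing" here has a precise quantitative meaning: noise in one quadrature of the light field is pushed below the vacuum (shot-noise) level at the cost of extra noise in the conjugate quadrature, with their product bounded by the uncertainty principle. A minimal numerical illustration using the textbook single-mode squeezed-state formulas (not the paper's measured data):

```python
import math

# Quadrature variances of an ideal squeezed vacuum state with squeezing
# parameter r, in units where the vacuum (shot-noise) variance is 1/4.
# Textbook single-mode formulas -- illustrative, not the measured data.

VACUUM_VAR = 0.25

def squeezed_variances(r):
    """Return (squeezed, anti-squeezed) quadrature variances."""
    return VACUUM_VAR * math.exp(-2 * r), VACUUM_VAR * math.exp(2 * r)

for r in (0.0, 0.1, 0.5):
    v_minus, v_plus = squeezed_variances(r)
    db = 10 * math.log10(v_minus / VACUUM_VAR)  # noise relative to shot noise
    print(f"r={r:.1f}: squeezed={v_minus:.4f}, anti={v_plus:.4f}, {db:+.2f} dB")
```

For an ideal state the product of the two variances stays pinned at the minimum-uncertainty value (1/16 in these units) for any r; real ponderomotive squeezing, including the narrowband effect reported here, is far more modest, as Brooks notes later in describing the small observed signal.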

Since squeezing light improves the sensitivity of gravitational wave detectors, the ponderomotive squeezing effects observed by Brooks, Stamper-Kurn and their colleagues could play a role in future detectors. The idea behind gravitational wave detection is that a ripple in the local curvature of spacetime caused by a passing gravitational wave will modify the resonant frequency of an optical cavity which, in turn, will alter the cavity's optical signal.

"Currently, squeezing light over a wide range of frequencies is desirable as scientists search for the first detection of a gravitational wave," Brooks explains. "Ponderomotive squeezing should be valuable later, when specific signals are to be studied in detail, by improving the signal-to-noise ratio in the specific frequency range of interest."

The results of this study differ significantly from standard linear model predictions. This suggests that a nonlinear optomechanical theory is required to account for the Berkeley team's observations that optomechanical interactions generate non-classical light. Stamper-Kurn's research group is now considering further experiments involving two ensembles of ultracold atoms inside the optical cavity.

"The squeezing signal we observe is quite small when we detect the suppression of quantum fluctuations outside the cavity, yet the suppression of these fluctuations should be very large inside the cavity," Brooks says. "With a two ensemble configuration, one ensemble would be responsible for the optomechanical interaction to squeeze the radiation-pressure fluctuations and the second ensemble would be studied to measure the squeezing inside the cavity."

This research was funded by the Air Force Office of Scientific Research and the National Science Foundation.

Wednesday, 29 August 2012

Record-breaking galaxy cluster found

A massive so-called galaxy cluster, one of the largest structures in the universe, has been discovered about 5.7 billion light years from Earth, US researchers have announced.

The Harvard-Smithsonian Center for Astrophysics says the observations of the cluster, which has shown a prodigious rate of star formation, may force astronomers to rethink how such colossal structures and the galaxies that inhabit them evolve over time.

Known officially as SPT-CLJ2344-4243, the cluster has been nicknamed 'Phoenix', after the mythological bird that rose from the dead.

That's partly due to the constellation in which it lies. But Michael McDonald, a Hubble fellow at the Massachusetts Institute of Technology, says the Phoenix was also a great way of thinking about the latest astronomical marvel.

"While galaxies at the centre of most clusters may have been dormant for billions of years, the central galaxy in this cluster seems to have come back to life with a new burst of star formation," says McDonald, the lead author of the paper appearing in the journal Nature.

Based on observations from NASA's Chandra X-ray Observatory, the US National Science Foundation's South Pole Telescope and eight other observatories, researchers say the centre of the Phoenix cluster has been linked to the creation of about 740 solar masses' worth of stars a year.

By comparison, the Perseus cluster forms stars at a rate about 20 times slower than Phoenix.

"This is just an enormous rate," says Marie Machacek, an astrophysicist at the Smithsonian Astrophysical Observatory. She says huge clusters like Phoenix are thought to host thousands of galaxies and there was still a lot to learn about what goes on within them.

Supermassive black holes in the central galaxy of a cluster have long been associated with low observed star formation rates, as they pump energy into the system and prevent the cooling of gases needed for the creation of stars.

But researchers say the "massive starburst" seen in Phoenix, as it gave birth to about two stars per day, suggests that its central galaxy's black hole had failed to interfere with an extremely strong cooling flow.
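The quoted figures are mutually consistent: 740 solar masses of new stars per year works out to roughly two Sun-like stars per day, and a 20-fold slower Perseus corresponds to a few dozen solar masses per year. A quick sanity check (assuming, as the "two stars per day" phrasing implies, an average new star of about one solar mass):

```python
# Sanity-checking the quoted star-formation figures for the Phoenix cluster.
# Assumes an average new star of roughly one solar mass, as the article's
# "two stars per day" phrasing implies.

PHOENIX_RATE = 740.0    # solar masses of new stars per year
PERSEUS_FACTOR = 20.0   # Perseus forms stars about 20 times more slowly

stars_per_day = PHOENIX_RATE / 365.25
perseus_rate = PHOENIX_RATE / PERSEUS_FACTOR

print(f"Phoenix: ~{stars_per_day:.1f} Sun-like stars per day")
print(f"Perseus: ~{perseus_rate:.0f} solar masses of new stars per year")
```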

"Stars are forming in the Phoenix cluster at the highest rate ever observed for the middle of a galaxy cluster," according to a Harvard-Smithsonian Center for Astrophysics press release.

"The object also is the most powerful producer of x-rays of any known cluster and among the most massive. The data also suggest the rate of hot gas cooling in the central regions of the cluster is the largest ever observed."

Monday, 27 August 2012

7 nations face sanctions over endangered species

Seven nations may lose their ability to legally trade tens of thousands of wildlife species after U.N. conservation delegates agreed Thursday to penalize them for lacking tough regulations or failing to report on their wildlife trade.

The suspensions against the seven nations — Comoros, Guinea-Bissau, Paraguay, Nepal, Rwanda, Solomon Islands and Syria — were approved by consensus among the delegates and would take effect Oct. 1. They would prevent the countries from legally trading in any of the 35,000 species regulated by the 175-nation Convention on International Trade in Endangered Species, said Juan Carlos Vasquez, a spokesman for the U.N. office that administers the treaty. Delegations to the weeklong meeting of CITES, a treaty overseen by the U.N. Environment Program in Geneva, agreed to trade suspensions against Comoros, Guinea-Bissau, Paraguay and Rwanda based on their lack of national laws for regulating the lucrative wildlife trade.

The Geneva meeting's attendees also agreed to trade suspensions against Guinea-Bissau, Nepal, Rwanda, Solomon Islands and Syria based on their failure to adequately report what they are doing to regulate wildlife trade, as they are required to do under the CITES treaty. To avoid the sanctions, and the prospect of losing millions of dollars in commerce, the seven must now draw up the required legislation or submit their missing annual reports to CITES by Oct. 1. According to CITES, about 97 percent of the species it regulates are commercially traded for food, fuel, forest products, building materials, clothing, ornaments, health care, religious items, collections, trophy hunting and other sport. The other 3 percent are generally prohibited from being traded.

CITES estimates the regulated global wildlife trade is between $350 million and $530 million a year, or almost $2.2 billion over the five years from 2006 to 2010. During that time, logging of big leaf mahogany alone accounted for $168 million in trade. TRAFFIC, a wildlife trade monitoring network, estimates that commercial trade in wildlife has risen sharply from around $160 billion a year in the early 1990s. But the multibillion-dollar illegal trade in wildlife is a growing problem, and environmentalists say a big reason is nations' failure to enact stiff penalties for traffickers or enforce wildlife laws already on the books. The delegates are expected to consider on Friday a more controversial topic: a call to resume the legal ivory trade as a way to stop the recent rise in elephant poaching in Africa.

That proposal, put forward in a CITES-commissioned report, would set up a centralized system to allow for the sale of ivory from elephants that either died naturally or as a result of trophy hunting, or were considered a threat or culled for ecological reasons.

It is the first time such a proposal has been made since a global ban on ivory went into effect in 1989. That ban mostly halted widespread poaching, but in the past decade the problem has worsened owing mainly to an Asian appetite for ivory chopsticks, statues and jewelry.

The rise in rhino poaching also is on the agenda.

Experts rank wildlife smuggling among the top aims of criminal networks, along with drugs and human trafficking. CITES says wildlife crime remains poorly studied, but it says international estimates of the scale of illegal wildlife trade range from between $16 billion and $27 billion a year.

Tiger parts, elephant ivory, rhino horn and exotic birds and reptiles are among the most trafficked items. To fight it, CITES has formed a consortium with Interpol, the U.N. office on drugs and crime, the World Bank and the World Customs Organization.

Saturday, 25 August 2012

Kepler spots 'perfectly aligned' alien worlds

Astronomers have confirmed that our solar system isn't unique, after the discovery of a planetary system that is as flat and orderly as our own.

When NASA's Kepler space telescope started finding planets at odd angles to their parent stars, scientists wondered if our solar system's tidy geometry, with the planets neatly orbiting around the Sun's equator, was an exception to the rule. That idea can be laid to rest thanks to an innovative use of the Kepler data which aligned three planets circling the Sun-like star Kepler-30 with a giant spot on the star's surface. The study showed the trio of planets orbiting within one degree, relative to each other and relative to the star's equator. That finding is an indication that Kepler-30, like our own solar system, formed from a rotating disk of gas. "The planets themselves are not all that remarkable - two giant Jupiters and one super-Earth - but what is remarkable is that they aligned so perfectly," says astronomer Drake Deming of the University of Maryland.

"The dynamics of the system are important for the possible development of life," he says.

The alignment of the Kepler-30 brood is the most precise found yet.

The Kepler telescope is studying about 150,000 Sun-like stars for signs of Earth-like planets. Multi-planet systems would have to be somewhat aligned to fall into the telescope's narrow and deep field of view.
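The alignment requirement can be quantified: a planet at orbital distance a around a star of radius R transits only if its orbit is within roughly arcsin(R/a) of edge-on as seen from Earth, so every planet in a multi-transiting system must share nearly the same orbital plane. A rough illustration with Sun-like numbers (illustrative values, not Kepler-30's measured parameters):

```python
import math

# How tightly aligned must planets be for all of them to transit?
# A planet transits only if its orbit is within ~arcsin(R_star / a)
# of exactly edge-on. Illustrative Sun-like numbers, not Kepler-30 data.

R_SUN_AU = 0.00465  # solar radius expressed in astronomical units

def max_tilt_deg(a_au, r_star_au=R_SUN_AU):
    """Maximum deviation from edge-on (degrees) that still allows a transit."""
    return math.degrees(math.asin(r_star_au / a_au))

for a in (0.1, 0.3, 1.0):  # orbital distances in AU
    print(f"a = {a:4.1f} AU -> transit requires tilt within {max_tilt_deg(a):.2f} deg")
```

At Earth-like distances the tolerance is only about a quarter of a degree, which is why a multi-transiting system is already strong evidence of a flat, disk-like architecture.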

Kepler's targets are all hundreds to thousands of light years away.

Future missions to probe stars closer to Earth most likely would need a wider-angle view, so using starspots as reference points could be a valuable tool for homing in on systems geometrically similar to ours.

"By chance, and because we have good data on our hands, we came up with the idea to measure obliquity (slant) with spots," says lead researcher Roberto Sanchis-Ojeda of the Massachusetts Institute of Technology.

Thursday, 23 August 2012

Math Shows How Shockwaves Could Crinkle Space

Mathematicians at UC Davis have come up with a new way to crinkle up the fabric of space-time -- at least in theory.

"We show that space-time cannot be locally flat at a point where two shock waves collide," said Blake Temple, professor of mathematics at UC Davis. "This is a new kind of singularity in general relativity."

Einstein's theory of general relativity explains gravity as a curvature in space-time. But the theory starts from the assumption that any local patch of space-time looks flat, Temple said. A singularity is a patch of space-time that cannot be made to look flat in any coordinate system, Temple said. One example of a singularity is inside a black hole, where the curvature of space becomes extreme. Temple and his collaborators study the mathematics of how shockwaves in a perfect fluid can affect the curvature of space-time in general relativity. The results are reported in two papers by Temple with graduate students Moritz Reintjes and Zeke Vogler, respectively, both published in the journal Proceedings of the Royal Society A. In earlier work, Temple and collaborator Joel Smoller, Lamberto Cesari professor of mathematics at the University of Michigan, produced a model for the biggest shockwave of all, created from the Big Bang when the universe burst into existence.

Vogler's doctoral work used mathematics to simulate two shockwaves colliding, while Reintjes followed up with an analysis of the equations that describe what happens when shockwaves cross. He found this created a new type of singularity, which he dubbed a "regularity singularity."

A shockwave creates an abrupt change, or discontinuity, in the pressure and density of a fluid, and this creates a jump in the curvature. But it has been known since the 1960s that the jump in curvature created by a single shock wave is not enough to rule out the locally flat nature of space-time. What is surprising is that something as mild as interacting waves could create something as extreme as a space-time singularity, Temple said.
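The "jump in the curvature" caused by a shockwave has a compact expression: in Einstein's field equations the curvature side is locked to the fluid's stress-energy side, so a shock's discontinuity in pressure and density directly forces a discontinuity in curvature. Schematically, in units with G = c = 1 (a textbook statement of the field equations for a perfect fluid, not the papers' detailed jump-condition analysis):

```latex
% Curvature (left) is proportional to stress-energy (right); a jump
% [\,\cdot\,] across a shock in p or \rho therefore implies a jump
% in the Einstein curvature tensor.
G_{\mu\nu} = 8\pi T_{\mu\nu},
\qquad
T_{\mu\nu} = (\rho + p)\,u_\mu u_\nu + p\,g_{\mu\nu},
\qquad
[\,p\,]\neq 0 \ \text{or}\ [\,\rho\,]\neq 0
\;\Longrightarrow\;
[\,G_{\mu\nu}\,]\neq 0 .
```

The subtle question the papers address is whether such a jump can always be smoothed away by a clever choice of coordinates; for a single shock it can, while Reintjes argues that at a point where two shocks collide it cannot.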

Temple and his colleagues are investigating whether the steep gradients in the space-time fabric at a regularity singularity could create any effects that are measurable in the real world. For example, they wonder whether they might produce gravity waves, Temple said. General relativity predicts that these are produced, for example, by the collision of massive objects like black holes, but they have not yet been observed in nature. Regularity singularities could also be formed within stars as shockwaves pass within them, the researchers theorize.

Reintjes, now a postdoctoral scholar at the University of Regensburg, Germany presented the work at the International Congress on Hyperbolic Problems in Padua, in June.

Tuesday, 21 August 2012

Understanding Hot Nuclear Matter That Permeated the Early Universe

A review article appearing in the July 20, 2012, issue of the journal Science describes groundbreaking discoveries that have emerged from the Relativistic Heavy Ion Collider (RHIC) at the U.S. Department of Energy's Brookhaven National Laboratory, synergies with the heavy-ion program at the Large Hadron Collider (LHC) in Europe, and the compelling questions that will drive this research forward on both sides of the Atlantic. With details that help enlighten our understanding of the hot nuclear matter that permeated the early universe, the article is a prelude to the latest findings scientists from both facilities will present at the next gathering of physicists dedicated to this research -- Quark Matter 2012, August 12-18 in Washington, D.C.

"Nuclear matter in today's universe hides inside atomic nuclei and neutron stars," begin the authors, Barbara Jacak, a physics professor at Stony Brook University and spokesperson for the PHENIX experiment at RHIC, and Berndt Mueller, a theoretical physicist at Duke University. "Understanding the evolution of our universe thus requires knowledge of the structure and dynamics of these particles in their purest form, a primordial 'soup' known as quark-gluon plasma (QGP)." "Quarks and the gluons that hold them together are the building blocks of all the visible matter that exists in the universe today -- from stars, to planets, to people," Jacak said. Collisions between heavy ions at machines like RHIC, running since 2000, and more recently, the LHC, make this hidden realm accessible by recreating the extreme conditions of the early universe on a microscopic scale. The temperatures achieved in these collisions -- more than 4 trillion degrees Celsius, the hottest ever created in a laboratory -- briefly liberate the subatomic quarks and gluons that make up protons and neutrons of ordinary atomic nuclei so scientists can study their properties and interactions.

RHIC was the first machine to demonstrate the formation of quark-gluon plasma, and determine its unexpected properties. Instead of an ideal gas of weakly interacting quarks and gluons, the QGP discovered at RHIC behaves like a nearly frictionless liquid. This matter's extremely low viscosity (near the lowest theoretically possible), its ability to stop energetic particle jets in their tracks, and its very rapid attainment of such a high equilibrium temperature all suggest that the fluid's constituents are quite strongly interacting, or coupled. "Understanding strongly coupled or strongly correlated systems is at the intellectual forefront of multiple subfields of physics," the authors write. The findings at RHIC have unanticipated connections to several of these, including conventional plasmas, superconductors, and even some atoms at the opposite extreme of the temperature scale -- a minute fraction of a degree above absolute zero -- which also behave as a nearly perfect fluid with vanishingly low viscosity when confined within an atomic trap.

When the LHC began its first heavy ion experiments in 2010 -- at nearly 14 times higher energy than RHIC's -- they largely confirmed RHIC's pioneering findings with evidence of a strongly coupled, low-viscosity liquid, albeit at a temperature about 30 percent higher than at RHIC. Another stunning surprise was that mathematical approaches using methods of string theory and theoretical black holes occupying extra dimensions could be used to describe some of these seemingly unrelated strongly coupled systems, including RHIC's nearly perfect liquid. "Physicists were astounded," the authors note. Although the mathematics is clear and well established, the physical reasons for the relationship are still a deep mystery. With a higher energy range, the LHC offers a higher rate of rare particles, such as heavy (charm and bottom) quarks, and high-energy jets that can probe particular properties of the QGP system. RHIC, for its part, can go to lower energies and collide a wide range of ions from protons, to copper, to gold, to uranium -- and produce asymmetric collisions between two different kinds of ions. This flexibility allows scientists to produce QGP under a wide variety of initial conditions, and thereby to distinguish intrinsic QGP properties from the influence of the initial conditions.

"The two facilities are truly complementary," said Mueller, whose work on quantum chromodynamics (QCD), the theory that describes the interactions of quarks and gluons, helps guide experiments and interpret results at both facilities. "Both RHIC and the LHC are essential to advancing our understanding of the subatomic interactions that governed the early universe, and how those gave form to today's matter as they coalesced into more ordinary forms."

An essential part of the experimental and theoretical research path going forward will be a detailed exploration of the nuclear "phase diagram" -- how quark matter evolves over a range of energies, temperatures, and densities. The LHC will search the highest range of energies, where the matter produced contains quarks and antiquarks in almost complete balance. But all evidence to date from both colliders suggests that RHIC is in the energy "sweet spot" for exploring the transition from ordinary matter to QGP -- analogous to the way an ordinary substance like water changes phases from ice to liquid water to gas.

"It's extremely gratifying that our experimental program has succeeded so beautifully so far. The connections with other areas of physics are intriguing, and the results are turning out to be even more interesting than we expected," Jacak said.

Sunday, 19 August 2012

Scientists Prove DNA Can Be Reprogrammed by Words and Frequencies

THE HUMAN DNA IS A BIOLOGICAL INTERNET and superior in many aspects to the artificial one. Russian scientific research directly or indirectly explains phenomena such as clairvoyance, intuition, spontaneous and remote acts of healing, self healing, affirmation techniques, unusual light/auras around people (namely spiritual masters), mind’s influence on weather patterns and much more. In addition, there is evidence for a whole new type of medicine in which DNA can be influenced and reprogrammed by words and frequencies WITHOUT cutting out and replacing single genes.

Only 10% of our DNA is being used for building proteins. It is this subset of DNA that is of interest to western researchers and is being examined and categorized. The other 90% are considered “junk DNA.” The Russian researchers, however, convinced that nature was not dumb, joined linguists and geneticists in a venture to explore those 90% of “junk DNA.” Their results, findings and conclusions are simply revolutionary! According to them, our DNA is not only responsible for the construction of our body but also serves as data storage and in communication. The Russian linguists found that the genetic code, especially in the apparently useless 90%, follows the same rules as all our human languages. To this end they compared the rules of syntax (the way in which words are put together to form phrases and sentences), semantics (the study of meaning in language forms) and the basic rules of grammar. They found that the alkalines of our DNA follow a regular grammar and do have set rules just like our languages. So human languages did not appear coincidentally but are a reflection of our inherent DNA.

The Russian biophysicist and molecular biologist Pjotr Garjajev and his colleagues also explored the vibrational behavior of the DNA. [For the sake of brevity I will give only a summary here. For further exploration please refer to the appendix at the end of this article.] The bottom line was: “Living chromosomes function just like solitonic/holographic computers using the endogenous DNA laser radiation.” This means that they managed for example to modulate certain frequency patterns onto a laser ray and with it influenced the DNA frequency and thus the genetic information itself. Since the basic structure of DNA-alkaline pairs and of language (as explained earlier) are of the same structure, no DNA decoding is necessary. One can simply use words and sentences of the human language! This, too, was experimentally proven! Living DNA substance (in living tissue, not in vitro) will always react to language-modulated laser rays and even to radio waves, if the proper frequencies are being used.

While western researchers cut single genes from the DNA strands and insert them elsewhere, the Russians enthusiastically worked on devices that can influence the cellular metabolism through suitable modulated radio and light frequencies and thus repair genetic defects. Garjajev’s research group succeeded in proving that with this method chromosomes damaged by x-rays for example can be repaired. They even captured information patterns of a particular DNA and transmitted it onto another, thus reprogramming cells to another genome. So they successfully transformed, for example, frog embryos to salamander embryos simply by transmitting the DNA information patterns! This way the entire information was transmitted without any of the side effects or disharmonies encountered when cutting out and re-introducing single genes from the DNA. This represents an unbelievable, world-transforming revolution and sensation! All this by simply applying vibration and language instead of the archaic cutting-out procedure! This experiment points to the immense power of wave genetics, which obviously has a greater influence on the formation of organisms than the biochemical processes of alkaline sequences. This finally and scientifically explains why affirmations, autogenous training, hypnosis and the like can have such strong effects on humans and their bodies. It is entirely normal and natural for our DNA to react to language.

Esoteric and spiritual teachers have known for ages that our body is programmable by language, words and thought. This has now been scientifically proven and explained. Of course the frequency has to be correct. And this is why not everybody is equally successful or can do it with always the same strength. The individual person must work on the inner processes and maturity in order to establish a conscious communication with the DNA. The Russian researchers work on a method that is not dependent on these factors but will ALWAYS work, provided one uses the correct frequency.

But the higher developed an individual’s consciousness is, the less need is there for any type of device! One can achieve these results by oneself, and science will finally stop laughing at such ideas and will confirm and explain the results. And it doesn’t end there. The Russian scientists also found out that our DNA can cause disturbing patterns in the vacuum, thus producing magnetized wormholes! Wormholes are the microscopic equivalents of the so-called Einstein-Rosen bridges in the vicinity of black holes (left by burned-out stars). These are tunnel connections between entirely different areas in the universe through which information can be transmitted outside of space and time. The DNA attracts these bits of information and passes them on to our consciousness. This process of hyper communication is most effective in a state of relaxation. Stress, worries or a hyperactive intellect prevent successful hyper communication or the information will be totally distorted and useless.

Friday, 17 August 2012

Mystery of the 'Monster Stars' Solved: It Was a Monster Mash

A gaggle of monsters resides in the Tarantula Nebula, part of a nearby galaxy.

Scientists discovered four monstrously heavy stars there in 2010. With masses up to 300 times that of our sun, they have twice the mass that astronomers believed to be the upper limit for stars, confounding the known models of star formation and begging the question: how did these monstrosities become so gargantuan? Now, new calculations reveal that the stars could have been created when pairs of lighter stars that were orbiting one another in a binary star system crashed together and merged. In other words, it was a monster mash. "Imagine two bulky stars closely circling each other but where the duo gets pulled apart by the gravitational attraction from their neighboring star," said lead investigator Sambaran Banerjee, an astronomer at the University of Bonn in Germany, in a press release. "If their initially circular orbit is stretched enough, then the stars crash into each other as they pass and make a single ultramassive star."

Banerjee and colleagues computer-modeled the interactions between stars in an R136-like cluster — R136 being the stellar nursery inside the Tarantula Nebula where the four ultramassive stars arose. [Tarantula Nebula's Star-Forming Turbulence Exposed]

Cracking the mystery required a truly monstrous calculation. The researchers' R136-like cluster model contained more than 170,000 stars, all of which started out with normal mass and which were distributed throughout space in the expected way. To compute how this system changes over time, the computer simulation had to solve a system of 510,000 equations many times over, accounting for such effects as gravity, the nuclear reactions and hence energy released by each star, and what happens when two stars collide. The Tarantula Nebula, a 1,000-light-year-diameter cloud of gas and dust also known as the "30 Doradus" (30 Dor) complex, is itself located in the Large Magellanic Cloud, the third closest galaxy to the Milky Way.

The researchers used an N-body integration code developed primarily by an astronomer at Cambridge, and found a novel way of speeding up their calculations using video-gaming cards installed in otherwise ordinary computers. "With all these ingredients, our R136 models are the most difficult and intensive N-body calculations ever made," said Pavel Kroupa and Seungkyung Oh, members of the research team, referring to the highly intensive star-by-star calculations used to accurately model any number (N) of bodies (stars). Presenting their results in an upcoming issue of the journal Monthly Notices of the Royal Astronomical Society, the Bonn group found that "monster stars" formed in their model R136-like cluster. Each started out as a binary pair of bulky but ordinary stars, no heavier than the universal limit of 150 solar masses. At some point, the gravitational pull of nearby stars threw their orbits for a loop, causing the pair to smack together.
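The team's full simulations are far beyond a blog post, but the core idea of a direct N-body integration can be sketched in a few lines. Below is a minimal, illustrative Python version — not the team's code — that sums pairwise gravitational forces and advances the system with a leapfrog integrator. The softening parameter `eps` is a standard trick, assumed here, to avoid infinite forces during close encounters; stellar evolution and collisions are omitted entirely.

```python
import numpy as np

def accelerations(pos, mass, eps=1e-3):
    """Direct-sum gravitational accelerations with softening (G = 1 units)."""
    acc = np.zeros_like(pos)
    for i in range(len(mass)):
        d = pos - pos[i]                      # vectors from body i to every body
        r2 = (d ** 2).sum(axis=1) + eps ** 2  # softened squared distances
        inv_r3 = r2 ** -1.5
        inv_r3[i] = 0.0                       # no self-force
        acc[i] = (mass[:, None] * d * inv_r3[:, None]).sum(axis=0)
    return acc

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog integration of an N-body system."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        vel += 0.5 * dt * acc   # half kick
        pos += dt * vel         # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc   # half kick
    return pos, vel
```

Because the pairwise forces are symmetric, this scheme conserves total momentum to floating-point accuracy; the real calculation adds stellar physics and GPU-accelerated force sums for its 170,000 stars.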

"Although extremely complicated physics is involved when two very massive stars collide," Banerjee said, "we still find it quite convincing that this explains the monster stars seen in the Tarantula."

He added, "This helps us relax, because the collisions mean that the ultramassive stars are a lot easier to explain. The universality of star formation prevails after all."

Wednesday, 15 August 2012

Next Magnetic Pole Reversal Is Underway

The magnetic field reverses direction every few thousand years. If it happened now, we would be exposed to solar winds capable of knocking out global communications and power grids. A new study indicates that there is a possible connection between the Earth's inner core and a magnetic reversal. Peter Olson and Renaud Deguen of Johns Hopkins University in Baltimore, Maryland, used numerical modelling to establish that the axis of Earth's magnetic field lies in the growing hemisphere. While one side of Earth's solid inner core grows slightly, the other half melts, the scientists concluded in their research paper. Now the researchers speculate that there are signs that the next magnetic pole reversal may be underway.

The rapid movements of the field's axis to the east in the last few hundred years could be a precursor to the north and south poles trading places, the researchers suggest. "What we found that is interesting in our models is a correlation between these transient [shifts] and reversals [of Earth's magnetic field]," says Olson. "We kind of speculate there is that connection, but the chaos in the core is going to prevent us from making accurate predictions for a long time."

Bruce Buffett of the University of California, Berkeley, says the authors present an intriguing proof of concept with their model. "They are suggesting very cautiously that maybe this rapid change is somehow suggestive of us going into a reversal event," he says.

"You could imagine if the field were to collapse it would have disastrous consequences for communication systems and power grids."

How Much Should We Fear Incoming Solar Activity?

According to NASA, it is a mistake to assume that a pole reversal would momentarily leave Earth without the magnetic field that protects us from solar flares and coronal mass ejections from the sun. While Earth's magnetic field can indeed weaken and strengthen over time, there is no indication that it has ever disappeared completely. A weaker field would certainly lead to a small increase in solar radiation on Earth -- as well as a beautiful display of aurora at lower latitudes -- but nothing deadly. Moreover, even with a weakened magnetic field, Earth's thick atmosphere also offers protection against the sun's incoming particles.

Movement of Earth's North Magnetic Pole Accelerating Rapidly

After some 400 years of relative stability, Earth's North Magnetic Pole has moved nearly 1,100 kilometers out into the Arctic Ocean during the last century and at its present rate could move from northern Canada to Siberia within the next half-century.

If that happens, Alaska may be in danger of losing one of its most stunning natural phenomena - the Northern Lights.

However, rapid movement of the magnetic pole doesn't necessarily mean that our planet is going through a large-scale change that would result in the reversal of the Earth's magnetic field. It may also be part of a normal oscillation. Calculations of the North Magnetic Pole's location from historical records go back only about 400 years, while polar observations trace back to John Ross in 1838 at the west coast of Boothia Peninsula.

No Reason To Panic

Earth's magnetic field has flipped its polarity many times over the millennia and reversals are the rule, not the exception.

Earth has settled in the last 20 million years into a pattern of a pole reversal about every 200,000 to 300,000 years, although it has been more than twice that long since the last reversal. A reversal happens over hundreds or thousands of years, not overnight.

This means a magnetic pole reversal is not a sign of doomsday. 

Monday, 13 August 2012

Vaporizing Earth in Computer Simulations to Aid Search for Super-Earths

In science fiction novels, evil overlords and hostile aliens often threaten to vaporize Earth. At the beginning of The Hitchhiker's Guide to the Galaxy, the officiously bureaucratic aliens called Vogons, authors of the third-worst poetry in the universe, actually follow through on the threat, destroying Earth to make way for a hyperspatial express route.

"We scientists are not content just to talk about vaporizing the Earth," says Bruce Fegley, professor of earth and planetary sciences at Washington University in St. Louis, tongue firmly in cheek. "We want to understand exactly what it would be like if it happened."

And in fact Fegley, PhD, and his colleagues Katharina Lodders, PhD, a research professor of earth and planetary sciences who is currently on assignment at the National Science Foundation, and Laura Schaefer, currently a graduate student at Harvard University, have vaporized Earth -- if only in simulation, that is, mathematically and inside a computer.

They weren't just practicing their evil overlord skills. By baking model Earths, they are trying to figure out what astronomers should see when they look at the atmospheres of super-Earths in a bid to learn the planets' compositions.

Super-Earths are planets outside our solar system (exoplanets) that are more massive than Earth but less massive than Neptune and made of rock instead of gas. Because of the techniques used to find them, most of the detected super-Earths are those which orbit close to their stars -- within rock-melting distance.

Their NSF- and NASA-funded research, described in the August 10 issue of The Astrophysical Journal, shows that Earth-like planets as hot as these exoplanets would have atmospheres composed mostly of steam and carbon dioxide, with smaller amounts of other gases that could be used to distinguish one planetary composition from another.

The WUSTL team is collaborating with Dr. Mark Marley's research group at the NASA Ames Research Center to convert the gas abundances they have calculated into synthetic spectra the planet hunters can compare to spectra they measure.

Motivated by degeneracy

Under favorable circumstances planet hunting techniques allow astronomers not just to find exoplanets but also to measure their average density.

The average density together with theoretical models lets the astronomers figure out the bulk chemical composition of gas giants, but in the case of rocky planets the possible variety of rocky ingredients can often add up several different ways to the same average density.

This is an outcome scientists, who would prefer one answer per question, call degeneracy.

If a planet passes in front of its star, so that astronomers can observe the light from the star filtered by the planet's atmosphere, they can determine the composition of the planet's atmosphere, which allows them to distinguish among alternative bulk planetary compositions.

"It's not crazy that astronomers can do this and more people are looking at the atmospheres of these transiting exoplanets," Fegley says. "Right now, there are eight transiting exoplanets where astronomers have done some atmospheric measurements and more will probably be reported in the near future."

"We modeled the atmospheres of hot super-Earths because that's what astronomers are finding and we wanted to predict what they should be looking for when they look at the atmospheres to decipher the nature of the planet," Fegley says.

Two model Earths

Even though the planets are called super-Earths, Fegley says, the term is a reference to their mass and makes no claim about their composition, much less their habitability. But, he says, you start with what you know.

The team ran calculations on two types of pseudo-Earths: one with a composition like that of Earth's continental crust, and the other, called the BSE (bulk silicate Earth), with a composition like that of the silicate portion of the primitive Earth before the continental crust formed.

The difference between the two models, says Fegley, is water. Earth's continental crust is dominated by granite, but you need water to make granite. If you don't have water, you end up with a basaltic crust like Venus. Both crusts are mostly silicon and oxygen, but a basaltic crust is richer in elements such as iron and magnesium.

Fegley is quick to admit Earth's continental crust is not a perfect analog for lifeless planets because it has been modified by the presence of life over the past four billion years, which both oxidized the crust and also led to production of vast reservoirs of reduced carbon, for example in the form of coal, natural gas, and oil.

Raining acid and rock

The super-Earths the team used as references are thought to have surface temperatures ranging from about 270 to 1700 degrees Celsius (C), which is about 520 to 3,090 degrees F. The Earth, in contrast, has a global average surface temperature of about 15 degrees C (59 degrees F) and the oven in your kitchen goes up to about 450 Fahrenheit.
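Those conversions are easy to verify; a throwaway Python snippet applying the standard Celsius-to-Fahrenheit formula reproduces the figures quoted above, to the article's rounding:

```python
def c_to_f(celsius):
    """Standard Celsius-to-Fahrenheit conversion."""
    return celsius * 9.0 / 5.0 + 32.0

# Temperatures quoted in the article
for c in (270, 1700, 15):
    print(f"{c} C = {c_to_f(c):.0f} F")
# 270 C = 518 F  (the article's "about 520")
# 1700 C = 3092 F  ("about 3,090")
# 15 C = 59 F
```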

Using thermodynamic equilibrium calculations, the team determined which elements and compounds would be gaseous at these alien temperatures.

"The vapor pressure of the liquid rock increases as you heat it, just as the vapor pressure of water increases as you bring a pot to boil," Fegley says. "Ultimately this puts all the constituents of the rock into the atmosphere."
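Fegley's boiling-pot analogy corresponds to the Clausius-Clapeyron relation. The team's actual calculations balance many chemical species simultaneously, but a one-component Python sketch — using water's properties, assumed here purely for illustration — shows how steeply vapor pressure climbs with temperature:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def vapor_pressure(T, T_ref, p_ref, L_vap):
    """Clausius-Clapeyron estimate of vapor pressure at temperature T (K),
    given a reference point (T_ref, p_ref) and a constant molar
    enthalpy of vaporization L_vap in J/mol."""
    return p_ref * math.exp(-L_vap / R * (1.0 / T - 1.0 / T_ref))

# Illustration with water (L_vap ~ 40.7 kJ/mol; boils at 373 K, 1 atm)
for T in (300, 373, 500, 700):
    p = vapor_pressure(T, 373.0, 101325.0, 40700.0)
    print(f"T = {T} K: p ~ {p:.3g} Pa")
```

The exponential dependence is the point: heat a rock (or an ocean) enough and every constituent eventually ends up in the atmosphere.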

The continental crust melts at about 940 C (1,720 F), Fegley says, and the bulk silicate Earth at roughly 1730 C (3,145 F). There are also gases released from the rock as it heats up and melts.

Their calculations showed that the atmospheres of both model Earths would be dominated over a wide temperature range by steam (from vaporizing water and hydrated minerals) and carbon dioxide (from vaporizing carbonate rocks).

The major difference between the models is that the BSE atmosphere is more reducing, meaning that it contains gases that would oxidize if oxygen were present. At temperatures below about 730 C (1,346 F) the BSE atmosphere, for example, contains methane and ammonia.

That's interesting, Fegley says, because methane and ammonia, when sparked by lighting, combine to form amino acids, as they did in the classic Miller-Urey experiment on the origin of life.

At temperatures above about 730 C, sulfur dioxide would enter the atmosphere, Fegley says. "Then the exoplanet's atmosphere would be like Venus's, but with steam," Fegley says.

The gas most characteristic of hot rocks, however, is silicon monoxide, which would be found in the atmospheres of both types of planets at temperatures of 1,430 C (2,600 F) or higher.

This leads to the amusing possibility that, as frontal systems moved through this exotic atmosphere, the silicon monoxide and other rock-forming elements might condense and rain out as pebbles.

Asked whether his team ever cranked the temperature high enough to vaporize the entire Earth, not just the crust and the mantle, Fegley admits that they did.

"You're left with a big ball of steaming gas that's knocking you on the head with pebbles and droplets of liquid iron," he says. "But we didn't put that into the paper because the exoplanets the astronomers are finding are only partially vaporized," he says.


Saturday, 11 August 2012

Our universe is nothing more than a holographic representation of the “Real” universe

This “hologram” exists in the event horizon of a black hole.

What if our existence is a holographic projection of another, flat version of you living on a two-dimensional "surface" at the edge of this universe? In other words, are we real, or are we quantum interactions on the edges of the universe -- and is that just as real anyway?

Whether we actually live in a hologram is being hotly debated, but it is now becoming clear that looking at phenomena through a holographic lens could be key to solving some of the most perplexing problems in physics, including the physics that reigned before the big bang, what gives particles mass, and a theory of quantum gravity. In 1982 a little-known but epic event occurred at the University of Paris, where a research team led by physicist Alain Aspect performed what may turn out to be one of the most important experiments of the 20th century. You did not hear about it on the Daily Show. In fact, unless you are a physicist you probably have never even heard Aspect's name, though increasing numbers of experts believe his discovery may change the face of science.

Aspect and his team discovered that under certain circumstances subatomic particles such as electrons are able to instantaneously communicate with each other regardless of the distance separating them. It doesn't matter whether they are 10 feet or 10 billion miles apart; somehow each particle always seems to know what the other is doing. The problem with this feat is that it violates Einstein's long-held tenet that no communication can travel faster than the speed of light. Since traveling faster than the speed of light is tantamount to breaking the time barrier, this daunting prospect has caused some physicists to try to come up with increasingly elaborate ways to explain away Aspect's findings.

University of London physicist David Bohm, for example, believes Aspect's findings imply that objective reality does not exist, that despite its apparent solidity the universe is at heart a phantasm, a gigantic and splendidly detailed hologram. To understand why Bohm makes this startling assertion, one must first understand that a hologram is a three-dimensional photograph made with the aid of a laser. To make a hologram, the object to be photographed is first bathed in the light of a laser beam. Then a second laser beam is bounced off the reflected light of the first and the resulting interference pattern (the area where the two laser beams conflate) is captured on film. When the film is developed, it looks like a meaningless swirl of light and dark lines. But as soon as the developed film is illuminated by another laser beam, a three-dimensional image of the original object appears. Bohm developed the theory that the brain operates in a manner similar to a hologram, in accordance with quantum mathematical principles and the characteristics of wave patterns, and was involved in the early development of the holonomic model of the functioning of the brain, a model for human cognition that is drastically different from conventionally accepted ideas.

The GEO600 system is armed with six hundred meters of laser tube, which sounds like enough to equip an entire Star War, but these lasers are for detection, not destruction. GEO600's length means it can measure changes of one part in six hundred million, accurate enough to detect even the tiniest ripples in spacetime -- assuming it isn't thrown off by somebody sneezing within a hundred meters or the wrong types of cloud overhead (seriously). The problem with such an incredibly sensitive device is just that: it's incredibly sensitive. In a recent collaboration, Fermilab scientists and hundreds of meters of laser may have found the very pixels of reality, grains of spacetime one tenth of a femtometer across.

The interferometer staff constantly battle against unwanted aberration, and were struggling against a particularly persistent signal when Fermilab Professor Craig Hogan suggested the problem wasn't with their equipment but with reality itself. The quantum limit of reality, the Planck length, occurs at a far smaller length scale than their signal -- but according to Hogan, this literal ultimate limit of tininess might be scaled up because we're all holograms. Obviously. The idea is that all of our spatial dimensions can be represented by a 'surface' with one less dimension, just like a 3D hologram can be built out of information in 2D foils. The foils in our case are the edges of the observable universe, where quantum fluctuations at the Planck scale are 'scaled up' into the ripples observed by the GEO600 team. We'd like to remind you that although we're talking about "The GEO600 Laser Team probing the edge of reality", this is not a movie.
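Hogan's "scaled up" Planck jitter can be checked on the back of an envelope. One commonly quoted heuristic — an assumption here, not an exact result of his theory — is that the accumulated transverse uncertainty grows like the geometric mean of the Planck length and the macroscopic path length. For GEO600's 600 metres of laser path, that lands right at the tenth-of-a-femtometer "pixels" mentioned above:

```python
import math

PLANCK_LENGTH = 1.616e-35  # metres
PATH_LENGTH = 600.0        # metres of laser tube in GEO600

# Heuristic: jitter ~ geometric mean of Planck length and path length
sigma = math.sqrt(PLANCK_LENGTH * PATH_LENGTH)

print(f"holographic jitter ~ {sigma:.2e} m "
      f"({sigma / 1e-15:.2f} femtometres)")
# ~ 1e-16 m, i.e. about one tenth of a femtometer
```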

What does this mean for you?  In everyday action, nothing much - we're afraid that a fundamentally holographic nature doesn't allow you to travel around playing guitar and fighting crime (no matter what 80s cartoons may have taught you.)  Whether reality is as you see it, or you're the representation of interactions on a surface at the edge of the universe, getting run over by a truck (or a representation thereof) will still kill you.

In intellectual terms, though, this should raise so many fascinating questions you'll never need TV again.  While in the extreme earliest stages, with far more work to go before anyone can draw any conclusions, this is some of the most mind-bending metaphysical science you'll ever see.

Thursday, 9 August 2012

Scientists could provide the Earth with energy for 5,000 years

Vladislav Shevchenko, Doctor of Physical and Mathematical Sciences and head of lunar research at the Astronomical Institute of Moscow State University, believes that helium-3, of which the Moon holds large reserves, could become the main energy source on Earth.

One ton of the element would cost about a billion dollars, and 25 tons of helium-3 would be enough to supply the Earth's population with energy for a year.

At present the isotope is obtained on Earth only in small quantities -- a few tens of grams per year -- while the Moon's reserves are estimated at as much as 500,000 tons of helium-3.
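Taking the article's own figures at face value, a quick Python check shows what they imply for cost and longevity; note that 500,000 tons at 25 tons per year would actually last far longer than 5,000 years, so presumably only a fraction of the lunar reserve is considered recoverable:

```python
annual_need_tons = 25       # tons of helium-3 per year (from the article)
reserve_tons = 500_000      # estimated lunar reserve (from the article)
cost_per_ton_usd = 1e9      # about a billion dollars per ton (from the article)

years_of_supply = reserve_tons / annual_need_tons
annual_fuel_cost = annual_need_tons * cost_per_ton_usd

print(f"full reserve would last {years_of_supply:,.0f} years")           # 20,000 years
print(f"annual fuel cost about ${annual_fuel_cost / 1e9:,.0f} billion")  # $25 billion
```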

Tuesday, 7 August 2012

Man is the result of a genetic error

More than 500 million years ago, invertebrates living on the ocean floor underwent two successive large-scale DNA duplications -- a "mistake" that ultimately resulted in the evolution of humans and many other animals, according to new research by Scottish scientists.

By studying proteins, the scientists concluded that the organism that underwent these mutations was the lancelet (Cephalochordata), which allowed them to estimate when the mutations occurred.

Research into this question helps explain the emergence and development of many diseases typical of man, and future studies will probably aid in their treatment and prediction.

Sunday, 5 August 2012

Greenland is melting

As much as 97 percent of the surface of Greenland has shown melting, says the U.S. space agency NASA -- a greater proportion than at any point in the past 30 years, over which the agency has monitored the melting of the island via satellite.

The data come from three different satellites and were analysed by NASA and university experts. During the summer, about half of Greenland's surface ice layer melts on average. But for the most part the ice is not disappearing: at high altitudes most of the meltwater quickly refreezes, while near the shore the ice forms a barrier that prevents the surface water from flowing into the ocean.

Unusually warm weather

According to satellite imagery, about 97 percent of the surface ice layer melted within a few days. The data seemed so unreal that scientists at first thought it was a mistake. They checked the measuring instruments on the satellites, and the results were confirmed by the high temperatures above the ice surface. An unusually warm layer of air could be the cause of the extremely strong warming.

"People began to notice that it was incredibly hot in the coastal areas. You still cannot see the ice melting, but if you look at these landscapes, which would normally be covered with snow, we now know that there is water between the snow particles. That is only in the upper region; in the lower regions there is more melting, which creates rivers and lakes," says Thomas Wagner, a NASA scientist in Washington, who sees the warm air layer as the main reason for the high temperatures.

Greenland is not going away?

Thomas Wagner points out that melting of the Greenland ice layer has a significant impact on the sea and the environment: "Greenland is losing huge amounts of ice. Global sea level rises about three millimetres per year, and less than half a millimetre of that comes from Greenland. This is probably caused by warming oceans around the island. This is not only about the surface layer, and we think it is probably a natural change. Of course, if such drastic warming is repeated, then we will be concerned," says Wagner.

Friday, 3 August 2012

Largest telescope in the world installed

The world's largest gamma-ray telescope, H.E.S.S. II, with a mirror area of 600 square metres, has been installed in Namibia, the University of Hamburg announced. The press release also states that the device will help scientists investigate the origin of cosmic rays.

The telescope will monitor streams of high-energy charged particles in space and collect information on galactic and extragalactic sources of gamma rays.

Wednesday, 1 August 2012

Smart food will replace diets

Scientists plan to create food additives that act on the human brain to produce an early feeling of satiety. The work is being carried out by an international research team on the Full4Health project, reports the Daily Mail.

The research was initiated by the European Union, with the aim of teaching people to eat in moderation.

"Smart" meals will contain special ingredients so that the feeling of fullness does not come too late, as it does with ordinary food, but as soon as the human body has received the required number of calories.