Tuesday, July 27, 2010

NASA Moves Forward on Commercial Partnership for Rocket Engine Testing

Engineers at NASA's John C. Stennis Space Center recently installed an Aerojet AJ26 rocket engine for qualification testing as part of a partnership that highlights the space agency's commitment to work with commercial companies to provide space transportation.

Stennis has partnered with Orbital Sciences Corporation to test the AJ26 engines that will power the first stage of the company's Taurus® II space launch vehicle. Orbital is working in partnership with NASA under the agency's Commercial Orbital Transportation Services (COTS) joint research and development project. The company is under contract with NASA through the Commercial Resupply Services program to provide eight cargo missions to the International Space Station through 2015.

Stennis operators have been modifying their E-1 test facility since April 2009 to test the AJ26 engines for Orbital. Work has included construction of a 27-foot-deep flame deflector trench.


The latest step in the project involved delivery and installation of an AJ26 engine for testing. In upcoming days, operators will perform a series of "chilldown" tests, which involve running sub-cooled rocket propellants through the engine, just as occurs during an actual "hotfire" ignition test.

The chilldown tests are used to verify proper temperature conditioning of the engine systems, to establish the elapsed time required to properly chill the engine, and to measure the quantity of liquid oxygen required to perform the operation.

Once the installed engine passes the chilldown and other qualification tests, it will be removed from the Stennis E-1 test facility. The first actual flight engine then will be delivered and installed for hotfire testing.

Monday, July 26, 2010

NASA Works with General Electric Co. to Build a Better Plane


Can airplanes be made that use less fuel and create fewer emissions? And can these airplanes also be significantly quieter than the planes currently flying through the skies?

As aeronautics technology continues to evolve, researchers from government agencies and companies around the globe are trying to answer these questions. Developing and testing technology for planes that produce significantly less noise and have a decreased environmental impact are two of NASA's key research goals.


Several NASA centers, including the Glenn Research Center in Cleveland, are involved in a multitude of initiatives regarding new aeronautics technology. Glenn scientists, engineers and researchers are working on several NASA projects and collaborating with the aerospace industry to investigate and test new technologies.

Through a Space Act Agreement, Glenn is working with General Electric Co., or GE, to test new technology for a jet engine with two high-speed propellers on the outside, called an open rotor. This effort -- initiated under the Subsonic Fixed Wing Project and now supported by the Environmentally Responsible Aviation Project of NASA's Aeronautics Research Mission Directorate in Washington -- includes testing of the open rotor technology in a Glenn wind tunnel with a test rig designed and built specifically for open rotor testing.

The test rig allows one propeller to spin in one direction while the propeller directly behind it spins in the opposite direction. This counter-rotating rig is a special piece of equipment. Very few exist, and none are available commercially in the United States.

"We bring the drive rig and the test capability and the test facility, the 9'X 15' wind tunnel. GE brings the design capability and fabrication capability for the open rotor systems," says Brian Fite, the chief of the Acoustics Branch at Glenn. "It is a good fit of the capabilities that each institution has to offer."

The unique shape and design of the high-speed propeller allows airplanes to fly at speeds close to those of airplanes with jet engines. Jet airplanes currently fly much faster than airplanes with standard propellers. Counter-rotating propellers increase fuel efficiency but also increase the amount of noise produced. Careful design of the open rotor blades is required to reduce the noise both outside and inside the passenger cabin.


"Our goal is to validate noise reduction with an open rotor system while still getting a good fuel burn performance metric," Fite says. Quieting the noise to make open rotors acceptable to the flying public is a major technical challenge.

NASA studied an earlier form of open rotor technology, called a prop fan, in the late 1980s and early 1990s. The Glenn team began the current effort by reevaluating the center's data from those earlier studies so it could be compared with data from the new experiments. Some Glenn personnel from the original tests still work at Glenn, and they have provided invaluable input.

Glenn had kept the counter-rotating test rig from the original experiments in storage, so it was able to refurbish the rig for new use in this testing.

"It had been so long since this kind of technology had been looked at; everyone else got rid of their experimental test capability for it. Glenn stored our rig at Plum Brook, and we were able to refurbish it," says Dale Van Zante, the Environmentally Responsible Aviation Project's open rotor research team lead. "With the run-up in fuel prices over the last few years and other environmental issues, engine companies and airframers are again looking at the open rotor technology. Potentially, it gives a big decrease in fuel burn -- gas mileage gets a lot better with that kind of system."

The open rotor collaboration began with the signing of a Space Act Agreement between NASA and GE in 2008. GE designed and manufactured the rotors that would be tested, while Glenn refurbished, tested and fine-tuned the counter rotating rig. GE then brought its hardware to Glenn, and began testing last summer. The acoustic and performance data generated will help engineers determine how the open rotor concept could be refined.

The testing has been taking place in the 9'x 15' Low-Speed Wind Tunnel at Glenn, with the testing conditions set up to simulate takeoff and landing. Low-speed testing was scheduled to be completed in spring 2010. Testing to measure cruise performance, a major contributor to potential fuel burn reduction, is scheduled to be completed by the end of 2010.

The GE blade design is mounted on Glenn’s counter rotating test rig. For each test, the blades' operation is assessed for acoustic and aerodynamic performance. Variables are explored, including the angle of each blade and the geometric relationship between the rows of blades.

Testing will continue with diagnostic assessments such as using pressure sensitive paint on the propeller surface to explore how the aerodynamic pressure, or loading, is distributed on the blades. Glenn purchased a special phased array, or series of microphones, to measure acoustics in this test. The phased array displays sound as color and requires expert application and data processing to get meaningful results.
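
For readers curious how a phased array turns many microphone signals into a picture of sound, a minimal delay-and-sum beamformer illustrates the principle. This is a generic sketch, not Glenn's actual processing chain; the 16-element line array, sample rate and 2 kHz test tone are all hypothetical.

```python
# Minimal delay-and-sum beamformer: time-shift each microphone's signal for a
# hypothesized source bearing and sum; power peaks when the guess is right.
# Mapping that power over many look directions produces a "sound as color" map.
import numpy as np

c = 343.0                          # speed of sound, m/s
fs = 51200                         # sample rate, Hz (hypothetical)
mics = np.linspace(-0.5, 0.5, 16)  # 16-element line array positions, meters
src_angle = np.deg2rad(20.0)       # true source bearing (hypothetical)

t = np.arange(2048) / fs
tone = np.sin(2 * np.pi * 2000 * t)          # 2 kHz test tone

# Each mic hears the tone delayed by its extra path length x*sin(theta)/c.
delays = mics * np.sin(src_angle) / c
signals = np.array([np.interp(t - d, t, tone, left=0.0, right=0.0)
                    for d in delays])

# Scan candidate bearings: undo the hypothesized delays and sum the channels.
angles = np.deg2rad(np.linspace(-60, 60, 241))
power = []
for a in angles:
    steer = mics * np.sin(a) / c
    aligned = np.array([np.interp(t + d, t, s, left=0.0, right=0.0)
                        for d, s in zip(steer, signals)])
    power.append(np.mean(aligned.sum(axis=0) ** 2))

print("estimated bearing: %.1f deg" % np.rad2deg(angles[int(np.argmax(power))]))
```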

The project has been intensive -- the experiments in the tunnel have been running in two shifts, meaning testing is going on daily from 9 a.m. to 11 p.m. A crew of about 30 NASA employees has been working with about five GE employees onsite.

"We have had very few issues with the hardware or the data systems. Since we've been up and running, things have gone smoothly with the test," Van Zante says. "It's been both fun and challenging, and very dynamic."

Early results are promising, and there is a possibility of testing second generation blade designs with GE after company researchers have analyzed their data. Airplane manufacturers also have expressed interest in using the Glenn wind tunnel and counter rotating rig to test the blade technology with an influence model, which explores changes to the noise and fuel burn reduction when the open rotor is installed on the fuselage of an aircraft.

Both the current tests and the 1980s experiments have measured primarily takeoff and landing noise with minimal cruise noise data. Additional testing planned in the 8' x 6' Supersonic Wind Tunnel at Glenn, where conditions can simulate cruising velocity, will provide data on the high-speed performance of open rotors operating at cruise speeds.

Whatever exciting next steps the inquiry into open rotor calls for, Glenn will be ready with its unique blend of experienced personnel and specialized equipment.

"This drive rig is a unique contribution by NASA Glenn," Brian Fite says. "It's the only drive rig in the country that can do this testing."

Friday, July 23, 2010

Hubble Shows Black Hole Booted Star From Milky Way

A hundred million years ago, a triple-star system was traveling through the bustling center of our Milky Way galaxy when it made a life-changing misstep. The trio wandered too close to the galaxy's giant black hole, which captured one of the stars and hurled the other two out of the Milky Way. Adding to the stellar game of musical chairs, the two outbound stars merged to form a super-hot, blue star.

This story may seem like science fiction, but astronomers using NASA's Hubble Space Telescope say it is the most likely scenario for a so-called hypervelocity star, known as HE 0437-5439, one of the fastest ever detected. It is blazing across space at a speed of 1.6 million miles (2.5 million kilometers) an hour, three times faster than our Sun's orbital velocity in the Milky Way. Hubble observations confirm that the stellar speedster hails from the Milky Way's core, settling some confusion over where it originally called home.


Most of the roughly 16 known hypervelocity stars, all discovered since 2005, are thought to be exiles from the heart of our galaxy. But this Hubble result is the first direct observation linking a high-flying star to a galactic center origin.

"Using Hubble, we can for the first time trace back to where the star comes from by measuring the star's direction of motion on the sky. Its motion points directly from the Milky Way center," says astronomer Warren Brown of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Mass., a member of the Hubble team that observed the star. "These exiled stars are rare in the Milky Way's population of 100 billion stars. For every 100 million stars in the galaxy lurks one hypervelocity star."

The movements of these unbound stars could reveal the shape of the dark matter distribution surrounding our galaxy. "Studying these stars could provide more clues about the nature of some of the universe's unseen mass, and it could help astronomers better understand how galaxies form," says team leader Oleg Gnedin of the University of Michigan in Ann Arbor. "Dark matter's gravitational pull is measured by the shape of the hyperfast stars' trajectories out of the Milky Way."

The stellar outcast is already cruising in the Milky Way's distant outskirts, high above the galaxy's disk, about 200,000 light-years from the center. By comparison, the diameter of the Milky Way's disk is approximately 100,000 light-years. Using Hubble to measure the runaway star's direction of motion and determine the Milky Way's core as its starting point, Brown and Gnedin's team calculated how fast the star had to have been ejected to reach its current location.

"The star is traveling at an absurd velocity, twice as much as the star needs to escape the galaxy's gravitational field," explains Brown, a hypervelocity star hunter who found the first unbound star in 2005. "There is no star that travels that quickly under normal circumstances -- something exotic has to happen."

There's another twist to this story. Based on the speed and position of HE 0437-5439, the star would have to be 100 million years old to have journeyed from the Milky Way's core. Yet its mass -- nine times that of our Sun -- and blue color mean that it should have burned out after only 20 million years, far less time than the journey to its current location required.
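
The numbers quoted in this article are easy to sanity-check. A constant-speed estimate (which ignores the star's deceleration as it climbs out of the galaxy's gravity, and so understates the true transit time) gives:

```python
# Back-of-the-envelope transit time for HE 0437-5439, using the article's
# figures: ~2.5 million km/h and ~200,000 light-years from the galactic center.
KM_PER_LY = 9.461e12       # kilometers in one light-year
HOURS_PER_YEAR = 8766.0    # 365.25 days

speed_kmh = 2.5e6
dist_ly = 200_000

print("speed: %.0f km/s" % (speed_kmh / 3600))                       # ~694 km/s
print("vs. the Sun's ~220 km/s orbit: %.1fx" % (speed_kmh / 3600 / 220))  # ~3.2x
transit_yr = dist_ly * KM_PER_LY / speed_kmh / HOURS_PER_YEAR
print("transit time: %.0f million years" % (transit_yr / 1e6))       # ~86 million
```

That lands in the same ballpark as the 100-million-year figure, and the puzzle is plain: the travel time is several times the star's expected 20-million-year lifetime.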

The most likely explanation for the star's blue color and extreme speed is that it was part of a triple-star system that was involved in a gravitational billiard-ball game with the galaxy's monster black hole. This concept for imparting an escape velocity on stars was first proposed in 1988. The theory predicted that the Milky Way's black hole should eject a star about once every 100,000 years.

Brown suggests that the triple-star system contained a pair of closely orbiting stars and a third outer member also gravitationally tied to the group. The black hole pulled the outer star away from the tight binary system. The doomed star's momentum was transferred to the stellar twosome, boosting the duo to escape velocity from the galaxy. As the pair rocketed away, they went on with normal stellar evolution. The more massive companion evolved more quickly, puffing up to become a red giant. It enveloped its partner, and the two stars spiraled together, merging into one superstar - a blue straggler.

"While the blue straggler story may seem odd, you do see them in the Milky Way, and most stars are in multiple systems," Brown says.

This vagabond star has puzzled astronomers since its discovery in 2005 by the Hamburg/European Southern Observatory sky survey. Astronomers had proposed two possibilities to solve the age problem. The star either dipped into the Fountain of Youth by becoming a blue straggler, or it was flung out of the Large Magellanic Cloud, a neighboring galaxy.

In 2008 a team of astronomers thought they had solved the mystery. They found a match between the exiled star's chemical makeup and the characteristics of stars in the Large Magellanic Cloud. The rogue star's position also is close to the neighboring galaxy, only 65,000 light-years away. The new Hubble result settles the debate over the star's birthplace.

Astronomers used the sharp vision of Hubble's Advanced Camera for Surveys to make two separate observations of the wayward star 3 1/2 years apart. Team member Jay Anderson of the Space Telescope Science Institute in Baltimore, Md., developed a technique to measure the star's position relative to each of 11 distant background galaxies, which form a reference frame.

Anderson then compared the star's position in images taken in 2006 with those taken in 2009 to calculate how far the star moved against the background galaxies. The star appeared to move, but only by 0.04 of a pixel (picture element) against the sky background. "Hubble excels with this type of measurement," Anderson says. "This observation would be challenging to do from the ground."
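
To translate that 0.04-pixel shift into astronomers' units, here is a quick estimate. The roughly 0.05-arcsecond-per-pixel plate scale of ACS's Wide Field Channel is our assumption, not a figure from the article:

```python
# Convert the quoted pixel shift into a proper motion (milliarcseconds/year).
pixel_shift = 0.04        # pixels, from the article
plate_scale_mas = 50.0    # mas per ACS/WFC pixel -- assumed, not from the article
baseline_yr = 3.5         # years between the 2006 and 2009 images

mu = pixel_shift * plate_scale_mas / baseline_yr
print("proper motion: %.2f mas/yr" % mu)   # ~0.57 mas/yr
```

A motion of roughly half a milliarcsecond per year is why a measurement like this demands Hubble's pointing stability.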

The team is trying to determine the homes of four other unbound stars, all located on the fringes of the Milky Way.

"We are targeting massive 'B' stars, like HE 0437-5439," says Brown, who has discovered 14 of the 16 known hypervelocity stars. "These stars shouldn't live long enough to reach the distant outskirts of the Milky Way, so we shouldn't expect to find them there. The density of stars in the outer region is much less than in the core, so we have a better chance to find these unusual objects."

The results were published online in The Astrophysical Journal Letters on July 20, 2010. Brown is the paper's lead author.

The Hubble Space Telescope is a project of international cooperation between NASA and the European Space Agency. NASA's Goddard Space Flight Center manages the telescope. The Space Telescope Science Institute (STScI) conducts Hubble science operations. STScI is operated for NASA by the Association of Universities for Research in Astronomy, Inc. in Washington, D.C.

Wednesday, July 21, 2010

Endangered Baby Sea Turtles Get Helping Hand


Dozens of endangered baby sea turtles are having an unusual birthday at NASA's Kennedy Space Center in Florida with the help of dozens of officials and volunteers who rescued them from the oil spill-threatened northeastern shores of the Gulf of Mexico.

Two groups of Kemp's ridley sea turtle hatchlings have been released onto Atlantic Ocean beaches so far as the unprecedented operation to save sea turtle nests ramps up.

The release and relocation work is part of the environmental endeavor by the U.S. Fish and Wildlife Service, the Florida Fish and Wildlife Conservation Commission, the National Park Service, NOAA, FedEx and conservationists to help minimize the risk to this year's sea turtle hatchlings from impacts of the oil spill. During the next several months, the plan calls for carefully moving an anticipated 700 nests laid on Florida Panhandle and Alabama beaches to Kennedy.

The hatchlings are part of the same populations of species that lay their eggs on the Atlantic and Gulf coasts. The endangered species include loggerhead turtles, but nests from leatherback and green turtles, in addition to Kemp's ridley, may be brought to the Kennedy hatchery. In all, about 50,000 hatchlings normally start their lives on Gulf coast beaches during hatching season.

FedEx is providing a specialized truck to ferry the sensitive eggs to Kennedy.

The Merritt Island National Wildlife Refuge was established in 1963 as an overlay of Kennedy Space Center, where it shares the land with space shuttle launch pads, rockets and research and development facilities. Canaveral National Seashore also lies partially within Kennedy, and together these agencies provide more than 24 miles of sea turtle nesting beach habitat.


"We are home to many species of protected wildlife and we hope to provide these sea turtles with a better chance of survival," said Kennedy Center Director Bob Cabana.

From hand-picking each egg from beaches on the northern Gulf of Mexico, to trucking the eggs in custom FedEx vehicles, to incubating them at Kennedy and finally releasing the hatchlings to crawl into the surf, the work is about as delicate as it gets.

Left on their own, the turtles likely would have been helpless against the effects of the BP Deepwater Horizon oil spill, wildlife officials said.

"We understand that significant risks remain, but the option of allowing tens of thousands of turtle hatchlings to crawl into oiled waters of the northern Gulf of Mexico is not acceptable," said Rodney Barreto, chairman of the Florida Fish and Wildlife Conservation Commission.

Jane Provancha, the lead biologist and environmental project manager for Innovative Health Applications (IHA) at the Kennedy hatchery, said the first nest was received June 26, about 10 days after expert opinions concluded that the relocation was viable and necessary. Eight nests have been received so far, but more than 700 are expected to be relocated by the time the operation closes at the end of hatching season in October.

"We were asked by USFWS and NOAA and we recommitted to support this action through the whole nesting season," Provancha said. "The plan is to move them all because the Gulf is in extremely bad condition."

At this point, the incubation process is producing normal amounts of hatchlings from the nests, Provancha said. "The process of incubation is actually quite simple."

The hard part is any physical movement of the eggs. For instance, the nests cannot be dug up in one scoop by an excavator. Instead, each egg is handled by a trained and authorized official who keeps the egg oriented exactly as it was in its nest. A single nest can hold 100 or more eggs.

The eggs and sand are placed inside Styrofoam containers and brought in the FedEx truck to Kennedy, where they are housed in comfortable darkness and relative quiet in a climate-controlled building. The nests are monitored daily for temperature and for signs of hatching. Biologists say that coordinating with their Gulf coast counterparts on nest arrivals, tracking nest status and scheduling releases will be a continual challenge.

After they hatch, the tiny sea turtles, about half the size of an outstretched hand, are carried to different points along a 100-mile stretch of the Atlantic shore and set free. Typically, sea turtles make their way offshore to a developmental habitat where they can eat and grow.

With the first round of releases behind them with positive results, the people involved in the project are learning what works and what can change as they continue to relocate more nests.

"This option I hadn't really thought about, but it does make sense," Provancha said.

Tuesday, July 20, 2010

Making Home a Safer Place


One day homeowners everywhere may be protected from deadly carbon monoxide fumes, thanks to a device invented at NASA's Langley Research Center. The device uses a new class of low-temperature oxidation catalysts to convert carbon monoxide to non-toxic carbon dioxide (2 CO + O₂ → 2 CO₂) at room temperature, and it also removes formaldehyde from the air. The catalysts initially were developed for research involving carbon dioxide lasers.

Saturday, July 17, 2010

NASA's Wide-field Infrared Survey Explorer Mission to Complete Extensive Sky Survey


NASA's Wide-field Infrared Survey Explorer, or WISE, will complete its first survey of the entire sky on July 17, 2010. The mission has generated more than one million images so far, of everything from asteroids to distant galaxies.

"Like a globe-trotting shutterbug, WISE has completed a world tour with 1.3 million slides covering the whole sky," said Edward Wright, the principal investigator of the mission at the University of California, Los Angeles.

Some of these images have been processed and stitched together into a new picture being released today. It shows the Pleiades cluster of stars, also known as the Seven Sisters, resting in a tangled bed of wispy dust. The pictured region covers seven square degrees, or an area equivalent to 35 full moons, highlighting the telescope's ability to take wide shots of vast regions of space.
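
The full-moon comparison is a quick bit of arithmetic: the full Moon spans about half a degree on the sky, so

```python
# How many full-moon disks fit in seven square degrees?
import math

moon_area = math.pi * (0.5 / 2) ** 2           # 0.5-degree disk, ~0.196 sq deg
print("%.0f full moons" % (7.0 / moon_area))   # ~36, matching the figure above
```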

The new picture was taken in February. It shows infrared light from WISE's four detectors in a range of wavelengths. This infrared view highlights the region's expansive dust cloud, through which the Seven Sisters and other stars in the cluster are passing. Infrared light also reveals the smaller and cooler stars of the family.

To view the new image, as well as previously released WISE images, visit http://www.nasa.gov/wise and http://wise.astro.ucla.edu .

"The WISE all-sky survey is helping us sift through the immense and diverse population of celestial objects," said Hashima Hasan, WISE Program scientist at NASA Headquarters in Washington. "It's a great example of the high impact science that's possible from NASA's Explorer Program."

The first release of WISE data, covering about 80 percent of the sky, will be delivered to the astronomical community in May of next year. The mission has been scanning strips of the sky as it orbits over the Earth's poles since its launch last December. WISE always stays over the Earth's day-night line. As the Earth moves around the sun, new slices of sky come into the telescope's field of view. It has taken six months, or the amount of time for Earth to travel halfway around the sun, for the mission to complete one full scan of the entire sky.
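
The six-month figure follows from the scan geometry: each orbit sweeps out a great circle on the sky, and the plane of that circle follows the Sun at Earth's orbital rate.

```python
# Each great-circle scan covers both sides of the sky, so the scan plane only
# needs to rotate 180 degrees to cover everything.
deg_per_day = 360.0 / 365.25                                # ~0.99 deg/day
print("full sky after %.0f days" % (180.0 / deg_per_day))   # ~183 days, six months
```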

For the next three months, the mission will map half of the sky again. This will enhance the telescope's data, revealing more hidden asteroids, stars and galaxies. The mapping will give astronomers a look at what's changed in the sky. The mission will end when the instrument's block of solid hydrogen coolant, needed to chill its infrared detectors, runs out.

"The eyes of WISE have not blinked since launch," said William Irace, the mission's project manager at NASA's Jet Propulsion Laboratory in Pasadena, Calif. "Both our telescope and spacecraft have performed flawlessly and have imaged every corner of our universe, just as we planned."

So far, WISE has observed more than 100,000 asteroids, both known and previously unseen. Most of these space rocks are in the main belt between Mars and Jupiter. However, some are near-Earth objects, asteroids and comets with orbits that pass relatively close to Earth. WISE has discovered more than 90 of these new near-Earth objects. The infrared telescope is also good at spotting comets that orbit far from Earth and has discovered more than a dozen of these so far.

WISE's infrared vision also gives it a unique ability to pick up the glow of cool stars, called brown dwarfs, in addition to distant galaxies bursting with light and energy. These galaxies are called ultra-luminous infrared galaxies. WISE can see the brightest of them.

Tuesday, July 13, 2010

Study Finds a Single, Huge, Violent Amazon Storm Killed Half a Billion Trees

A single, huge, violent storm that swept across the whole Amazon forest in 2005 killed half a billion trees, according to a new study funded by NASA and Tulane University, New Orleans.

While storms have long been recognized as a cause of Amazon tree loss, this study is the first to actually quantify losses from a storm. And the losses are much greater than previously suspected, say the study's authors, who include research scientist Sassan Saatchi of NASA's Jet Propulsion Laboratory, Pasadena, Calif. The work suggests that storms may play a larger role in the dynamics of Amazon forests than previously recognized, they add.

Previous research had attributed a peak in tree mortality in 2005 solely to a severe drought that affected parts of the forest. The new study says that a single squall line (a long line of severe thunderstorms, the kind associated with lightning and heavy rainfall) had an important role in the tree demise. Research suggests this type of storm might become more frequent in the future in the Amazon due to climate change, killing a higher number of trees and releasing more carbon to the atmosphere.

Tropical thunderstorms have long been suspected of wreaking havoc in the Amazon, but this is the first time researchers have calculated how many trees a single thunderstorm can kill, says Jeffrey Chambers, a forest ecologist at Tulane University and one of the authors of the paper. The paper has been accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union.

Previous studies by a coauthor of this new paper, Niro Higuchi of Brazil's National Institute for Amazon Research (INPA), showed the 2005 tree mortality spike was the second largest recorded since 1989 for the Manaus region in the Central Amazon. Also in 2005, large parts of the Amazon forest experienced one of the harshest droughts of the last century. A study published in the journal Science in 2009 pointed to the drought as the single agent for a basin-wide increase in tree mortality. But a very large area with major tree loss (the region near Manaus) was not affected by the drought.

"We can't attribute [the increased] mortality to just drought in certain parts of the basin--we have solid evidence that there was a strong storm that killed a lot of trees over a large part of the Amazon," Chambers says.

From Jan. 16 to 18, 2005, a squall line 1,000 kilometers (620 miles) long and 200 kilometers (124 miles) wide crossed the whole Amazon basin from southwest to northeast, causing several human deaths in the cities of Manaus, Manacapuru, and Santarem. The strong vertical winds associated with the storm, blowing up to 145 kilometers per hour (90 miles per hour), uprooted or snapped in half trees that were in their path. In many cases, the stricken trees took down some of their neighbors when they fell.

The researchers used a combination of Landsat satellite images, field-measured tree mortality, and modeling to determine the number of trees killed by the storm. By linking satellite data to observations on the ground, the researchers were able to take into account smaller tree blowdowns (less than 10 trees) that otherwise cannot be detected through satellite images.

Looking at satellite images for the area of Manaus from before and after the storm, the researchers detected changes in the reflectivity of the forest, which they suspected were indicative of tree losses. Undisturbed forest patches appeared as closed, green canopy in satellite images. When trees die and fall, a clearing opens, exposing wood, dead vegetation, and surface litter. This so-called "woody signal" only lasts for about a year in the Amazon. In a year, vegetation re-grows and covers the exposed wood and soil. This means the signal is a good indicator of recent tree deaths.
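
The detection idea can be sketched schematically. This is an illustration of generic before/after change detection, not the authors' actual Landsat pipeline; the reflectance values and the blowdown patch are hypothetical.

```python
# Difference two reflectance rasters and threshold on the brightening that
# exposed wood and litter ("woody signal") produces after a blowdown.
import numpy as np

rng = np.random.default_rng(0)
before = 0.10 + 0.02 * rng.random((100, 100))  # intact canopy (hypothetical units)
after = before.copy()
after[40:55, 60:80] += 0.15                    # a blowdown patch brightens

diff = after - before
blowdown = diff > 0.10                         # threshold on the woody signal
print("flagged pixels:", blowdown.sum())       # -> 300, the 15 x 20 patch
```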

After seeing disturbances in the satellite images, the researchers established five field sites in one of the blowdown areas, and counted the number of trees that had been killed by the storm; researchers can usually tell what killed a tree from looking at it.

"If a tree dies from a drought, it generally dies standing. It looks very different from trees that die snapped by a storm," Chambers says.

In the most affected plots, near the centers of large blowdowns, up to 80 percent of the trees had been killed by the storm.

By comparing their field data and the satellite observations, the researchers determined that the satellite images were accurately pinpointing areas of tree death, and they calculated that the storm had killed between 300,000 and 500,000 trees in the area of Manaus. The number of trees killed by the 2005 storm is equivalent to 30 percent of the annual deforestation in that same year for the Manaus region, which experiences relatively low rates of deforestation.

The team then extrapolated the results to the whole Amazon basin.

"We know that the storm was intense and went across the basin," Chambers says. "To quantify the potential basin-wide impact, we assumed that the whole area impacted by the storm had a similar level of tree mortality as the mortality observed in Manaus."

The researchers estimate that between 441 and 663 million trees were destroyed across the whole basin. This represents a loss equivalent to 23 percent of the estimated mean annual carbon accumulation of the Amazon forest.
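
The scaling step itself is simple to sketch. In the code below, both areas are placeholders of our own (the article does not give them), chosen only so the output lands near the published range; the logic, not the numbers, is the point.

```python
# Extrapolate the Manaus per-area mortality over the storm's whole track.
killed_manaus = 400_000      # midpoint of the 300,000-500,000 Manaus estimate
area_manaus_km2 = 3_000      # hypothetical area of the Manaus study region
area_storm_km2 = 4_000_000   # hypothetical basin-wide swath of the squall line

density = killed_manaus / area_manaus_km2        # trees killed per sq km
basin = density * area_storm_km2
print("basin-wide estimate: %.0f million trees" % (basin / 1e6))  # ~533 million
```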

Squall lines that move from southwest to northeast of the forest, like the one in January 2005, are relatively rare and poorly studied, says Robinson Negron-Juarez, an atmospheric scientist at Tulane University, and lead author of the study. Storms that are similarly destructive but advance in the opposite direction (from the northeast coast of South America to the interior of the continent) occur up to four times per month. They can also generate large forest blowdowns (contiguous patches of wind-toppled trees), although it's infrequent that either of these two types of storms crosses the whole Amazon.

"We need to start measuring the forest perturbation caused by both types of squall lines, not only by the ones coming from the south," Negron-Juarez says. "We need that data to estimate total biomass loss from these natural events, which has never been quantified."

Chambers says that authors of previous studies on tree mortality in the Amazon have diligently collected dead-tree tolls, but information on exactly what killed the trees is often lacking, or not reported.

"It's very important that when we collect data in the field, we do forensics on tree mortality," says Chambers, who has been studying forest ecology and carbon cycling in the Amazon since 1993. "Under a changing climate, some forecasts say that storms will increase in intensity. If we start seeing increases in tree mortality, we need to be able to say what's killing the trees."

Wednesday, July 07, 2010

ESA’s Planck Mission Unveils the Universe -- Now and Then

A new image from the Planck mission shows what it's been up to for the past year -- surveying the entire sky for clues to our universal origins. Planck, a European Space Agency mission with significant participation from NASA, has been busily scanning the whole sky at nine frequencies of light, with the ultimate goal of isolating fluctuations in the cosmic microwave background -- or light from the beginning of time. These fluctuations represent the seeds from which structure in our universe evolved.

"This image shows both our Milky Way galaxy and the universe 380,000 years after the Big Bang in one expansive view," said Charles Lawrence, the NASA project scientist for the mission at the Jet Propulsion Laboratory in Pasadena, Calif. "The radiation from the Milky Way traveled hundreds or thousands of years to reach us, while the radiation from the early universe traveled 13.7 billion years to reach us. What we see in this picture happened at very different times."

The picture has been color-coded to show how the sky looks over the range of frequencies observed by Planck. Planck detects light that we can't see with our eyes -- light with low frequencies ranging from 30 to 857 gigahertz. The disk of the Milky Way galaxy, seen edge-on from Earth's perspective, is the bright band running horizontally down the middle. Diffuse, huge clouds of gas and dust relatively close to us in our galaxy can be seen above and below this band. The cosmic microwave background is apparent as the grainy structure towards the top and bottom of the image.

Scientists want to study this grainy signature across the entire sky, which means seeing through the "fog" of our Milky Way. The Planck teams are busy now removing this foreground fog, a meticulous process akin to identifying and removing all the hay in a haystack to reveal the needle within. The process will take about two more years, with the first processed data being released to the scientific community toward the end of 2012. The U.S. Planck team is helping with this task, with a primary tool being the Franklin supercomputer at the National Energy Research Scientific Computing Center in Berkeley, Calif. One of the world's fastest computers, Franklin will handle the most computationally intensive analysis jobs for the Planck team worldwide.

Meanwhile, this fog is not something to discard. It contains a treasure trove of information about our galaxy and its structure, in addition to many other galaxies. The U.S. Planck team is responsible for releasing the first batch of this astronomy data, called the Early Release Compact Source Catalogue, an event scheduled for January 2011.

Planck will continue surveying the sky until at least the end of January 2012, completing almost five all-sky scans.

Monday, July 05, 2010

The Russian Progress Resupply Craft Successfully Docks to Space Station


The ISS Progress 38 cargo resupply ship successfully docked to the aft end of the International Space Station’s Zvezda service module at 12:17 p.m. EDT Sunday. The docking was executed flawlessly by Progress’ Kurs automated rendezvous system.

The Progress spacecraft carries 1,918 pounds of propellant, 110 pounds of oxygen, 220 pounds of water and 2,667 pounds of experiment equipment, spare parts and other supplies to the station. It launched from the Baikonur Cosmodrome in Kazakhstan on June 30. An attempted docking Friday, July 2, was aborted when telemetry between the Progress and the space station was lost about 25 minutes before its planned docking.
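
Those cargo figures sum to just under 5,000 pounds of total upmass:

```python
# Quick sum of the Progress 38 cargo figures quoted above (pounds).
cargo_lb = {"propellant": 1918, "oxygen": 110, "water": 220,
            "experiments, spares and other supplies": 2667}
print(sum(cargo_lb.values()), "lb total")   # -> 4915 lb
```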

The most likely cause of Friday’s aborted docking was traced to the activation of the TORU “Klest” TV transmitter, which created interference with TORU itself, causing a loss of the TORU command link between Progress and the International Space Station and triggering the abort. TORU was not activated for Sunday’s docking. The TORU TV system is designed to give station Commander Alexander Skvortsov a view of Zvezda's docking target in case he has to operate a joystick in the service module to dock Progress manually.

The Expedition 24 crew members monitored the arrival of the spacecraft. The crew will enjoy an off-duty day Monday in observance of the U.S. Independence Day holiday.

Friday, July 02, 2010

New Launch Dates Announced for Space Shuttle Flights STS-133 and STS-134

Space shuttle Discovery's STS-133 mission is now targeted for launch Nov. 1 at about 4:33 p.m. EDT. Endeavour's STS-134 mission is targeted for liftoff on Feb. 26, 2011, at about 4:19 p.m. EST. The target dates were adjusted because critical payload hardware for the STS-133 mission will not be ready in time for the previously targeted date. With Discovery's move, Endeavour had to plan for its next available window, which was February.

During space shuttle Discovery's final spaceflight, the STS-133 crew members will take important spares to the International Space Station along with the Express Logistics Carrier 4. Discovery is being readied for flight inside Kennedy's orbiter processing facility while its solid rocket boosters are stacked inside the nearby Vehicle Assembly Building. The mission previously had been slated to launch in September.