Space lasers!


It’s easy to take communication infrastructure for granted, right up until the moment you need to make an important call and don’t have cell service. But if you think that’s bad on the ground, then imagine how much worse the problem is in space.

NASA spacecraft communicate with engineers and scientists on Earth primarily using a system called the Deep Space Network, or DSN, which, for the past 60 years, has been responsible for sending vital instructions to spacecraft and receiving precious data back from them.

This network is aging, however, with decades-old hardware that’s struggling to carry an ever-increasing load. A growing number of missions are pushing the network to its limits, and already, thousands of hours of science observations are being lost during big missions like Artemis I.

There’s no way for the current system of radio communications to meet the needs of NASA’s ambitious future plans, like sending out more deep space missions and putting humans on the Moon. If we want reliable, high-bandwidth communications to and from space to enable future exploration, experts say we need something new. It’s time to turn to lasers.

A global network for space

The DSN hardware consists of multiple dishes, or antennas, located at three sites around the globe in Goldstone, California; Madrid, Spain; and Canberra, Australia. These sites were chosen so that however the Earth rotates relative to a spacecraft in deep space, at least one site will be able to pick up its communications.

The DSN is currently used by more than 40 missions, not only from NASA but also from international partners like the European and South Korean space agencies. The network carries data from spacecraft as distant as the Voyager probes, currently exploring interstellar space, and the New Horizons mission, which is out beyond the orbit of Pluto. It also includes major science missions like the Mars rovers Perseverance and Curiosity and even telescopes like the James Webb Space Telescope. 

You can get a feel for how many missions are passing data back and forth from Earth on the DSN by looking at the DSN Now tool, which shows which antennas are receiving data from which missions in real time. New instruments are hoovering up more data than ever before: James Webb, for example, collects and transmits data at around 50 times the rate of the older Hubble Space Telescope. All of that data needs to be transmitted back to Earth, and it’s putting a strain on the system.

Overburdened and at a critical point

In fact, it’s fair to say that it’s too much data for the network to handle. The DSN is oversubscribed, meaning more missions need to use it than can be accommodated, with the demand from space missions being as much as 40 percent above what the network can provide. NASA officials and committee members have described reaching a “critical point” and a “five-alarm fire bell.” 

This isn’t a problem with an easy solution. An audit of the DSN performed by the NASA Office of Inspector General earlier this year found that by the 2030s, demand could run as much as 50 percent above what the network can supply.

In a recent meeting of the Space Studies Board on November 14th, the NASA official who oversees the agency’s space communications and navigation program, Jeff Volosin, acknowledged the challenges of balancing competing needs on the DSN, such as trying to maintain data communications from science missions like James Webb during the Artemis I mission.

“The need to cover that Artemis mission with our 34-meter [deep space] antennas did affect our ability to do science mission support at the same time,” Volosin said. That effectively means lost time on missions like Webb, with hours of observations that can’t be made because the data can’t be transmitted.

The problem is only going to get worse in the future, as NASA ramps up its plans for missions to the Moon, including sending crew there. The DSN is dealing with a declining budget and, looking ahead to future lunar missions over the next 10 to 15 years, “There’s going to be a challenge where we’re not always going to have capacity,” Volosin said. 

There’s also the fact that much of the DSN hardware, some of which was built in the 1960s, is aging and has suffered from years of deferred maintenance. But experts argue that the DSN needs to be recognized as critical infrastructure, without which space exploration as we know it would be impossible.

“It’s a jewel of humanity,” said Jason Mitchell, program executive for NASA’s Space Communications and Navigation (SCaN) program. “You think about what we have been able to accomplish and develop in terms of understanding of our universe and our place in it — it is undeniably a critical element in that. It’s difficult for me to articulate how important I think this is as an asset to humanity.”

Image: Ricardo Rubio / Europa Press via Getty Images

Increasing communications capacity

There is hope that some of these capacity issues could be alleviated by making use of commercial operations offering communications services, with NASA exploring the possibility of using options like SpaceX’s Starlink network for low Earth orbit communications (though it’s not certain how long the government will maintain positive relations with Elon Musk’s SpaceX given his antisemitism). 

For deep space missions, however, government-run facilities are the only realistic option. NASA is building a lunar communication system called Lunar Exploration Ground Sites, or LEGS, consisting of 18-meter antennas for use in Moon missions. And the DSN is in the process of upgrading to six 34-meter antennas, with two at each of the three sites, although these upgrades are currently several years behind the initial schedule. 

There are also some clever ways of making the most of the current DSN system. With multiple Mars missions, for example, it’s possible to have three or four different rovers or orbiters transmitting data to a single antenna, because each mission operates on its own channel. There are also experiments within the DSN aimed at offloading tasks like GPS and timing, which could help reduce the overall load on the network.

In the longer term, though, higher demands are going to require a new approach to communications generally. On a tight budget, it’s not easy to find the money for this kind of experimentation. But it’s necessary for supporting the long-term operations of the agency. “It’s a challenge because you’re balancing your dollar needs of today against the potential dollar needs for the future,” Mitchell said.

Looking to the long term

To get more out of a communications system, you need to squeeze more bandwidth into a signal, and the way to do that is to operate at a higher frequency. That’s the idea of using laser communications, also known as optical communications, in place of radio waves. These transmit in the near-infrared portion of the electromagnetic spectrum, passing along data that is encoded into particles of light, called photons. That can increase the bandwidth available by an order of magnitude compared to using radio.
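
To make the frequency argument concrete, here is a minimal back-of-the-envelope sketch in Python. The ~32 GHz Ka-band figure and ~1550 nm laser wavelength are ballpark public values assumed for illustration, not specifications of the DSN upgrade.

```python
# Rough comparison of radio vs. optical carrier frequencies.
# Assumed ballpark values: ~32 GHz for a DSN Ka-band downlink,
# ~1550 nm for a near-infrared communications laser.

C = 299_792_458  # speed of light, m/s

ka_band_hz = 32e9                  # Ka-band radio carrier, ~32 GHz
laser_wavelength_m = 1550e-9       # near-infrared laser, ~1550 nm
laser_hz = C / laser_wavelength_m  # frequency = c / wavelength

print(f"optical carrier: {laser_hz / 1e12:.0f} THz")
print(f"ratio to Ka-band: {laser_hz / ka_band_hz:,.0f}x")
# ~193 THz, roughly 6,000x the Ka-band carrier. Even a sliver of that
# optical spectrum carries more usable bandwidth than the entire radio
# allocation, which is where the order-of-magnitude gain comes from.
```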

This would help ease the strain on the DSN to convey data for more missions. “Really the only way, the nearest-term capability that we have to meet this capacity that’s currently being fielded, is optical,” Mitchell said. “Even when you think about the peak scenarios where you have a high-priority mission set in the same part of the sky, it’s less of a burden because now you can get that same amount of data down in less time and still service all those missions.”

Laser communication has different hardware requirements from radio communication, though, so in the last decade, NASA has begun to develop demonstration systems that can test out this new capability. For the DSN, the ideal scenario is to upgrade the existing hardware to handle both radio and laser as required.

“We have these big 34-meter antennas for the DSN, and they already have all the capabilities to move and point. We’ve got really accurate pointing. So our goal is to add the capabilities of optical so we can do simultaneous RF [radio frequency] and optical comm,” Amy Smith, the deputy manager for the DSN at NASA’s Jet Propulsion Laboratory, said. 

The current DSN infrastructure already exists for functions like moving the antennas and routing data. That could make adapting existing hardware for optical considerably cheaper than building new facilities. “We think it’s maybe half the cost of building a standalone optical receiver,” Smith said.

Designing a hybrid system

Adapting an antenna designed for radio for optical use isn’t as simple as slapping a new box onto a big dish, though. The planned upgrades would work by adding a series of actuated glass mirrors to the center of an antenna, which can move to make the tiny accurate adjustments necessary for laser communications. 

These mirrors bounce the incoming light toward the top of the antenna, called the apex, and into a receiver, from which the signal can be routed through the pedestal of the dish and on to its destination.

The additional hardware for laser comms would eat up some of the dish area currently dedicated to radio comms, but only a small amount right in the center, so the impact on the antenna’s use for radio is minimal.

But the big advantage of this hybrid approach is that it allows the use of radio and laser comms simultaneously. “Having the capability of both gives you one pass, where you could be talking with RF on your uplink, and then getting a high-rate science downlink at the same time with just a single antenna,” Smith explained.

These upgrades would allow a significant leap in communications bandwidth for the network. Using optical over radio would allow around 60 times more data to be transferred from the distance of Mars. That would be a huge benefit for future crewed missions, which might want to send back video of astronauts at work, similar to the footage seen from the International Space Station, as well as for science missions with increasingly complex instruments.
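
To put that factor of 60 in rough perspective, the sketch below compares downlink times; the 4 Mbps radio rate and 100 GB volume are invented for illustration, and only the 60x ratio comes from the figure above.

```python
# Illustrative downlink times from Mars distance. The radio rate and
# data volume below are assumptions for illustration; the 60x optical
# improvement is the figure cited in the article.

RADIO_MBPS = 4.0                # assumed radio rate from Mars distance
OPTICAL_MBPS = RADIO_MBPS * 60  # optical at ~60x the radio rate

def downlink_hours(data_gb: float, rate_mbps: float) -> float:
    """Hours to transmit data_gb gigabytes at rate_mbps megabits per second."""
    megabits = data_gb * 8_000  # 1 GB = 8,000 megabits
    return megabits / rate_mbps / 3600

volume_gb = 100  # e.g., a backlog of video and instrument data
print(f"radio:   {downlink_hours(volume_gb, RADIO_MBPS):.1f} hours")
print(f"optical: {downlink_hours(volume_gb, OPTICAL_MBPS):.1f} hours")
# ~55.6 hours by radio shrinks to under an hour by optical.
```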

Testing out a new system in deep space

Laser communications are frequently used and well understood on Earth and in near-Earth scenarios — and theoretically, the systems should be perfectly capable of operating in deep space as well. But theory is one thing, and getting a fully reliable system up and in use is quite another.

For the past few years, NASA has been experimenting with small-scale demonstrations of optical communications technology on missions at the International Space Station and the Moon. This year, the agency launched its most long-range optical comms test yet: a demo called Deep Space Optical Communications, or DSOC, which is traveling along with the Psyche spacecraft, launched in October.

DSOC is essentially a small telescope attached to the spacecraft that can send and receive optical data. It will be turned on once per week and will transmit data as the spacecraft travels through the Solar System toward an asteroid in the main belt, testing whether the signal can be picked up at the Palomar Observatory in California.

On November 14th, DSOC was able to lock onto a laser signal transmitted from Earth and send data back along the downlink for the first time, while the spacecraft was located nearly 10 million miles from Earth. The testing will continue for the next two years, working at distances of up to 240 million miles and aiming for data transmission rates that are 10 to 100 times greater than radio can offer.

The biggest challenge at these distances is getting the spacecraft and receiver correctly aligned, which is technically known as pointing. While radio transmissions spread out over a large area, a laser beam stays far narrower, so it has to line up with the receiver much more accurately. Even slight wobbles of the spacecraft could send the beam miles off course, so each transmission requires a laser signal from the ground that the spacecraft transmitter can lock onto.
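
A small-angle calculation shows why those wobbles matter. The error values below are arbitrary examples, not DSOC’s actual pointing budget; the 10-million-mile range matches the first test described above.

```python
# For small angles, the beam's lateral offset at the receiver is roughly
# the pointing error (in radians) times the range. Error values here are
# arbitrary examples chosen for illustration.

def miss_miles(error_microrad: float, range_miles: float) -> float:
    """Small-angle approximation: offset ~= angle * range."""
    return error_microrad * 1e-6 * range_miles

RANGE_MILES = 10e6  # ~10 million miles, DSOC's distance at first light
for err in (0.5, 1.0, 5.0):  # pointing error in microradians
    print(f"{err:>3} microradian error -> "
          f"{miss_miles(err, RANGE_MILES):.0f} miles off target")
# Even half a microradian of wobble moves the beam 5 miles at the receiver.
```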

And because of the time that light takes to travel, pointing gets harder as the distance between the spacecraft and Earth increases: the spacecraft has to aim not at where Earth appears to be, but at where it will be when the light arrives. The DSOC team has modeled how to account for this increased point-ahead angle, but no one has tried using laser communications over these kinds of distances before.
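
As a rough sketch of that correction: for transverse relative velocity v, the point-ahead angle is approximately 2v/c, one factor of v/c for the uplink leg and one for the downlink. Earth’s ~30 km/s orbital speed is used below as an assumed stand-in for that velocity.

```python
# Point-ahead sketch: over the light's round trip, Earth moves, so the
# spacecraft must aim ahead of where it currently sees the receiver.
# For transverse relative velocity v, the angle is roughly 2 * v / c.
# Earth's orbital speed is an assumed stand-in for that velocity.

C = 299_792_458        # speed of light, m/s
v_transverse = 30_000  # m/s, ~Earth's orbital speed (assumed)

point_ahead_rad = 2 * v_transverse / C
print(f"point-ahead angle ~ {point_ahead_rad * 1e6:.0f} microradians")
# ~200 microradians -- many times the width of the laser beam itself,
# which is why the offset must be modeled rather than ignored, and
# recomputed as the geometry changes over the mission.
```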

“Every time you try to do something new, there’s things you haven’t anticipated or haven’t designed for,” Abi Biswas, project technologist for DSOC at NASA’s Jet Propulsion Laboratory, said. 

Photo by CHANDAN KHANNA/AFP via Getty Images

The challenges of optical communications

If the DSOC experiment goes perfectly, it will still likely be 10 to 15 years before optical communications will be ready for mainstream use. That’s because one of the biggest concerns for space missions is reliability, and radio has been reliably in use for decades. But even if optical can be used reliably, it still won’t be a complete replacement for radio.

Optical has its own drawbacks, despite its big advantage of higher bandwidths. The most immediate concern is cloud cover, as laser beams can’t penetrate layers of cloud in the Earth’s atmosphere. Receiving sites need to be located in regions with good weather, but even then, agencies would still want radio communications to be available as a backup.

“As this technology matures — going into the 50 to 100 year timespan, as we gain more experience deploying things in space — the best place to put the receiver would be in space,” Biswas said. That would put the receiver above the clouds and make it easier to receive data sent from deep space.

Another issue is that if there is any kind of problem with a spacecraft, it will often go into a safe mode in which it performs only essential functions to prevent any further errors or damage. During this time, the spacecraft can be tumbling as it loses its pointing functions. Radio communications can handle that situation because their broad beam transmits in many directions, allowing contact with the spacecraft even as it tumbles.

With optical, however, a very narrow beam is transmitted in just one direction. If a craft is tumbling, laser communications would be lost. This could be dealt with in the future using optical transmitters that are actuated to move and send signals in multiple directions, but this technology hasn’t been developed yet. 

These challenges are potentially solvable, but in the medium term, the most realistic outlook would be to continue using the tried and tested radio system for spacecraft telemetry but supplement this with a high-bandwidth optical system for science data.

If you build it, will they come?

There’s huge potential in optical communications, but the work is still in its early stages. Being able to transmit 10 or 100 times as much data as radio sounds ideal, but it’s impossible to know how reliable these systems will be until they’ve been tested in real-world scenarios for long periods of time.

“I’m the first to say — we don’t have a lot of long-term data yet,” Volosin, the NASA communications head, said at the meeting. “How do these laser systems look 5, 10, 15 years into a mission? Nobody’s collected that data. So we’re learning a lot. But for specific science missions, these could be game changers.”

The other consideration for the DSN in particular is that it is essentially a service network for spacecraft missions. It aims to provide communications services, but each future mission can choose whether or not to use optical.

“We do think that once this is proven and people see how much data you can get through an optical comm system, it’s going to be really popular,” Smith said. After all, it’s the most promising option for being able to transmit the large quantities of data that future missions will surely produce. “As all of the technology gets more sophisticated, we’re able to create just boatloads of data. And scientists will always want more data,” said Smith.

A long, gradual process of upgrading its communications network might not be the most headline-grabbing aspect of NASA’s work, but it’s the kind of investment that’s crucial to enable scientific discoveries and human exploration.

“That not flashy infrastructure part turns out to be critical,” Mitchell said. “Our ability to deliver data directly impacts the discoveries that we’re able to make and the knowledge that humanity is able to generate to understand the universe and our place in it.”





Jeff Bezos’ Blue Origin plans to launch a new crew capsule on Monday


Blue Origin is preparing to launch its NS-27 mission with the RSS Kármán Line, its new crew capsule, on Monday at 9AM ET. It will be the first launch for the capsule, which the company says in its announcement will have improved performance and reusability, along with “an updated livery, and accommodations for payloads on the booster.”

The flight will carry two LIDAR sensors into space that will be used for Blue Origin’s Lunar Permanence program to develop Moon landers. Those are among 12 payloads that also include ultra-wideband proximity operations sensors, a reproduction of the black monoliths from 2001: A Space Odyssey, and student postcards submitted to its Club for the Future nonprofit. Blue Origin will stream the launch on its website, starting 15 minutes before liftoff.

The NS-27 flight comes as Blue Origin works toward the goal of becoming a real SpaceX competitor. Company CEO Dave Limp, the former Amazon hardware boss who took over late last year, said in an interview with CNBC that the company needs to “be able to build things a lot” to become “a world class manufacturer.”

“We’d like to [be delivering] about an engine a week by the end of the year. I’m not sure we’ll get exactly to a week, but it’ll be sub-10 days … [and] by the end of 2025, we have to be faster than that,” Limp said.

Blue Origin plans to launch New Glenn, its big reusable booster that recently completed its first second-stage hot fire test, for the first time in November. Blue Origin says the rocket can deliver 45,000 kilograms (more than 99,000 pounds) into low Earth orbit, which CNBC notes is roughly double what SpaceX’s Falcon 9 can do. The company also hopes to land the booster on its first flight.




Big Tech has cozied up to nuclear energy



Tech giants are increasingly eyeing nuclear reactors to power their energy-hungry data centers. Amazon and Microsoft each inked major deals this year with nuclear power plants in the US. And both Microsoft and Google have shown interest in next-generation small modular reactors that are still in development.

New AI data centers need a lot of electricity, which has taken companies further away from their climate goals as their carbon emissions grow. Nuclear reactors could potentially solve both of those problems. As a result, Big Tech is breathing new life into America’s aging fleet of nuclear reactors while also throwing its weight behind emerging nuclear technologies that have yet to prove themselves.

“Certainly, the prospects for this industry are brighter today than they were five and 10 years ago,” says Mark Morey, senior adviser for electricity analysis at the US Department of Energy’s Energy Information Administration.

Much of America’s aging nuclear fleet came online in the 1970s and 1980s. But the industry has faced pushback following high-profile accidents like Three Mile Island and the Fukushima disaster in Japan. Nuclear power plants are also expensive to build and generally less flexible than gas plants that now make up the biggest chunk of the US electricity mix. Gas-fired power plants can more quickly ramp up and down with the ebb and flow of electricity demand.

Nuclear power plants typically provide steady “baseload” power. And that makes nuclear an attractive power source for data centers. Unlike manufacturing or other industries that operate during daytime business hours, data centers run around the clock.

“When people are sleeping and offices are shut and we’re not using as much [electricity], what matches nuclear energy very nicely with data centers is that they pretty much need power 24/7,” Morey says.

That consistency also sets nuclear apart from wind and solar power, which wane with the weather or time of day. Over the past five years or so, many tech companies have accelerated climate goals, pledging to reach net zero carbon dioxide emissions.

The added energy demand from new AI tools, however, has put those goals further out of reach in some cases. Microsoft, Google, and Amazon have all seen their greenhouse gas emissions climb in recent years. Getting electricity from nuclear reactors is one way companies can try to bring those carbon emissions down.

Microsoft signed an agreement to purchase power from shuttered Three Mile Island in September. “This agreement is a major milestone in Microsoft’s efforts to help decarbonize the grid in support of our commitment to become carbon negative,” Microsoft VP of energy Bobby Hollis said in a press release at the time.

The plan is to revive the plant by 2028, a feat that’s never been done before in the US. The plant “was prematurely shuttered due to poor economics” in 2019, according to Joe Dominguez, president and CEO of Constellation, the company that owns the plant. But the outlook for nuclear energy now is rosier than it has been for years as companies look for carbon pollution-free sources of electricity.

In March, Amazon Web Services purchased a data center campus powered by the adjacent Susquehanna Nuclear power plant in Pennsylvania. That $650 million deal secures electricity from the sixth largest nuclear facility in the US (out of 54 sites today).

Google is considering procuring nuclear energy for its data centers as part of its sustainability plans. “Obviously, the trajectory of AI investments has added to the scale of the task needed,” CEO Sundar Pichai said in an interview with Nikkei this week. “We are now looking at additional investments, be it solar, and evaluating technologies like small modular nuclear reactors, etc.”

He’s referring to next-generation reactors that are still in development and not expected to be ready to connect to the power grid until the 2030s at the earliest. The US Nuclear Regulatory Commission certified a design for an advanced small modular reactor for the first time last year. These advanced reactors are roughly one-tenth to one-quarter the size of their older predecessors; their size and modular design are supposed to make them easier and cheaper to build. They might also be more flexible than larger nuclear plants when it comes to adjusting how much electricity they produce to match changes in demand.

Bill Gates, for one, is all in on nuclear energy. He’s the founder and chair of TerraPower, a company developing small modular reactors. Last year, Microsoft put out a job listing for a principal program manager to lead the company’s nuclear energy strategy that would include small modular reactors.

“I’m a big believer that nuclear energy can help us solve the climate problem, which is very, very important,” Gates said in an interview with The Verge last month.

This week, the Department of Energy released a new report projecting that US nuclear capacity could triple by 2050. After flatlining for years, electricity demand is expected to rise in the US thanks to EVs, new data centers, crypto mining, and manufacturing facilities. That growing demand is changing the outlook for nuclear energy, according to the report. Just a couple years ago, utilities were shutting down nuclear reactors. Now, they’re extending reactors’ lifetimes to up to 80 years and planning to restart ones that have shuttered, it says.

“It is reasonable to think that the tech companies could catalyze a new wave of investment in nuclear, in the US and around the world. There has been plenty of talk about the idea in the industry,” Ed Crooks, senior vice president and thought leadership executive for the Americas at Wood Mackenzie, wrote in a blog post this week.

This doesn’t necessarily mean that it’s all smooth sailing ahead for nuclear energy in the US. New reactor designs and plans to reopen shuttered nuclear power plants are still subject to regulatory approval. Initiatives to build both old-school power plants and new designs have faced soaring costs and delays. Amazon already faces opposition to its nuclear energy plans in Pennsylvania over concerns that it could wind up driving up electricity costs for other consumers. And the nuclear energy industry still faces pushback over the impact of uranium mining on nearby communities and concerns about where to store radioactive waste.

“It’s an interesting time, challenging in many ways,” Morey says. “We’ll see what happens.”




Watch what it’s like to handle an overturned truck full of burning batteries



A truck full of lithium-ion batteries was knocked over near the Port of Los Angeles on September 26th, exploded, and was left to burn for days, interrupting traffic on highways and a bridge and shutting down port terminals. A local towing company, Pepe’s Towing Service, caught the explosion on camera and vlogged the incident for days until it was time to haul the remnants away.

Pepe’s Towing Service owner Josh Acosta uploaded a lengthy video today chronicling the moment of the explosion, the long wait as the Fire Department let the batteries burn, and the process of lifting the container full of burnt batteries for transport. In the video, we see what looks like stacks of batteries with liquid cooling pipes between each layer.

Image: Pepe’s Towing Service

In a phone call with The Verge, Acosta says the battery is one “giant container-sized battery” that “does not come apart.” He believes it could be used in buildings for backup power. According to Acosta, the battery weighed 60,000 pounds.

Acosta says he doesn’t remember which company owns the container that transported the battery — but his video blurs out text on the side of the container anyhow.

The video shows the painstaking logistics firefighters face when dealing with burning lithium-ion cells: they often need thousands of gallons of water to put such fires out, as with electric vehicle fires. And in this case, the Los Angeles Fire Department told The Verge that the fire kept going on and off.

Acosta told us he was called to the job by the customer who owns the overturned truck, and that’s why he caught the moment on camera. Now, Pepe’s Towing is hauling the remnants of the container for scrap recycling.


