AWS sent a Snowcone to space – TechCrunch

At its re:Mars conference, Amazon today announced that it quietly sent one of its AWS Snowcone edge computing and storage devices into space on the Axiom mission to the International Space Station.

For the most part, this was an off-the-shelf Snowcone, which AWS already built to be rugged enough to be shipped by UPS, though the company had to do months of testing to get it certified for this flight.

“When you think about providing cloud computing to the edge, in remote, disconnected, rugged environments — after 35 years in the space industry — there is no more harsh, remote or rugged environment or unforgiving, quite frankly, than the space environment,” said Clint Crosier, the director of Aerospace and Satellite at AWS and a retired United States Air Force major general who helped oversee the foundation of the US Space Force before he retired and then joined AWS last year. “With space a $425 billion global industry today that’s projected to be a $1 trillion industry by 2040 by all the major analysts — tripling the number of satellites that are launched between 2018 and 2022 — for all those reasons, customers are telling us that they need the same cloud computing capabilities close to their workloads that happen to be off the planet in space as they do on the ground.”

The AWS Snowcone SSD onboard the International Space Station during the Ax-1 mission, prior to installation. Image Credits: AWS

To certify the Snowcone, the smallest of the Snow family of edge computing and data transfer devices, AWS had to run it through five months of NASA's thermal, vacuum, acoustic and vibration testing (with no radiation testing needed because the device was going to be used in the shielded ISS environment). Once it arrived on the space station, the team, led by AWS's Daryl Shuck, connected it, uploaded an ML model for object detection to it and ran it throughout the Axiom mission.

The astronauts on the Axiom mission performed a total of 25 experiments — including the Snowcone experiment. As Crosier noted, they had to take pictures and document all of the equipment they brought on board and then transported back down with them. The object detection model on the Snowcone helped them catalog all of these items (and flag those that were to be excluded from public distribution).
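AWS hasn't published the code behind this workflow, but the basic idea — turning raw detections from an onboard model into an equipment catalog, with certain items flagged for exclusion from public release — can be sketched in a few lines. Everything here (the function names, the confidence threshold, the restricted-item list) is illustrative, not AWS's actual implementation.

```python
# Hypothetical sketch of cataloging equipment from object-detection output.
# The labels and the restricted-item set below are invented for illustration.

RESTRICTED = {"experiment-payload-7"}  # labels to withhold from public distribution


def catalog(detections, threshold=0.5):
    """Aggregate raw (label, confidence) detections into a catalog.

    detections: list of (label, confidence) tuples from the model.
    Returns a dict mapping label -> {"count": n, "public": bool}.
    """
    items = {}
    for label, confidence in detections:
        if confidence < threshold:  # drop low-confidence detections
            continue
        entry = items.setdefault(
            label, {"count": 0, "public": label not in RESTRICTED}
        )
        entry["count"] += 1
    return items


# Example: detections collected across a few frames
frames = [
    ("camera", 0.91),
    ("camera", 0.88),
    ("experiment-payload-7", 0.97),
    ("cable", 0.42),  # below threshold, ignored
]
print(catalog(frames))
```

The low-confidence filter matters on an edge device like this: there is no human in the loop to discard spurious detections, so the model's own confidence score is the only gate before an item lands in the catalog.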

Crosier admitted that this was a relatively simple demonstration, but going through the certification process taught the company a lot and also set the stage for future missions. “Simple as the demo that we did in orbit was, through the whole process, as we think about the future requirements for cloud computing in space, that’s what we’re really excited about, because we think it ushers in a wholesale new era of space innovation — when you can now, for the first time ever, bring edge computing capabilities onto orbit,” he said.

And that’s what this is really about. The goal here isn’t so much taking existing Snowcones or their larger brethren into space as taking what the teams learn from these missions (Amazon is already working with Axiom on future missions) and perhaps integrating more sophisticated edge computing capabilities into satellites, too. What exactly that’ll look like remains to be seen. As any Amazon exec who has gone through the company’s media training will tell you in every interview, the company listens to its customers and works from there.

“We work with our customers to meet their needs,” Crosier said. “That’s one of the hallmarks at AWS and one of the things I’ve learned since joining them after 33 years in the US military. And so if customers see the value and need for putting [edge] computing capabilities on satellites, you can rightly expect that we’re listening to that and we’re figuring out how we can meet their needs.”

Already, Amazon and AWS are working with Blue Origin to provide the computing capabilities for Blue Origin’s commercial Orbital Reef space station.
