Federal News Network: NASA JPL building models of its petabytes of data with artificial intelligence

“NASA Jet Propulsion Laboratory is capturing more data than ever in its history. That has Program Manager and Principal Computer Scientist Daniel Crichton excited about his mission: Using data to understand Earth, the solar system and beyond.

Artificial intelligence and machine learning are critical to automating the analysis of, and extracting insight from, the hundreds of petabytes of data generated by JPL. From this planet to the Mars rover and observatories touring outer space, machine learning and AI present immense opportunities, he said. In many cases, better tools are to credit, or blame, for more data…”

“‘One of the big challenges we have in our world, in science, is trying to build a representative training set that really can capture the totality of what we want to be able to discover in our data,’ Crichton said on Federal Drive with Tom Temin. ‘We may discover anomalies in the data and identify new features that we want to be able to look for, and go back and update our training sets and improve our models. And so, it’s really an iterative process of trying to actually train our models, discover new things, and reclassify the kinds of information that we’ve actually even seen in the past.’
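To make that loop concrete, here is a minimal sketch in Python with scikit-learn, on synthetic data, of the retrain-discover-reclassify cycle Crichton describes. The confidence threshold and the "relabel" step are illustrative assumptions, not details of JPL's pipeline: low-confidence predictions stand in for the anomalies a scientist would review.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a labeled science archive plus a stream of new data.
X, y = make_classification(n_samples=2000, n_informative=5, random_state=0)
X_train, X_new, y_train, y_new = train_test_split(X, y, test_size=0.5, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

for round_num in range(3):
    # Score incoming data; low-confidence predictions stand in for the
    # "anomalies" that would merit scientist review.
    confidence = model.predict_proba(X_new).max(axis=1)
    review = confidence < 0.7  # assumed review threshold, not a JPL value

    # In practice a scientist would inspect and relabel the flagged items;
    # here the held-back true labels play that role.
    X_train = np.vstack([X_train, X_new[review]])
    y_train = np.concatenate([y_train, y_new[review]])
    X_new, y_new = X_new[~review], y_new[~review]

    # Retrain on the expanded training set and reclassify what remains.
    model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
    print(f"round {round_num}: reviewed {review.sum()} items; "
          f"training set is now {len(X_train)} examples")
```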

To do that kind of iteration, Crichton said, it is important to have sufficient metadata, which helps different scientists work together to solve a problem. To this end, JPL works to set worldwide standards for planetary missions and to develop standard metadata structures…”
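For a sense of what standardized metadata looks like, here is a simplified, hypothetical record loosely inspired by the labels NASA's Planetary Data System attaches to mission products. The field names and values below are illustrative assumptions, not the normative schema, but they show how a common structure lets scientists from different missions interpret the same product.

```python
import json

# A hypothetical metadata label for one science product. Every identifier,
# path, and value here is made up for illustration.
label = {
    "product_id": "urn:example:mars2020:img:sol0042_0001",
    "mission": "Mars 2020",
    "instrument": "Mastcam-Z",
    "target": "Mars",
    "start_date_time": "2021-04-02T13:07:11Z",
    "processing_level": "Calibrated",
    "file": {
        # The metadata points to the physical data wherever it lives,
        # including cloud object storage.
        "uri": "s3://example-bucket/mars2020/sol0042_0001.img",
        "sha256": "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
        "size_bytes": 104857600,
    },
}
print(json.dumps(label, indent=2))
```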

“Then there is the matter of housing the data and keeping it available. Physical data can reside in cloud infrastructures such as Amazon Web Services, Google Cloud and Microsoft Azure, and the metadata can point to those infrastructures. As JPL reaches the petabyte level of data, Crichton said, the organization needs a way to scale up storage, which, along with complexity, is a major challenge for missions.

‘This is going to become the permanent record of what we’ve learned from our missions,’ he said. ‘And so it’s very important that we treat that as a long-term archive, that we put in good practices of how we actually do quality checking of that data, that we look at ways in which we can ensure the integrity of it long term – and that we really treat it as the golden assets of our space age.’…” Read the full article here.
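The quality and integrity checking Crichton mentions often comes down to fixity checks: recomputing a file's checksum and comparing it against the value recorded in the archive's metadata, such as the sha256 field in the label sketch above. A minimal sketch in Python follows; the paths and recorded hashes would come from an archive manifest, which is assumed here.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in 1 MiB chunks so even very large archive
    products never need to fit in memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path: Path, recorded_hash: str) -> bool:
    """Compare the file on disk against the hash recorded in the
    archive metadata when the product was ingested."""
    return sha256_of(path) == recorded_hash
```

Run periodically against the whole manifest, a check like this is one simple way a long-term archive can demonstrate that its "golden assets" remain bit-for-bit intact.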

Source: “NASA JPL building models of its petabytes of data with artificial intelligence,” by Amelia Brust, Federal News Network, May 11, 2021.



