How to Contribute to Citizen Science with NASA

A cell phone or a computer—and your curiosity—is all you need to become a NASA citizen scientist and contribute to projects about Earth, the solar system, and beyond.

Science is built from small grains of sand, and you can contribute yours from any corner of the world.

All you need is a cell phone or a computer with an internet connection to begin a scientific adventure. Can you imagine making a pioneering discovery in the cosmos? Want to help solve problems that could improve life on our planet? Or maybe you dream of helping solve an ancient mystery of the universe? All of this is possible through NASA’s Citizen Science program.

NASA defines citizen science, or participatory science, as “science projects that rely on volunteers,” said Dr. Marc Kuchner, an astrophysicist and the Citizen Science Officer in the agency’s Science Mission Directorate in Washington, D.C.

For decades, volunteers have been supporting NASA researchers in different fields and in a variety of ways, depending on the project. They help by taking measurements, sorting data from NASA missions, and deepening our understanding of the universe and our home planet. It all counts.

“That’s science for you: It’s collaborative,” said Kuchner, who oversees the more than 30 citizen science projects NASA offers. “I connect the public and scientists to get more NASA science done.”

Citizen scientists can come from anywhere in the world—they do not have to be U.S. citizens or residents. Volunteers help NASA look for planets in other solar systems, called exoplanets; sort clouds in Earth’s sky; observe solar eclipses; or detect comets and asteroids. Some of those space rocks are even named after the volunteers who helped find them.

Mass participation is key in initiatives that require as many human eyes as possible. “There are science projects that you can’t do without the help of a big team,” Kuchner said. For example, projects that need large datasets from space telescopes—or “things that are physically big and you need people in different places looking from different angles,” he said.

One example is Aurorasaurus, which invites people to observe and classify northern and southern auroras. “We try to study them with satellites, but it really helps to have people on the ground taking photos from different places at different times,” he explained.

“Part of the way we serve our country and humankind is by sharing not just the pretty pictures from our satellites, but the entire experience of doing science,” Kuchner said.

More than 3 million people have participated in the program. Kuchner believes that shows how much people want to be part of what he calls the “roller coaster” of science. “They want to go on that adventure with us, and we are thrilled to have them.”

“You can help scientists who are now at NASA and other organizations around the world to discover interesting things,” said Faber Burgos, a citizen scientist and science communicator from Colombia. “Truth be told, I’ve always dreamed of making history.”

Burgos has been involved in two projects for the past four years: the International Astronomical Search Collaboration (IASC), which searches the sky for potentially dangerous asteroids, and Backyard Worlds: Planet 9. This project uses data from NASA’s now-completed Wide-field Infrared Survey Explorer (WISE) and its follow-up mission, NEOWISE, to search for brown dwarfs and a hypothetical ninth planet.

“There are really amazing participants in this project,” said Kuchner, who helped launch it in 2017. NASA’s WISE and NEOWISE missions detected about 2 billion sources in the sky. “So, the question is: Among those many sources, are any of them new unknowns?” he said.

The project has already found more than 4,000 brown dwarfs. These are Jupiter-sized objects—balls of gas that are too big to be planets, but too small to be stars. Volunteers have even helped discover a new type of brown dwarf.

Participants in the project are also hopeful they’ll find a hypothetical ninth planet, possibly Neptune-sized, in an orbit far beyond Pluto.

Burgos explained that analyzing the images is easy. “If it’s a moving object, it’s obviously going to be something of interest,” he said. “Usually, when you see these images, everything is still. But if there’s an object moving, you have to keep an eye on it.”

Once a citizen scientist marks the object across the full image sequence, they send the information to NASA scientists to evaluate.
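
The underlying idea is simple enough to sketch in a few lines of code. The toy example below (illustrative only, not the project’s actual pipeline) flags pixels that vary strongly across a stack of co-aligned frames, the signature of something moving or flaring between exposures:

```python
import numpy as np

def flag_movers(frames: np.ndarray, threshold: float = 5.0) -> np.ndarray:
    """Flag pixels whose brightness changes across a co-aligned image sequence.

    frames: array of shape (n_frames, height, width), already aligned on the sky.
    Returns a boolean mask of pixels that vary far more than the typical noise,
    which is what a source moving between exposures looks like.
    """
    variation = frames.std(axis=0)        # per-pixel scatter across the sequence
    noise = np.median(variation)          # robust estimate of the background noise
    return variation > threshold * noise  # True where something moved or flared

# Example: a faint source stepping one pixel per frame stands out immediately.
rng = np.random.default_rng(0)
frames = rng.normal(0.0, 1.0, size=(4, 64, 64))
for i in range(4):
    frames[i, 32, 20 + i] += 25.0         # the "mover"
print(np.argwhere(flag_movers(frames)))   # prints the four positions it visited
```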

“As a citizen scientist, I’m happy to do my bit and, hopefully, one day discover something very interesting,” he said. “That’s the beauty of NASA—it invites everyone to be a scientist. Here, it doesn’t matter what you are, but your desire to learn.”

To become a NASA citizen scientist, start by visiting the program’s website. There you’ll find a complete list of available projects with links to their respective sites. Some are available in Spanish and other languages. Many projects are also hosted on Zooniverse, one of the largest online platforms for citizen science.

“Another cool way to get involved is to come to one of our live events,” said Kuchner. These are virtual events open to the public, where NASA scientists present their projects and invite people to participate. “Pick a project you like—and if it’s not fun, pick a different one,” he advised. “There are wonderful relationships to be had if you reach out to scientists and other participants.”

People of all ages can be citizen scientists. Some projects are kid-friendly, such as NeMO-Net, an iPad game that invites participants to color coral reefs to help classify them. “I’d like to encourage young people to start there—or try a project with one of the older people in their life,” Kuchner said.

Citizen science can also take place in classrooms. In the Growing Beyond Earth project, teachers and students run experiments on how to grow plants in space for future missions. The IASC project also works with high schools to help students detect asteroids.

GLOBE Observer is another initiative with an international network of teachers and students. The platform offers a range of projects—many in Spanish—that invite people to collect data using their cell phones.

One of the most popular is the GLOBE Mosquito Habitat Mapper, which tracks the migration and spread of mosquitoes that carry diseases. “It’s a way to help save lives—tracking the vectors that transmit malaria and Zika, among others,” Kuchner said.

Other GLOBE projects explore everything from ground cover to cloud types. Some use astronomical phenomena visible to everyone. For example, during the 2024 total solar eclipse, participants measured air temperature using their phones and shared that data with NASA scientists.

No formal scientific training is needed, and many volunteers go on to collaborate on—or even lead—scientific research. More than 500 NASA citizen scientists have co-authored scientific publications.

One of them is Hugo Durantini Luca, from Córdoba, Argentina, who has co-authored 17 published articles, with more on the way. For years, he explored various science projects, looking for one where he could contribute more actively.

He participated in NASA’s first citizen science project, Stardust@home, which invites users to search for interstellar dust particles in collectors from the Stardust mission, using a virtual microscope.

In 2014, he discovered Disk Detective, a project that searches for disks around stars, where planets may form. By looking at images from the WISE and NEOWISE missions, participants can help understand how worlds are born and how solar systems evolve.

“And, incidentally, if we find planets or some sign of life, all the better,” said Durantini Luca.

Although that remains a dream, they have made other discoveries—like a new kind of stellar disk called the “Peter Pan Disk,” which appears young even though the star it surrounds is not.

In 2016, Durantini Luca got the chance to support Disk Detective with his own observations from the southern hemisphere. He traveled to El Leoncito Astronomical Complex (CASLEO), an observatory in San Juan, Argentina. There, he learned to use a spectrograph—an instrument that breaks down starlight to analyze its composition.

He treasures that experience. “Curiously, it was the first time in my life I used a telescope,” he said.

While in-person opportunities are rare, both virtual and physical events help build community. Citizen scientists stay in touch weekly through various channels.

“Several of us are friends already—after so many years of bad jokes on calls,” said Durantini Luca.

“People send me pictures of how they met,” said Kuchner. He said the program has even changed how he does science. “It’s changed my life,” he said. “Science is already cool—and this makes it even cooler.”

Earth Science Showcase – Kids Art Collection

On April 16, 2025, the Earth Science Division at NASA’s Ames Research Center in Silicon Valley held an Earth Science Showcase to share its work with the center and their families. As part of this event, kids were invited to share something they like about the Earth. These are their masterpieces.

Sora U. Age 9. “Wildlife”

Wesley P. Age 2.5. “Pale Blue”

Kira U. Age 5. “Hawaii”

Anonymous. “eARTh”

Brooks P. Age 8 mo. “Squiggles”

NASA Tracks Snowmelt to Improve Water Management

As part of a science mission tracking one of Earth’s most precious resources – water – NASA’s C-20A aircraft conducted a series of seven research flights in March that can help researchers track the process and timeline as snow melts and transforms into a freshwater resource. The agency’s Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR), installed on the aircraft, collected measurements of seasonal snow cover to estimate the freshwater contained in it.

“Seasonal snow is a critical resource for drinking water, power generation, supporting multi-billion dollar agricultural and recreation industries,” said Starr Ginn, C-20A project manager at NASA’s Armstrong Flight Research Center in Edwards, California. “Consequently, understanding the distribution of seasonal snow storage and subsequent runoff is essential.”

The Dense UAVSAR Snow Time (DUST) mission mapped snow accumulation over the Sierra Nevada mountains in California and the Rocky Mountains in Idaho. Mission scientists can use these observations to estimate the amount of water stored in that snow.

“Until recently, defining the best method for accurately measuring snow water equivalent (SWE) – or how much and when fresh water is converted from snow – has been a challenge,” said Shadi Oveisgharan, principal investigator of DUST and scientist at NASA’s Jet Propulsion Laboratory in Southern California. “The UAVSAR has been shown to be a good instrument to retrieve SWE data.”
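
The relation behind SWE itself is straightforward. The sketch below uses the standard bulk formula, snow depth scaled by the ratio of snow density to water density, with illustrative numbers rather than DUST mission data:

```python
def swe_mm(depth_m: float, snow_density_kg_m3: float) -> float:
    """Snow water equivalent in millimeters of liquid water.

    Standard bulk relation: SWE = depth * (rho_snow / rho_water).
    Example values below are illustrative, not DUST retrievals.
    """
    RHO_WATER = 1000.0  # kg/m^3
    return depth_m * (snow_density_kg_m3 / RHO_WATER) * 1000.0  # meters -> mm

# A 2 m snowpack at a typical settled density of 300 kg/m^3
# stores roughly 600 mm of liquid water per unit area.
print(swe_mm(2.0, 300.0))  # 600.0
```
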
Recent research has shown that snow properties, weather patterns, and seasonal conditions in the American West have been shifting in recent decades. These changes have fundamentally altered previous expectations about snowpack monitoring and forecasts of snow runoff. The DUST mission aims to better track and understand those changes to develop more accurate estimates of snow-to-water conversions and their timelines.

“We are trying to find the optimum window during which to retrieve snow data,” Oveisgharan said. “This estimation will help us better estimate available fresh snow and manage our reservoirs better.”

The DUST mission achieved a new level of snow data accuracy, which is partly due to the specialized flight paths flown by the C-20A. The aircraft’s Platform Precision Autopilot (PPA) enables the team to fly very specific routes at exact altitudes, speeds, and angles so the UAVSAR can more precisely measure terrain changes.

“Imagine the rows made on grass by a lawn mower,” said Joe Piotrowski Jr., operations engineer for NASA Armstrong’s airborne science program. “The PPA system enables the C-20A to make those paths while measuring terrain changes down to the diameter of a centimeter.”

2025 EGU Hyperwall Schedule

EGU General Assembly, April 27 – May 2, 2025

Join NASA in the Exhibit Hall (Booth #204) for Hyperwall Storytelling by NASA experts. Full Hyperwall Agenda below.

MONDAY, APRIL 28

10:15 – 10:30 AM —— PACE —— Ivona Cetinic

3:45 – 4:00 PM —— Science Explorer (SciX): Accelerating the Discovery of NASA Science —— Mike Kurtz

4:00 – 4:15 PM —— Juno’s Extended Vision in its Extended Mission —— Glenn Orton

6:05 – 6:20 PM —— Getting the Big Picture with Global Precipitation —— George Huffman

6:20 – 6:35 PM —— Exploring Europa with Europa Clipper —— Jonathan Lunine

TUESDAY, APRIL 29

10:15 – 10:30 AM —— Science Explorer (SciX): Accelerating the Discovery of NASA Science —— Jennifer Lynn Bartlett

10:30 – 10:45 AM —— From ESTO to PACE, A CubeSat’s Journey to Space —— Brent McBride

12:30 – 2:00 PM —— Ask Me Anything with NASA Scientists —— Informal Office Hours

3:45 – 4:00 PM —— Exoplanets (Virtual) —— Jonathan H. Jiang

4:00 – 4:15 PM —— Scattering of Realistic Hydrometeors for Precipitation Remote Sensing —— Kwo-Sen Kuo

6:05 – 6:20 PM —— Space Weather Center of Excellence CLEAR: All-CLEAR SEP Forecast —— Lulu Zhao

WEDNESDAY, APRIL 30

10:15 – 10:30 AM —— SPEXone on PACE: First year in Orbit —— Otto Hasekamp

12:30 – 2:00 PM —— Ask Me Anything with NASA Scientists —— Informal Office Hours

3:45 – 4:00 PM —— Science Explorer (SciX): Accelerating the Discovery of NASA Science —— Jennifer Lynn Bartlett

4:00 – 4:15 PM —— Scattering of Realistic Hydrometeors for Precipitation Remote Sensing —— Kwo-Sen Kuo

6:05 – 6:20 PM —— Ship Tracks Tell the Story of Climate Forcing by Aerosols through Clouds —— Tianle Yuan

6:20 – 6:35 PM —— The Excitement of Mars Exploration —— Jonathan Lunine

6:35 – 6:50 PM —— Using NASA Earth Observations for Disaster Response —— Kristen Okorn

THURSDAY, MAY 1

10:15 – 10:30 AM —— Getting the Big Picture with Global Precipitation —— George Huffman

3:45 – 4:00 PM —— PACE —— Morgaine McKibben

4:00 – 4:15 PM —— Using AI to Model Global Clouds Better Than Current GCRMs —— Tianle Yuan

6:05 – 6:20 PM —— Science Explorer (SciX): Accelerating the Discovery of NASA Science —— Mike Kurtz

NASA Airborne Sensor’s Wildfire Data Helps Firefighters Take Action

Data from the AVIRIS-3 sensor was recently used to create detailed fire maps in minutes, enabling firefighters in Alabama to limit the spread of wildfires and save buildings.

A NASA sensor recently brought a new approach to battling wildfire, providing real-time data that helped firefighters in the field contain a blaze in Alabama. Called AVIRIS-3, which is short for Airborne Visible Infrared Imaging Spectrometer 3, the instrument detected a 120-acre fire on March 19 that had not yet been reported to officials.

As AVIRIS-3 flew aboard a King Air B200 research plane over the fire about 3 miles (5 kilometers) east of Castleberry, Alabama, a scientist on the plane analyzed the data in real time and identified where the blaze was burning most intensely. The information was then sent via satellite internet to fire officials and researchers on the ground, who distributed images showing the fire’s perimeter to firefighters’ phones in the field.

All told, the process from detection during the flyover to alert on handheld devices took a few minutes. In addition to pinpointing the location and extent of the fire, the data showed firefighters its perimeter, helping them gauge whether it was likely to spread and decide where to add personnel and equipment.

“This is very agile science,” said Robert Green, the AVIRIS program’s principal investigator and a senior research scientist at NASA’s Jet Propulsion Laboratory in Southern California, noting AVIRIS-3 mapped the burn scar left near JPL by the Eaton Fire in January.

Observing the ground from about 9,000 feet (3,000 meters) in altitude, AVIRIS-3 flew on several test flights over Alabama, Mississippi, Florida, and Texas as part of NASA’s 2025 FireSense Airborne Campaign. Researchers flew in the second half of March to prepare for prescribed burn experiments that took place in the Geneva State Forest in Alabama on March 28 and at Fort Stewart-Hunter Army Airfield in Georgia from April 14 to 20. During those March flights, the AVIRIS-3 team mapped at least 13 wildfires and prescribed burns, as well as dozens of small hot spots (places where heat is especially intense) — all in real time.

Data from imaging spectrometers like AVIRIS-3 typically takes days or weeks to be processed into highly detailed, multilayer image products used for research. By simplifying the calibration algorithms, researchers were able to process data on a computer aboard the plane in a fraction of the time it otherwise would have taken. Airborne satellite internet connectivity enabled the images to be distributed almost immediately, while the plane was still in flight, rather than after it landed.
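
To make the shortcut concrete, here is a minimal sketch of single-pass radiometric calibration, raw digital numbers converted to approximate radiance with one precomputed gain and offset per band. It is illustrative only; the actual AVIRIS-3 onboard pipeline and its coefficients are not detailed here, and a full science product would add further corrections:

```python
import numpy as np

def quicklook_radiance(raw_dn: np.ndarray, gain: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Convert raw digital numbers to approximate radiance in one vectorized pass.

    raw_dn: (lines, samples, bands) cube straight off the sensor.
    gain, offset: precomputed per-band coefficients from lab calibration.
    Skipping the slower corrections (stray light, spectral smile, etc.)
    is what makes processing feasible on a computer aboard the plane.
    """
    return raw_dn * gain + offset  # broadcasts over the band axis

# Illustrative shapes only: 512 x 640 pixels, 284 spectral bands.
cube = np.random.randint(0, 4096, size=(512, 640, 284)).astype(np.float32)
gain = np.full(284, 0.01, dtype=np.float32)
offset = np.zeros(284, dtype=np.float32)
radiance = quicklook_radiance(cube, gain, offset)
```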

The AVIRIS team generated its first real-time products during a February campaign covering parts of Panama and Costa Rica, and they have continued to improve the process, automating the mapping steps aboard the plane.

‘Fan Favorite’

The AVIRIS-3 sensor belongs to a line of imaging spectrometers built at JPL since 1986. The instruments have been used to study a wide range of phenomena — including fire — by measuring sunlight reflecting from the planet’s surface.

During the March flights, researchers created three types of maps. One, called the Fire Quicklook, combines brightness measurements at three wavelengths of infrared light, which is invisible to the human eye, to identify the relative intensity of burning. Orange and red areas on the Fire Quicklook map show cooler-burning areas, while yellow indicates the most intense flames. Previously burned areas show up as dark red or brown.

Another map type, the Fire 2400 nm Quicklook, looks solely at infrared light at a wavelength of 2,400 nanometers. The images are particularly useful for seeing hot spots and the perimeters of fires, which stand out brightly against a red background.

A third type of map, called just Quicklook, shows burned areas and smoke.
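
The general recipe behind such products can be sketched briefly. In the example below (the band indices and stretch are assumptions for illustration, not the mission’s actual Quicklook algorithm), three infrared band slices are contrast-stretched and stacked as RGB channels so the hottest pixels render as vivid colors:

```python
import numpy as np

def ir_composite(cube: np.ndarray, band_indices: tuple[int, int, int]) -> np.ndarray:
    """Build a false-color composite from three infrared band slices.

    cube: (lines, samples, bands) radiance cube.
    Returns a (lines, samples, 3) array scaled to [0, 1] per channel.
    """
    channels = []
    for b in band_indices:
        band = cube[:, :, b].astype(np.float32)
        lo, hi = np.percentile(band, [2, 98])  # robust contrast stretch
        channels.append(np.clip((band - lo) / (hi - lo + 1e-9), 0.0, 1.0))
    return np.stack(channels, axis=-1)

# Hypothetical indices for bands in the shortwave infrared, where flames
# are brightest; reuses the `radiance` cube from the earlier sketch.
# rgb = ir_composite(radiance, (250, 265, 280))
```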

The Fire 2400 nm Quicklook was the “fan favorite” among the fire crews, said Ethan Barrett, fire analyst for the Forest Protection Division of the Alabama Forestry Commission. Seeing the outline of a wildfire from above helped Alabama Forestry Commission firefighters determine where to send bulldozers to stop the spread. 

Additionally, FireSense personnel analyzed the AVIRIS-3 imagery to create digitized perimeters of the fires. This provided firefighters with fast, comprehensive intelligence about the situation on the ground.

That’s what happened with the Castleberry Fire. Having a clear picture of where it was burning most intensely enabled firefighters to focus on where they could make a difference — on the northeastern edge. 

Then, two days after identifying Castleberry Fire hot spots, the sensor spotted a fire about 2.5 miles (4 kilometers) southwest of Perdido, Alabama. As forestry officials worked to prevent flames from reaching six nearby buildings, they noticed that the fire’s main hot spot was inside the perimeter and contained. With that intelligence, they decided to shift some resources to fires 25 miles (40 kilometers) away near Mount Vernon, Alabama.

To combat one of the Mount Vernon fires, crews used AVIRIS-3 maps to determine where to establish fire breaks beyond the northwestern end of the fire. They ultimately cut the blaze off within about 100 feet (30 meters) of four buildings. 

“Fire moves a lot faster than a bulldozer, so we have to try to get around it before it overtakes us. These maps show us the hot spots,” Barrett said. “When I get out of the truck, I can say, ‘OK, here’s the perimeter.’ That puts me light-years ahead.”

AVIRIS and the FireSense Airborne Campaign are part of NASA’s work to leverage its expertise to combat wildfires using solutions including airborne technologies. The agency also recently demonstrated a prototype from its Advanced Capabilities for Emergency Response Operations project that will provide reliable airspace management for drones and other aircraft operating in the air above wildfires.

News Media Contacts

Andrew Wang / Jane J. Lee
Jet Propulsion Laboratory, Pasadena, Calif.
626-379-6874 / 818-354-0307
andrew.wang@jpl.nasa.gov / jane.j.lee@jpl.nasa.gov

2025-058

Entrepreneurs Challenge Winner PRISM is Using AI to Enable Insights from Geospatial Data

NASA’s Science Mission Directorate (SMD) sponsored Entrepreneurs Challenge events in 2020, 2021, and 2023 to invite small business start-ups to showcase innovative ideas and technologies with the potential to advance the agency’s science goals. To help leverage external funding sources for the development of innovative technologies of interest to NASA, SMD involved the venture capital community in Entrepreneurs Challenge events. Challenge winners were awarded prize money; in 2023, the total Entrepreneurs Challenge prize value was $1M. Numerous challenge winners have subsequently refined their products and/or received funding from NASA and external sources (e.g., other government agencies or the venture capital community) to further develop their technologies.

One 2023 Entrepreneurs Challenge winner, PRISM Intelligence (formerly known as Pegasus Intelligence and Space), is using artificial intelligence (AI) and other advances in computer vision to create a new platform that could provide geospatial insights to a broad community.

Every day, vast amounts of remote sensing data are collected through satellites, drones, and aerial imagery, but for most businesses and individuals, accessing and extracting meaningful insights from this data is nearly impossible.  

The company’s product—Personal Real-time Insight from Spatial Maps, a.k.a. PRISM—is transforming geospatial data into an easy-to-navigate, queryable world. By leveraging 3D computer vision, geospatial analytics, and AI-driven insights, PRISM creates photorealistic, up-to-date digital environments that anyone can interact with. Users can simply log in and ask natural-language questions to instantly retrieve insights—no advanced Geographic Information System (GIS) expertise is required.

For example, a pool cleaner looking for business could use PRISM to search for all residential pools in a five-mile radius. A gardener could identify overgrown trees in a community. City officials could search for potholes in their jurisdiction to prioritize repairs, enhance public safety, and mitigate liability risks. This broad level of accessibility brings geospatial intelligence out of the hands of a few and into everyday decision making.

The core of PRISM’s platform uses radiance fields to convert raw 2D imagery into high-fidelity, dynamic 3D visualizations. These models are then enhanced with AI-powered segmentation, which autonomously identifies and labels objects in the environment—such as roads, vehicles, buildings, and natural features—allowing for seamless search and analysis. The integration of machine learning enables PRISM to refine its reconstructions continuously, improving precision with each dataset. This advanced processing ensures that the platform remains scalable, efficient, and adaptable to various data sources, making it possible to produce large-scale, real-time digital twins of the physical world.
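
As a rough illustration of what a queryable layer of labeled objects could look like (a hypothetical data model; PRISM’s actual schema and API are not described in this article), the article’s “all residential pools within five miles” example reduces to a filter over segmentation-labeled records:

```python
from dataclasses import dataclass
from math import asin, cos, radians, sin, sqrt

@dataclass
class LabeledObject:
    label: str   # e.g., "pool", "tree", "pothole" assigned by AI segmentation
    lat: float
    lon: float

def haversine_miles(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance between two points, in miles."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 3958.8 * 2 * asin(sqrt(a))  # mean Earth radius in miles

def find_nearby(objects, label, lat, lon, radius_miles=5.0):
    """The 'all pools within five miles' query, expressed as a simple filter."""
    return [o for o in objects
            if o.label == label
            and haversine_miles(lat, lon, o.lat, o.lon) <= radius_miles]

# Toy catalog standing in for a segmented digital twin.
catalog = [LabeledObject("pool", 34.05, -118.25), LabeledObject("pothole", 34.06, -118.24)]
print(find_nearby(catalog, "pool", 34.05, -118.25))
```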

“It’s great being able to push the state of the art in this relatively new domain of radiance fields, evolving it from research to applications that can impact common tasks. From large sets of images, PRISM creates detailed 3D captures that embed more information than the source pictures.” — Maximum Wilder-Smith, Chief Technology Officer, PRISM Intelligence

Currently, the PRISM platform uses proprietary data gathered from aerial imagery over selected areas, from which it generates high-resolution digital twins of cities in those regions. The team aims to eventually expand the platform to use NASA Earth science data and commercial data, which will enable high-resolution data capture over larger areas and significantly increase efficiency, coverage, and update frequency. PRISM aims to combine the detailed multiband imagery that NASA provides with the high-frequency data that commercial companies provide to make geospatial intelligence more accessible, offering fast, reliable, and up-to-date insights that can be used across multiple industries.

What sets PRISM apart is its focus on usability. While traditional GIS platforms require specialized training to use, PRISM eliminates these barriers by allowing users to interact with geospatial data through a frictionless, conversational interface.

The impact of this technology could extend across multiple industries. Professionals in the insurance and appraisal industries have informed the company how the ability to generate precise, 3D assessments of properties could streamline risk evaluations, reduce costs, and improve accuracy—replacing outdated or manual site visits. Similarly, local governments have indicated they could potentially use PRISM to better manage infrastructure, track zoning compliance, and allocate resources based on real-time, high-resolution urban insights. Additionally, scientists could use the consistent updates and layers of three-dimensional data that PRISM can provide to better understand changes to ecosystems and vegetation.

As PRISM moves forward, the team’s focus remains on scaling its capabilities and expanding its applications. Currently, the team is working to enhance the technical performance of the platform while also adding data sources to enable coverage of more regions. Future iterations will further improve automation of data processing, increasing the speed and efficiency of real-time 3D reconstructions. The team’s goal is to expand access to geospatial insights, ensuring that anyone—from city planners to business owners—can make informed decisions using the best possible data.

Testing in the Clouds: NASA Flies to Improve Satellite Data

In February, NASA’s ER-2 science aircraft flew instruments designed to improve satellite data products and Earth science observations. From data collection to processing, satellite systems continue to advance, and NASA is exploring how airborne instruments that analyze clouds can improve data measurement methods.

Researchers participating in the Goddard Space Flight Center Lidar Observation and Validation Experiment (GLOVE) used the ER-2 – based at NASA’s Armstrong Flight Research Center in Edwards, California – to validate satellite data about clouds and airborne particles in Earth’s atmosphere. Scientists are using GLOVE instruments installed aboard the aircraft to measure and validate cloud data generated by satellite sensors already orbiting Earth.

“The GLOVE data will allow us to test new artificial intelligence algorithms in data processing,” said John Yorks, principal investigator for GLOVE and research physical scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “These algorithms aim to improve the cloud and aerosol detection in data produced by the satellites.”

The validation provided by GLOVE is crucial because it ensures the accuracy and reliability of satellite data. “The instruments on the plane provide a higher resolution measurement ‘truth’ to ensure the data is a true representation of the atmospheric scene being sampled,” Yorks said.

The ER-2 flew over various parts of Oregon, Arizona, Utah, and Nevada, as well as over the Pacific Ocean off the coast of California. These regions offered a variety of atmospheric conditions, including cirrus clouds, marine stratocumulus, rain and snow, and areas with multiple cloud types.

“The goal is to improve satellite data products for Earth science applications,” Yorks said. “These measurements allow scientists and decision-makers to confidently use this satellite information for applications like weather forecasting and hazard monitoring.”
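
As a hedged sketch of what validation against an airborne “truth” can look like in practice (an illustrative comparison, not the GLOVE team’s actual pipeline), one can collocate the airborne lidar’s cloud detections with the satellite’s along the flight track and score how often they agree:

```python
import numpy as np

def agreement_score(airborne_cloud: np.ndarray, satellite_cloud: np.ndarray) -> dict:
    """Compare collocated boolean cloud masks along a flight track.

    airborne_cloud: higher-resolution 'truth' from the aircraft lidar.
    satellite_cloud: the satellite retrieval at the same footprints.
    Returns simple detection statistics used to characterize a retrieval.
    """
    hits = np.sum(airborne_cloud & satellite_cloud)
    misses = np.sum(airborne_cloud & ~satellite_cloud)
    false_alarms = np.sum(~airborne_cloud & satellite_cloud)
    return {
        "probability_of_detection": hits / max(hits + misses, 1),
        "false_alarm_ratio": false_alarms / max(hits + false_alarms, 1),
    }

# Toy example: the satellite misses one thin cloud the lidar sees.
air = np.array([True, True, False, True, False])
sat = np.array([True, False, False, True, False])
print(agreement_score(air, sat))  # POD = 2/3, FAR = 0.0
```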

The four instruments installed on the ER-2 were the Cloud Physics Lidar, the Roscoe Lidar, the enhanced Moderate Resolution Imaging Spectroradiometer Airborne Simulator, and the Cloud Radar System. These instruments validate data produced by sensors on NASA’s Ice, Cloud, and Land Elevation Satellite 2 (ICESat-2) and the Earth Cloud, Aerosol and Radiation Explorer (EarthCARE), a joint venture between ESA (European Space Agency) and JAXA (Japan Aerospace Exploration Agency).

“Additionally, the EarthCARE satellite is flying the first ever Doppler radar for measurements of air motions within clouds,” Yorks said. While the ER-2 is operated by pilots and aircrew from NASA Armstrong, these instruments are supported by scientists from NASA Goddard, NASA’s Ames Research Center in California’s Silicon Valley, and the Naval Research Laboratory office in Monterey, California, as well as by students from the University of Iowa in Iowa City and the University of Maryland, College Park.