
A New Benchmark for AI Image Analysis

An international team of scientists has developed a new benchmark to evaluate how well AI algorithms can mine image databases for information beyond species identification.

Sample images from the iNaturalist citizen science website. Image Credit: mferral (left), anacarohdez (middle) and lexthelearner (right). All CC BY-NC

The development could assist scientists in creating AI-powered algorithms capable of conducting rapid, detailed analyses of the millions of wildlife images shared online by the public each year.

Researchers believe that these analyses could provide valuable insights into the impacts of climate change, pollution, habitat loss, and other environmental pressures on a wide range of animal and plant species.

Rich Source

Citizen science websites offer a valuable resource for understanding how animals and plants are adapting to climate change.

While current AI systems can automatically identify species in uploaded images, their ability to provide additional information has been less clear.

Such information could include details about species' diets, health, and interactions with other species.

Image Tool

The tool, known as INQUIRE, evaluates AI's capacity to draw insights from a collection of five million wildlife photos uploaded to the iNaturalist citizen science platform.

The study found that while current AI systems can answer some questions, they struggle with more complex ones.

These challenges include interpreting fine details within images and understanding advanced scientific terminology.
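Benchmarks like INQUIRE typically score a system on how well it ranks relevant images for a text query. As an illustration only (this is not the actual INQUIRE code, and the embeddings, function names, and toy data below are hypothetical), a minimal sketch of ranking images by embedding similarity and scoring the result with average precision might look like this:

```python
import numpy as np

def rank_images(query_vec, image_vecs):
    """Rank image embeddings by cosine similarity to a query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    m = image_vecs / np.linalg.norm(image_vecs, axis=1, keepdims=True)
    sims = m @ q
    return np.argsort(-sims)  # indices of images, most similar first

def average_precision(ranked_relevance):
    """Average precision for one query: mean precision@k over relevant ranks."""
    hits = 0
    precisions = []
    for k, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precisions.append(hits / k)
    return sum(precisions) / max(hits, 1)

# Toy data: four 2-D image embeddings; images 0 and 2 are relevant to the query.
images = np.array([[1.0, 0.1], [0.0, 1.0], [0.9, 0.2], [0.1, 0.9]])
query = np.array([1.0, 0.0])

order = rank_images(query, images)
relevance = [i in {0, 2} for i in order]
print(average_precision(relevance))  # 1.0: both relevant images ranked first
```

A system that buries the relevant images lower in the ranking scores lower, which is how a benchmark can quantify the gap between simple queries and the complex compositional ones the study found current AI struggles with.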

The team suggests that the findings highlight opportunities to develop improved AI algorithms to help scientists analyze large image datasets more effectively.

This careful curation of data, with a focus on capturing real examples of scientific inquiries across research areas in ecology and environmental science, has proven vital to expanding our understanding of the current capabilities of current AI methods in these potentially impactful scientific settings. It has also outlined gaps in current research that we can now work to address, particularly for complex compositional queries, technical terminology, and the fine-grained, subtle differences that delineate categories of interest for our collaborators.

Dr. Sarah Beery, Assistant Professor, Massachusetts Institute of Technology

The peer-reviewed findings will be presented at the NeurIPS conference, a leading event in machine learning research.

The study was conducted by researchers from the University of Edinburgh, University College London, UMass Amherst, iNaturalist, and the Massachusetts Institute of Technology (MIT), with partial funding from the Generative AI Laboratory at the University of Edinburgh.

The thousands of wildlife photos uploaded to the internet each day provide scientists with valuable insights into where different species can be found on Earth. However, knowing what species is in a photo is just the tip of the iceberg. These images are potentially a hugely rich resource that remains largely untapped. Being able to quickly and accurately comb through the wealth of information they contain could offer vital clues about how species are responding to multi-faceted challenges like climate change.

Dr. Oisin Mac Aodha, Associate Professor, School of Informatics, The University of Edinburgh
