Editor’s note: News about conservation and the environment is made every day, but some of it can fly under the radar. In a recurring feature, Human Nature shares three stories from the past week that you should know about.
The story: A study published Wednesday found that as much as 54 percent of high seas fishing would be unprofitable were it not for governments covering some of the industry’s costs, Sarah Gibbens reported for National Geographic. The researchers also found that exploited labor and underreported catch may explain how large vessels afford to fish in international waters.
The big picture: In 2016, just over 3,600 vessels actively fished on the high seas — the open ocean outside any country’s jurisdiction. Both profitable and unprofitable companies were given subsidies by their governments. “The study confirmed that much of the high seas fishing does not make sense,” said Enric Sala, study author and National Geographic explorer-in-residence. “If it’s ecologically destructive and economically unprofitable, why don’t we end all high seas fishing?”
Read more here.
The story: Researchers at Auburn University, Harvard, Oxford, the University of Minnesota and the University of Wyoming have developed a machine learning algorithm that can identify, describe and count wildlife with 96.6 percent accuracy, Kyle Wiggers reported for VentureBeat on June 5. The researchers trained the algorithm on 3.2 million images from Snapshot Serengeti, a citizen science project in which volunteers classify images of elephants, giraffes, gazelles, lions, cheetahs and other animals captured in their natural habitats. More than 50,000 people contributed to the project, which draws on a network of 225 camera traps.
The big picture: The work builds on a growing field of study in artificial intelligence. As part of the Tropical Ecology Assessment and Monitoring (TEAM) Network, Conservation International and partners in countries across Africa, the Americas and Asia have been meticulously gathering data on changes in biomass, biodiversity, species distribution and other indicators of ecosystem health. “This technology lets us accurately, unobtrusively and inexpensively collect wildlife data, which could help catalyze the transformation of many fields of ecology, wildlife biology, zoology, conservation biology and animal behavior into ‘big data’ sciences,” said Jeff Clune, associate professor at the University of Wyoming, senior research manager at Uber’s Artificial Intelligence Labs and senior author of the study. “This will dramatically improve our ability to both study and conserve wildlife and precious ecosystems.”
Read more here.
The story: Hurricanes slowed down by 10 percent in their forward speed between 1949 and 2016, Chris Mooney reported for The Washington Post on Wednesday. In the Atlantic region, storms moved 20 percent slower over land. For hurricanes, slower is not better: A slower-moving storm dumps more rain on a given area, batters that area longer with winds and pushes more water ashore as it approaches coastlines, according to Jim Kossin, a scientist with the National Oceanic and Atmospheric Administration and the author of the research.
The big picture: The question of hurricane speed — and whether it has changed due to climate change — has drawn little attention in the past in comparison with other questions, such as whether storms are getting stronger overall. “Every one of the hazards that we know tropical cyclones carry with them, all of them are just going to stick around longer,” Kossin said. “And so that’s never a good thing.”
Read more here.
Morgan Lynch is a staff writer for Conservation International.