Links & Resources
Below are some links to ideas, explanations, and examples of how people around the world today are pioneering new technologies for automated detection, accounting, and redress of unwelcome effects upon third parties, such as pollution.
“Due to poorly defined property rights, externalities in both their positive and negative forms exist everywhere around us. This creates a high level of inefficiency, as many of those costs and benefits are not accounted for in the marginal costs and benefits reflected in market prices. In order to reach high levels of allocative efficiency, those externalities must be internalized. A system is considered optimal and efficient only when it reaches Pareto optimality.
The most common approach to addressing those externalities is through a centralized system that assigns property rights and enforces sanctions. This approach is very costly and inefficient in that it carries very high transaction costs and poses many political and ethical questions about valuation of damage/benefit, individual choice, and the political economics of enforcing the law.
Another approach is bargaining between the involved parties in a decentralized system. This is only possible when transaction costs are low enough to allow defining, enforcing, and trading property rights without the help of a central authority. Currently, issues of information asymmetry associated with a decentralized system, and the need for a central authority to enforce rights and execute sanctions, make this very difficult to implement in the real world.”
“While some companies are taking the lead in developing their own ways of factoring environmental impacts into financial decision-making, public policies that put a price on environmental impacts would help all companies account for environmental externalities. These policies include those that put a price on greenhouse gas emissions or water use, as well as policies that create demand for more efficient products like vehicle fuel efficiency standards or building codes.
While the impact of policies like this will vary across businesses, some companies are already benefiting. Environmental regulations, high energy prices, and a price on carbon in Europe, for example, have helped Siemens grow its portfolio of environmental products, including wind turbines, highly efficient combined-cycle power plants, and efficient trains. Siemens has not only helped its customers reduce carbon dioxide emissions by 332 million tons—equivalent to about 40 percent of Germany’s annual emissions—but also generated $44.6 billion in revenue from its environmental product portfolio in fiscal year 2012.”
“What Are Externalities? There are varying definitions of externalities, but probably the most common is that externalities are beneficial or harmful effects of one's actions on others that were not taken into account in the decision to act. A common example is industrial emission of gases into the atmosphere. It is said that the factory owner(s) or manager(s) would not take into account the harmful effect of the emitted gases on other members of society. Consequently, the factories would produce more industrial output than they would have produced had they taken into consideration the negative effects of their actions on others. This would be a negative externality.
However, there are also positive externalities, where one unintentionally produces benefits for others. A frequently used example is education. In this case, too little of the beneficial activity (education) is performed if left to individuals' voluntary transactions. As a result, in the cases of both negative and positive externalities, "inefficiencies" arise. It is claimed that total social welfare could be increased by adjusting the amount of the externality-creating activities to their socially optimal levels.”
“Sumak Kawsay / Buen Vivir ("good living"): rooted in the cosmovisión (or worldview) of the Quechua peoples of the Andes, sumak kawsay – or buen vivir, to give it its Spanish name – describes a way of doing things that is community-centric, ecologically balanced and culturally sensitive.[1] The concept is related to a tradition of legal and political scholarship advocating legal standing for the natural environment.[2] The rights approach is a break away from traditional environmental regulatory systems, which regard nature as property.
Since 2000, the Swiss Constitution (art. 120) has recognised the right to dignity of animals, plants and other organisms, but the implications of this provision are still not very clear. With the enactment of its 2008 Constitution, Ecuador became the first country in the world to codify the Rights of Nature and to give those rights more clearly defined content. Articles 10 and 71–74 of the Ecuadorian Constitution recognize the inalienable rights of ecosystems to exist and flourish, give people the authority to petition on behalf of ecosystems, and require the government to remedy violations of these rights.”
“What are the nitrogen dioxide (NO2) levels caused by traffic 1 km from your house? How do the levels of fine particulate matter (PM2.5) in your area compare to those in other cities? OpenAQ is a real-time, interactive air quality monitoring website that enables visitors to find out about air quality monitoring and pollution levels in their local area. OpenAQ’s mission is to provide the data to influence policy, and to enable the public to access information on air pollution through open data and open-source tools.
OpenAQ has collected a wealth of data: over 324 million air quality measurements from over ten thousand locations in 68 countries, as well as aggregated data from 114 government-level and research-grade sources. Visitors to the website can locate the air quality monitoring sites closest to them, view the NO2, PM2.5 and PM10 levels from those local monitors, and use the data to create graphs, heatmaps, spreadsheets and other tools for analysis. Users can also directly compare air quality in two locations.”
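For readers who want to pull these measurements programmatically, a minimal sketch of querying OpenAQ for recent PM2.5 readings near a point is shown below. It assumes OpenAQ's v2 REST endpoint and parameter names, which may differ in newer API versions (v3 requires an API key), so treat the details as illustrative rather than definitive.

```python
# Minimal sketch: querying OpenAQ for recent PM2.5 readings near a point.
# Endpoint and parameter names follow OpenAQ's v2 REST API and may differ
# in newer API versions, so this is illustrative rather than definitive.
import requests

def latest_pm25_near(lat, lon, radius_m=10_000):
    resp = requests.get(
        "https://api.openaq.org/v2/latest",
        params={
            "coordinates": f"{lat},{lon}",
            "radius": radius_m,      # search radius in meters
            "parameter": "pm25",     # fine particulate matter
            "limit": 100,
        },
        timeout=30,
    )
    resp.raise_for_status()
    for station in resp.json().get("results", []):
        for m in station.get("measurements", []):
            yield station["location"], m["value"], m["unit"], m["lastUpdated"]

# Example: print the most recent PM2.5 values reported near central London.
for location, value, unit, updated in latest_pm25_near(51.5074, -0.1278):
    print(f"{location}: {value} {unit} (as of {updated})")
```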
“…A team of scientists in China proposed an intriguing way to track unfamiliar drones through crowdsensing. Their approach leverages participants’ smartphones to detect the Wi-Fi signals of drones.
Their system, which the researchers call CEDAR (for Cost-Effective Crowdsensing System for Detecting and LocAlizing Drones), can detect drones within 350 meters and with an average accuracy of 87 percent when no preliminary MAC addresses or SSIDs are listed in the database, suggesting that this approach is fairly effective.
As for the crowdsensing aspect of CEDAR, Shi acknowledges that participants will have to yield some of their privacy rights to assist with drone detection.
“In our system, users need to upload their positions for both detection and localization process,” Shi explains. “In this way, they do have a privacy breach. We treat users’ privacy concerns as a kind of cost, and compensate them by providing rewards.”
Thus the researchers proposed an auction-like scenario, where participants can decide the threshold price at which they are willing to share their location, and then bid against other people nearby for the opportunity.”
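The auction is described only at a high level in the excerpt above; the snippet below is a rough sketch of one way such a reverse auction could be run, with each participant naming the minimum payment (threshold price) they will accept for sharing their location and the system picking the cheapest bids within a reward budget. The budget-based selection rule and all names are illustrative assumptions, not CEDAR's published mechanism.

```python
# Illustrative reverse auction for location-sharing crowdsensing, loosely
# modeled on the excerpt's description. The budget-constrained, lowest-bid
# selection rule is an assumption, not CEDAR's published mechanism.
from dataclasses import dataclass

@dataclass
class Bid:
    participant_id: str
    threshold_price: float  # minimum reward the participant will accept

def select_participants(bids: list[Bid], budget: float) -> list[Bid]:
    """Accept the cheapest bids first until the reward budget is exhausted."""
    selected, spent = [], 0.0
    for bid in sorted(bids, key=lambda b: b.threshold_price):
        if spent + bid.threshold_price > budget:
            break
        selected.append(bid)
        spent += bid.threshold_price
    return selected

bids = [Bid("phone-a", 0.50), Bid("phone-b", 0.20), Bid("phone-c", 1.10)]
winners = select_participants(bids, budget=1.00)
print([w.participant_id for w in winners])  # phone-b and phone-a fit the budget
```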
“Ever since 2014, when the National Oceanic and Atmospheric Administration (NOAA) relaxed the limit from 50 to 25 cm, that resolution has been fine enough to satisfy most customers. Investors can predict oil supply from the shadows cast inside oil storage tanks. Farmers can monitor flooding to protect their crops. Human rights organizations have tracked the flows of refugees from Myanmar and Syria.
…
Some of the most radical developments in Earth observation involve not traditional photography but rather radar sensing and hyperspectral images, which capture electromagnetic wavelengths outside the visible spectrum. Clouds can hide the ground in visible light, but satellites can penetrate them using synthetic aperture radar, which emits a signal that bounces off the sensed object and back to the satellite. It can determine the height of an object down to a millimeter.
…
Meanwhile, farmers can use hyperspectral sensing to tell where a crop is in its growth cycle, and geologists can use it to detect the texture of rock that might be favorable to excavation.”
“Carbon Tracker today announced a new project, funded by a $1.7 million grant from Google.org, which will use satellite imagery to quantify carbon emissions from all large power plants worldwide and make this information available to the public. Carbon Tracker, in collaboration with WattTime and the WRI, was chosen through the Google AI Impact Challenge to use the data to hold polluting plants accountable to environmental standards and to enable advanced new emissions reduction technologies.
…
The project will work by leveraging the growing global satellite network to observe power plants from space. AI technology will use the latest image processing algorithms to detect signs of power plant emissions. For maximum accuracy, the project will combine data from a variety of different sensors operating at different wavelengths. AI algorithms will cross-validate multiple indicators of power plant emissions, from thermal infrared, indicating heat near smoke stacks and cooling water intake, to visual spectrum recognition that a power plant is emitting smoke.”
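The cross-validation idea in the excerpt can be pictured as combining independent per-sensor confidence scores and only flagging a plant when, taken together, they are strong enough. The toy sketch below does exactly that; the indicator names, weights, and threshold are invented for illustration and are not the project's actual algorithm.

```python
# Toy cross-validation of multiple emission indicators, as a sketch of the
# idea in the excerpt. Scores, weights, and threshold are illustrative only.
def flag_emitting(indicators: dict[str, float], threshold: float = 0.6) -> bool:
    """indicators maps sensor name -> confidence in [0, 1] that the plant is running."""
    weights = {"thermal_infrared": 0.5, "visible_plume": 0.3, "cooling_water": 0.2}
    score = sum(weights.get(name, 0.0) * conf for name, conf in indicators.items())
    return score >= threshold

# Example: strong thermal and plume signals, weaker cooling-water signal.
print(flag_emitting({"thermal_infrared": 0.9, "visible_plume": 0.7, "cooling_water": 0.4}))
```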
“Finding destroyed villages in Darfur (Amnesty International), mapping the encroachment of palm oil plantations (Greenpeace) or schools in remote areas (Unicef), identifying civilian presence in conflict zones (UNHCR), predicting the risk of floods (National Geographic) — these are all important efforts by non-governmental organizations (NGOs) that rely on a common tool: satellite imagery.
When combined with the latest image-processing techniques from artificial intelligence (AI), satellite images can help experts monitor human rights and the environment, providing NGOs with the capacity to improve countless lives and contribute to the delivery of the UN Sustainable Development Goals.
This is feasible. The algorithms are there. The satellites are there.
Low-resolution imagery is cheap or sometimes free, and it is frequently updated, but it lacks the detailed information needed to identify some patterns indicative of climate change or human rights violations. Yet the staggering cost of high-resolution satellite imagery is a significant barrier for NGOs.
This is what super-resolution technology can help overcome.”
“When researchers collect audio recordings of birds, they are usually listening for the animals’ calls. But conservation biologist Marc Travers is interested in the noise produced when a bird collides with a power line. It sounds, he says, ‘very much like the laser sound from Star Wars’.
With some 600 hours of audio collected — a full 25 days’ worth — counting the laser blasts manually was impractical. So, Travers sent the audio files (as well as metadata, such as times and locations) to Conservation Metrics, a firm in Santa Cruz, California, that uses artificial intelligence (AI) to assist wildlife monitoring. The company’s software was able to detect the collisions automatically and, over the next several years, Travers’ team increased its data harvest to about 75,000 hours per field season.
Results suggested that bird deaths as a result of the animals striking power lines numbered in the high hundreds or low thousands, much higher than expected. “We know that immediate and large-scale action is required,” Travers says.”
“More than a dozen governments and companies have launched or are planning to launch satellites that measure concentrations of heat-trapping gases such as methane, which is blamed for about one quarter of man-made global warming. They are looking to track nations, industries, companies and even individual facilities to identify some of the biggest contributors to climate change.
“Space-based technologies are allowing us for the first time to quickly and cheaply measure greenhouse gases,” said Mark Brownstein, a senior vice president at Environmental Defense Fund, which plans to launch its MethaneSAT in 2021. “Oftentimes both government and industry are not fully aware of the magnitude of the opportunity to cut emissions. With that data, they can take action.”
Regulators are taking note. California is partnering with Planet Labs Inc. on a satellite to help it “pinpoint individual methane plumes” from oil and gas facilities, as well as other sources such as landfills, dairies and waste water plants, Stanley Young, a spokesman for the state’s Air Resources Board, said in an email. Researchers have suggested that methane is underestimated in most inventories, he said.”
“It’s hard to respond to threats we can’t see, hear or touch, such as pollution of air and water, or toxic chemicals in products we buy. Sophisticated, inexpensive sensors are making the invisible visible, and have the potential to help solve tough environmental challenges.
There are countless other examples of sensor technology in the works that can help solve a host of environmental challenges:
– Water meters that detect leaks
– Traffic sensors to help reduce congestion
– Real-time fishing data radioed from ship to shore, to help manage catch limits
– GPS-enabled tractors that could prevent farmers from using too much fertilizer
– Wearable air pollution sensors that would crowdsource pollution hot spot data
Imagine how powerful the new world of information will be when low-cost sensors can provide data directly to citizens and advocacy groups.”
“This summer, the Yurok Tribe declared rights of personhood for the Klamath River—likely the first to do so for a river in North America. A concept previously restricted to humans (and corporations), “rights of personhood” means, most simply, that an individual or entity has rights, and they’re now being extended to nonhumans.
. . .
With the declaration, the Yurok Tribe joins other Indigenous communities in a growing Rights of Nature movement aimed at protecting the environment. Last year, the White Earth Band of Ojibwe adopted the Rights of Manoomin to protect wild rice—manoomin—and the freshwater sources it needs to survive in Minnesota. And in 2017, the New Zealand government adopted the Rights of the Whanganui River, stemming from a treaty process with Māori iwis, or tribes, that gives the river its own legal standing in court.”
“Motion-sensor cameras in natural habitats offer the opportunity to inexpensively and unobtrusively gather vast amounts of data on animals in the wild. A key obstacle to harnessing their potential is the great cost of having humans analyze each image. Here, we demonstrate that a cutting-edge type of artificial intelligence called deep neural networks can automatically extract such invaluable information. For example, we show deep learning can automate animal identification for 99.3% of the 3.2 million-image Snapshot Serengeti dataset while performing at the same 96.6% accuracy of crowdsourced teams of human volunteers. Automatically, accurately, and inexpensively collecting such data could help catalyze the transformation of many fields of ecology, wildlife biology, zoology, conservation biology, and animal behavior into “big data” sciences.”
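The approach described in the abstract, training deep neural networks on labelled camera-trap images, can be sketched with a pretrained torchvision ResNet fine-tuned on a folder of labelled frames. The class count, data layout, and single training pass below are placeholders; the Snapshot Serengeti pipeline itself is considerably more elaborate.

```python
# Minimal sketch of fine-tuning a pretrained CNN to classify camera-trap images,
# in the spirit of the study quoted above (not its actual pipeline).
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_SPECIES = 48  # placeholder: Snapshot Serengeti distinguishes dozens of classes

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])
# Assumes images organised as camera_traps/train/<species_name>/<image>.jpg
train_data = datasets.ImageFolder("camera_traps/train", transform=transform)
loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_SPECIES)  # new classification head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:          # one pass over the data, for brevity
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
```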
“We devised a ‘DNA-of-things’ (DoT) storage architecture to produce materials with immutable memory. In a DoT framework, DNA molecules record the data, and these molecules are then encapsulated in nanometer-scale silica beads, which are fused into various materials that are used to print or cast objects in any shape.
. . .
DoT could be applied to store electronic health records in medical implants, to hide data in everyday objects (steganography) and to manufacture objects containing their own blueprint.”
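As a toy illustration of how digital data can be written into DNA at all, the snippet below maps each pair of bits to one of the four bases and back. Real DNA storage codecs, including the one behind DoT, add error correction and avoid problematic sequences, so this is a conceptual sketch rather than the authors' scheme.

```python
# Toy 2-bits-per-base encoding to illustrate storing digital data as DNA.
# Real DNA storage adds error-correcting codes (such as DNA Fountain) and
# constrains GC content and homopolymer runs; this sketch does not.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {v: k for k, v in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(dna: str) -> bytes:
    bits = "".join(BASE_TO_BITS[base] for base in dna)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

sequence = encode(b"blueprint")
assert decode(sequence) == b"blueprint"
print(sequence)  # 36-base sequence encoding the 9-byte payload
```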
“The first satellite designed to continuously monitor the planet for methane leaks made a startling discovery last year: A little known gas-well accident at an Ohio fracking site was in fact one of the largest methane leaks ever recorded in the United States.
The findings by a Dutch-American team of scientists, published Monday in the Proceedings of the National Academy of Sciences, mark a step forward in using space technology to detect leaks of methane, a potent greenhouse gas that contributes to global warming, from oil and gas sites worldwide. The scientists said the new findings reinforced the view that methane releases like these, which are difficult to predict, could be far more widespread than previously thought.
“We’re entering a new era. With a single observation, a single overpass, we’re able to see plumes of methane coming from large emission sources,” said Ilse Aben, an expert in satellite remote sensing and one of the authors of the new research.”
“To the naked eye, there is nothing out of the ordinary at the DCP Pegasus gas processing plant in West Texas, one of the thousands of installations in the vast Permian Basin that have transformed America into the largest oil and gas producer in the world.
But a highly specialized camera sees what the human eye cannot: a major release of methane, the main component of natural gas and a potent greenhouse gas that is helping to warm the planet at an alarming rate.
Two New York Times journalists detected this from a tiny plane, crammed with scientific equipment, circling above the oil and gas sites that dot the Permian, an oil field bigger than Kansas. In just a few hours, the plane’s instruments identified six sites with unusually high methane emissions.”
“Just 100 companies have been the source of more than 70% of the world’s greenhouse gas emissions since 1988, according to a new report.
The Carbon Majors Report (pdf) “pinpoints how a relatively small set of fossil fuel producers may hold the key to systemic change on carbon emissions,” says Pedro Faria, technical director at environmental non-profit CDP, which published the report in collaboration with the Climate Accountability Institute.
Traditionally, large scale greenhouse gas emissions data is collected at a national level but this report focuses on fossil fuel producers. Compiled from a database of publicly available emissions figures, it is intended as the first in a series of publications to highlight the role companies and their investors could play in tackling climate change.
The report found that more than half of global industrial emissions since 1988 – the year the Intergovernmental Panel on Climate Change was established – can be traced to just 25 corporate and state-owned entities. The scale of historical emissions associated with these fossil fuel producers is large enough to have contributed significantly to climate change, according to the report.”
“Sara Talpos reports on a controversial new strategy some lawyers are using to take on manufacturers of PFAS — industrial chemicals that don't fully degrade and have been linked to a laundry list of health problems. At first blush, contaminated groundwater might seem unlikely to fall alongside punches and kicks under the umbrella of "battery," but some legal scholars say it should.
With PFAS litigation on the rise, the battery claim is certain to be further tested in courtrooms nationwide, and cases in Belmont, Michigan now offer a glimpse of both the promise and challenges of a battery claim in answering a foundational question of modern life: Should industrial compounds be allowed to penetrate the bodies of ordinary citizens when the safety of those compounds has not been established and consent has not been granted?”
“Now Sonnewald and her colleagues at MIT have developed an unsupervised machine-learning technique that automatically combs through a highly complicated set of global ocean data to find commonalities between marine locations, based on their ratios and interactions between multiple phytoplankton species. With their technique, the researchers found that the ocean can be split into over 100 types of “provinces” that are distinct in their ecological makeup. Any given location in the ocean would conceivably fit into one of these 100 ecological provinces.
. . .
’Instead of guiding sampling with tools based on bulk chlorophyll, and guessing where the interesting ecology could be found with this method, you can surgically go in and say, ‘this is what the model says you might find here,’ Sonnewald says. ‘Knowing what species assemblages are where, for things like ocean science and global fisheries, is really powerful.’”
“It was 12 miles wide, invisible to the naked eye and traveled across six counties to Florida's largest city. And it's still unclear who — or what — was responsible.
. . .
The source of the Florida emission remains unknown, however. Its volume was equivalent to roughly 1% of total daily emissions from the U.S. natural gas system in 2018, Stanford University professor Adam Brandt said. Its epicenter was in Alachua County, according to Bluefield. The cloud moved over six counties in northern Florida that are home to more than 1.5 million people.
. . .
While the technology to spot leaks is improving, there can often be the challenge of pinpointing the perpetrator. There aren’t many industrial facilities nearby and among the closest potential heavy-emitter candidates are a natural gas pipeline system and power plants, public records show.”
“The study includes data from thousands of larval dragonfly specimens collected from nearly 500 locations across 100 sites within the U.S. National Park System. The survey was collected from 2009 through 2018 as part of the national Dragonfly Mercury Project.
. . .
‘The support of citizen scientists around the country created the opportunity for this study to have such significance. This is a terrific example of how public outreach around science can bring results that help the entire country,’ said Chen.
Methylmercury, the organic form of the toxic metal mercury, poses risks to humans and wildlife through the consumption of fish. Mercury pollution comes from power plants, mining and other industrial sites. It is transported in the atmosphere and then deposited in the natural environment, where wildlife can be exposed to it.”
“Cigarette ends account for 66 per cent of all litter on the streets and Keep Britain Tidy estimates 226 million butts were dropped in England last year. Laid end to end, they would stretch 3,567 miles – more than the distance between London and New York.
. . .
The precise cost of cleaning up the discarded ends is unknown, but the total amount spent by local authorities on litter stands at more than £1 billion a year.
. . .
‘We know that many smokers don’t even consider their butts to be litter so there is a lot of work to do if we are to rid our environment of this menace.’”
“The new ElephantEdge tracker is considered the most advanced of its kind, with eight years of battery life and hundreds of miles of range over LoRaWAN networking repeaters, running TinyML models that will provide park rangers with a better understanding of elephant acoustics, motion, location, environmental anomalies and more. The tracker can communicate with an array of sensors, connected by LoRaWAN technology to park rangers’ phones and laptops.
. . .
This gives rangers a more accurate image and location to track than earlier systems that captured and reported on pictures of all wildlife, which ran down the trackers’ battery life. The advanced ML software that runs on these trackers is built explicitly for elephants and developed by the Hackster.io community in a public design challenge.”
“There are far more trees in the West African Sahara Desert than you might expect, according to a study that combined artificial intelligence and detailed satellite imagery.
Researchers counted over 1.8 billion trees and shrubs in the 1.3 million square kilometer (501,933 square miles) area that covers the western-most portion of the Sahara Desert, the Sahel, and what are known as sub-humid zones of West Africa.
“We were very surprised to see that quite a few trees actually grow in the Sahara Desert, because up until now, most people thought that virtually none existed,” says Martin Brandt, professor in the geosciences and natural resource management department at the University of Copenhagen and lead author of the study in Nature.”
“At any given time, about half of the world is in darkness and half of it is covered in clouds. Capella Space CEO Payam Banazadeh told Futurism, “When you combine those two together, about 75 percent of Earth, at any given time, is going to be cloudy, nighttime, or it’s going to be both.”
. . .
Then, the satellite collects the returning signals to create a picture of what is there. “At that frequency, the clouds are pretty much transparent,” Banazadeh tells Futurism. “You can penetrate clouds, fog, moisture, smoke, haze. Those things don’t matter anymore. And because you’re generating your own signal, it’s as if you’re carrying a flashlight. You don’t care if it’s day or night.”
. . .
Interestingly, clouds aren’t the only thing that SAR imagery can see through. Capella notes that the technology can peer directly through the walls of some buildings. This means a picture could reveal the floorplan of a house or a collection of planes in a hangar at the airport.
. . .
For instance, researchers could request images of the Amazon rainforest to monitor things like deforestation and illegal logging. Since SAR can see through the dense clouds that typically cover the area, Capella’s imaging technology would be a major asset.
Meanwhile, the company says that savvy investors could utilize its technology to monitor global supply chains and commodities. By combining the imaging capabilities of multiple satellites, Capella is able to construct three-dimensional pictures.”
“To increase the efficiency and accuracy of koala counts, Hamilton and his team developed a methodology that uses drones, thermal cameras, and AI. At first, Hamilton said that using a drone that flies overhead to detect an animal that lives high in a canopy seemed like a bit of a "no brainer," but this tech-enhanced approach presented a number of challenges.
To sift through these drone-collected thermal images and assist with identification, the team developed machine learning algorithms, but training these models also came with a bit of a learning curve, Hamilton explained.
. . .
After extensive training and development, Hamilton said the AI-enabled methodology is now more accurate than people at detecting koalas. Aside from increased precision, this approach also allows researchers to cover exponentially more ground in less time. Hamilton estimated that a team of four researchers could cover about 10 hectares in a day, and the drone-enabled AI detection method allows them to cover 50 hectares in two hours.”
“A new study, though, aims to use satellite and machine learning to track ships that traffic laborers. The findings provide a conservative estimate that between 57,000 and 100,000 people were forced to labor on fishing vessels between 2012 and 2018. Though AI alone can’t end what the study calls a “humanitarian tragedy,” it can help start to penetrate the veil of secrecy around slave labor and end its practice on the high seas.
The study, published in the Proceedings of the National Academy of Sciences on Monday, uses data captured from the Automatic Identification System, a satellite tracking system used to monitor ships’ movements around the world. Not all ships use them all the time—the study notes that some turn them off to reportedly avoid piracy—but those that do can allow researchers to construct a fairly comprehensive web of where ships go, when, and how they behave. The scientists took that data (including when AIS was turned off) and compared it to known cases of ships that used forced labor and interviews with experts in trafficking to train a machine learning tool that could identify ships likely reliant on trafficked labor.”
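In outline, that pipeline resembles a standard supervised-learning setup: per-vessel behavioural features derived from AIS tracks, labels from known forced-labour cases, and a classifier that scores unlabelled vessels. The sketch below uses invented feature names and a generic gradient-boosting model, not the authors' published feature set or method.

```python
# Sketch of training a classifier on AIS-derived vessel features, in the spirit
# of the study described above. Feature names, the CSV layout, and the model
# choice are illustrative assumptions, not the published methodology.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# Hypothetical per-vessel features aggregated from AIS tracks, with labels
# from known forced-labour cases (1) and vessels believed clean (0).
df = pd.read_csv("vessel_features.csv")
features = ["days_at_sea", "port_visits_per_year", "gap_hours_ais_off",
            "mean_distance_from_shore_km", "transshipment_events"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["forced_labour_label"], test_size=0.2, random_state=0)

model = GradientBoostingClassifier().fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))

# Rank vessels by estimated risk for follow-up investigation.
df["risk_score"] = model.predict_proba(df[features])[:, 1]
print(df.sort_values("risk_score", ascending=False).head(10))
```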
“Here, artificial and biological monitoring systems ensure that the water pumped throughout the city’s pipes is safe to drink. The artificial systems take precise measurements of chemical contamination in the water, which is definitely handy. However, as Aquanet.pl explains, it is the plant’s biological systems (or ‘bioindicators’) that allow for a more reliable estimation of the water’s overall toxicity, as they account for a broad range of factors “simultaneously”.
These biological systems consist of eight mussels with sensors hot-glued to their shells. They work together with a network of computers and have been given control over the city’s water supply. If the water is clean, these mussels stay open and happy. But when water quality drops too low, they close up, shutting off the water supply to millions of people with them.”
“First, AI can enhance the accuracy of forest monitoring. For example, data science company Gramener has used convolutional neural networks with transfer learning to predict plant and tree species from close to 675,000 images, achieving 85 percent accuracy—a level comparable to that achieved by human experts.
Some organizations install sensors for monitoring rainforests. IBM’s analytics-driven software InfoSphere Streams processes over 10,000 data points per second generated by sensor networks measuring carbon levels, soil moisture, relative humidity, and atmospheric pressure in Brazil, among other places. Conservationists and researchers use the software to predict droughts and forest fires and to assess how rainforests respond to deforestation and climate change.
The non-profit organization Rainforest Connection (RFCx) relies on acoustic monitoring systems to help fight illegal deforestation in real-time, including in Amazon basin countries like Brazil, Ecuador, and Peru. Team members strategically place recycled cell phones in rainforests and the devices send instant notifications to rangers when they detect the sound of a chainsaw, which helps to curb not only illegal deforestation, but also poaching of local species. In addition, a cloud-based database of animal sounds gathered through this method allows researchers and governments to document and track wildlife, including endangered birds and mammals.”
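The chainsaw-alert idea can be approximated, very roughly, by checking whether acoustic energy in a band where chainsaw noise is prominent stands out from the rest of the spectrum. The band and threshold below are placeholders, and Rainforest Connection's production system uses trained models on streamed audio rather than a fixed-band rule like this.

```python
# Rough sketch of a fixed-band acoustic trigger for chainsaw-like noise.
# The frequency band and threshold are placeholders; the real system relies
# on trained models over streamed audio, not this simple energy rule.
import numpy as np
from scipy.io import wavfile
from scipy.signal import spectrogram

def chainsaw_suspected(path: str, band=(300.0, 1200.0), threshold_db=6.0) -> bool:
    rate, audio = wavfile.read(path)
    if audio.ndim > 1:                    # mix stereo down to mono
        audio = audio.mean(axis=1)
    freqs, _, sxx = spectrogram(audio.astype(float), fs=rate)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    band_power = sxx[in_band].mean()
    total_power = sxx.mean() + 1e-12
    ratio_db = 10 * np.log10(band_power / total_power + 1e-12)
    return ratio_db > threshold_db        # band stands well above the average

if chainsaw_suspected("forest_clip.wav"):
    print("possible chainsaw activity: notify rangers")
```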
“As part of CSIRO’s research to end plastic waste, we’ve been developing an efficient and scalable environmental monitoring system using artificial intelligence (AI).
The system, which is part of a larger pilot with the City of Hobart, uses AI-based image recognition to track litter in waterways.
. . .
Our data revealed food packaging, beverage bottles and cups were by far the most frequently spotted litter items across all three countries.“
“The pictures come from an Earth-observation satellite orbiting 600km (372 miles) above the planet's surface.
The breakthrough could allow up to 5,000 sq km of elephant habitat to be surveyed on a single cloud-free day.
. . .
"And conservation organisations are already interested in using this to replace surveys using aircraft."
Conservationists will have to pay for access to commercial satellites and the images they capture.
But this approach could vastly improve the monitoring of threatened elephant populations in habitats that span international borders, where it can be difficult to obtain permission for aircraft surveys.”
“In 1996, Prof Shahid Naeem was part of a team of researchers who set out to value the Earth. Specifically, they were trying to establish the dollar value of all of the “ecosystem services” the planet provides to humans every year. Around $33tn, they concluded, nearly double global GDP at the time.
“The team was half ecologists and half economists. The ecologists found the exercise really scary but understood the utility of it. The economists felt nature could be valued but they disagreed about how it could be done,” Naeem says.
. . .
More than half of global GDP – $42tn (£32tn) – depends on high-functioning biodiversity, according to the insurance firm Swiss Re. The “natural capital” that sustains human life looks set to become a trillion dollar asset class: the cooling effect of forests, the flood prevention characteristics of wetlands and the food production abilities of oceans understood as services with a defined financial value. Animals, too.
The services of forest elephants are worth $1.75m for each animal, the International Monetary Fund’s Ralph Chami has estimated; more than the $40,000 a poacher might get for shooting the mammal for ivory. Whales are worth slightly more at over $2m, he also estimates, due to their “startling” carbon capture potential, and therefore deserve better protection.”
“As a concept, it’s simple. Single Earth tokenizes land, forests, swamps, and biodiversity: any area of rich ecological significance. Companies, organizations, and eventually individuals will be able to purchase those tokens and own fractional amounts of those lands and natural resources, getting carbon offsets in return as well as ongoing ownership rights.
. . .
“Single Earth is not a fund itself,” chief technical officer Andrus Aaslaid told me. “We are not trying to become the largest landowner in the world. We are trying to provide the technology that people who own the land would be able to create profit out of it without having to sell it as raw material.”
The interesting thing about the Single Earth token is that it is nature-backed. It has real assets in the real world with real value behind it.
In some sense, that’s like gold: also a real, physical, inherently valuable store of wealth.”
“Now, Imazon researchers have built an artificial intelligence algorithm to find such roads automatically. Currently, the algorithm is reaching about 70% accuracy, which rises to 87%-90% with some additional automated processing, said Souza. Analysts then confirm potential roads by examining the satellite images.
The laborious work of mapping roads by hand was not wasted: that data was needed to train the AI algorithm. Thanks to the algorithm, Souza and his colleagues should now be able to update their map every year with relative ease.
. . .
Large areas of road-free rainforest are important for protecting Amazonian biodiversity and isolated indigenous people, said Souza. Moreover, roads are often a harbinger of further destruction. Nearly 95% of deforestation in the Brazilian Amazon occurs within 5.5 km of a road or 1 km of a river, while about 95% of fires occur within 10 km of a road or river, according to prior research by Souza and his colleagues. Loggers and gold miners often abandon private roads when natural resources are exhausted, said Souza, whereupon farmers and ranchers make use of them for further development.
If policymakers don't consider unofficial roads, they may underestimate the harm being done to the Amazon, said Souza. The new algorithm could help provide a complete and up-to-date picture, showing where to focus efforts at rainforest protection.”
“Hundreds of previously unreported releases of raw sewage into UK rivers have been detected thanks to artificial intelligence, researchers say.
Scientists identified 926 "spill events" from two wastewater treatment plants over an 11-year period by employing machine learning.
. . .
The researchers, who published their study in the journal Clean Water, trained a computer algorithm to recognise, through the pattern of flow through a treatment plant, when a spill was happening.
The researchers say that water companies around the UK could put a similar approach in place at any plant to detect "spills that appear to be going unnoticed and unreported".”
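One way to frame the flow-pattern approach is as supervised classification over rolling windows of the plant's flow series, with labels taken from known spill events. The window length, features, file layout, and model in the sketch below are assumptions for illustration, not the published method.

```python
# Minimal sketch of classifying treatment-plant flow windows as spill / no-spill.
# Window length, features, data layout, and model are illustrative only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

def window_features(flow: pd.Series, hours: int = 24) -> pd.DataFrame:
    """Summary statistics over rolling windows of an hourly flow series."""
    roll = flow.rolling(hours)
    return pd.DataFrame({
        "mean_flow": roll.mean(),
        "max_flow": roll.max(),
        "flow_range": roll.max() - roll.min(),
        "rate_of_change": flow.diff().rolling(hours).mean(),
    }).dropna()

# Hypothetical training data: hourly flow plus labels from known spill events.
data = pd.read_csv("plant_flow.csv", parse_dates=["timestamp"], index_col="timestamp")
X = window_features(data["flow_m3_per_h"])
y = data.loc[X.index, "spill_event"]      # 1 during a confirmed spill, else 0

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
data.loc[X.index, "predicted_spill"] = model.predict(X)
print(data["predicted_spill"].sum(), "hours flagged as possible spills")
```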
““The inspiration for Payver came in 2017 when the UDOT executive director set a goal that it would be the first department in the country to have real-time situational awareness on our roadways, and we’ve been working on solving that problem for them,” Pittman told TechCrunch. “They want to know what’s happening and when it’s happening automatically so the public doesn’t have to be involved. So if there’s roadside debris or stop signs missing or paint lines that need to be fixed, how does the department know without the public having to call and complain or without an accident occurring?”
Blyncsy’s Payver technology works by collecting any kind of HD images and videos from a variety of sources, such as Nexar dash cameras, and analyzing the data sets with machine vision to deliver output to customers. The insights are available to transit agencies in a dashboard format, but Payver also integrates into the maintenance management software that determines a rank order of repair jobs.”
“Their system represents a fundamentally different approach to air quality monitoring compared with the stationary systems routinely used in urban areas, which the group says often fail to detect spatial heterogeneity in pollution levels across a landscape. Given their limited distribution and lack of mobility, these systems are really only a reliable indicator of the air quality directly surrounding each monitoring point, but their data are reported as though they were representative of air quality across the entire city, say the recent graduates.
“So even though they might say that your air quality is somewhat good, that may not be the case for the park right next to your home,” says Gonzalez-Diaz.
The NEET cohort’s drone system is designed to provide real-time air quality data with a 15-meter resolution that is publicly accessible through a user-friendly interface.”
“Researchers from the University of Cambridge have demonstrated how a typical touchscreen could be used to identify common ionic contaminants in soil or drinking water by dropping liquid samples on the screen, the first time this has been achieved. The sensitivity of the touchscreen sensor is comparable to typical lab-based equipment, which would make it useful in low-resource settings.
The researchers say their proof of concept could one day be expanded for a wide range of sensing applications, including for biosensing or medical diagnostics, right from the phone in your pocket.
. . .
One early application for the technology could be to detect arsenic contamination in drinking water. Arsenic is another common contaminant found in groundwater in many parts of the world, but most municipal water systems screen for it and filter it out before it reaches a household tap. However, in parts of the world without water treatment plants, arsenic contamination is a serious problem.
“In theory, you could add a drop of water to your phone before you drink it, in order to check that it’s safe,” said Daly.”
"We've found that most businesses and people have the right intentions about recycling, but oftentimes they just don't know what the proper way to recycle is," Gates, CEO of Compology, told CNN Business' Rachel Crane.
To help them do it correctly, Compology puts trash-monitoring cameras and sensors inside industrial waste containers. The cameras take photos several times each day and when the container is lifted for dumping. An accelerometer helps trigger the camera on garbage day.
AI software analyzes the images to figure out how full the container is and can also let a customer know when something is where it shouldn't be, such as a bag of trash tossed into a dumpster filled with cardboard boxes for recycling. Gates said the company's cameras can cut the amount of non-recyclable materials thrown in waste containers by as much as 80%.”
“In a recent study, a combined team from Universitat Autònoma de Barcelona (UAB), the Institute of Economic Analysis at the Spanish National Research Council and Chapman University, California successfully automated this process for the analysis of heavy weaponry impacts – with profound implications for the surveillance of conflict zones for humanitarian ends.
The team used a convolutional neural network (CNN) to automate the photo analysis, co-author André Groeger explains. Trained on sequences of satellite images from Aleppo and five other Syrian cities between 2011 and 2017, as well as human-annotated data on destruction acquired from the United Nations Satellite Team (UNOSAT), the model successfully traced the progression of war damage over the course of the civil war with a level of precision closely rivalling that of manual approaches.”
“Developing countries are most at risk from illegal, unreported, and unregulated (IUU) fishing, with estimated actual catches in West Africa, for example, being 40 percent higher than reported catches. Worldwide, one in five wild-caught fish is likely to be illegal or unreported; the economic value of these fish never reaches the communities that are the rightful beneficiaries. Annual global losses due to this illegal activity are valued at $10 billion to $23.5 billion USD.
. . .
Synthetic aperture radar (SAR) is one of the power tools of remote sensing, and an increasingly valuable complement to other vessel detection systems. Active satellite sensors, such as SAR, transmit radar waves to the Earth and measure the backscatter and traveling time of the signals that are reflected back from objects on the ground.
. . .
For xView3, we created a free and open large-scale dataset for maritime detection, and the computing capability required to generate, evaluate and operationalize computationally intensive AI/ML solutions at global scale. The data are consistently processed to include aligned views and relevant context above and below the ocean surface, with ground truth detections derived by combining AIS tracks, existing automated SAR analysis, and human visual detections.”
“In New York City, if you report an idler and they’re found guilty, you get 25% of the fine, which ranges from $350 to $2,000. This started in February 2018, largely thanks to the efforts of George Pakenham, a Wall Street banker and part-time clean air activist living on the Upper West Side. Pakenham was the subject of the 2012 documentary film Idle Threat: Man on Emission. In that movie, he explains that while New York had an idling ordinance that goes back to the 1970s, it was considered a “moving violation,” meaning only police could enforce it, not parking inspectors. Pakenham, with help from the New York Environmental Defense Fund, was successful in getting that changed, so the city’s army of parking inspectors could help with enforcement.”
“Their platform allows these drones, which have individual digital signatures and operate under a designated protocol, to independently assess the chemical composition of water via built-in sensors. Available parameters include pH, oxygen levels, conductivity, temperature and other indicators.
The project is based on the idea of a decentralized network, where sensor-equipped devices collect data and send it to a distributed ledger for safe storage. In other words, it is a combination of blockchain and collective intelligence of a decentralized self-managed drone infrastructure that acts as a multi-level solution to the problem.
It is a swarm of drones that perform joint monitoring and cross verify each other’s results in a bid to eliminate false alarms and provide authentic and precise data.
Once this data is received, IPFS and the Ethereum blockchain are used to secure it: the former guarantees that the information remains unchanged, while the latter records which sensors collected the data and when it was registered.”
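The storage pattern described above, readings on IPFS with the chain recording which sensor reported them and when, can be sketched as follows. The snippet uses ipfshttpclient and web3.py against local daemons; the choice to embed the content hash in an ordinary transaction's data field, and all addresses and node URLs, are illustrative assumptions rather than the project's actual implementation.

```python
# Hedged sketch of the pattern in the excerpt: readings go to IPFS, and an
# Ethereum transaction records which sensor reported them and when. Node URLs,
# the self-send transaction, and the data-field encoding are illustrative.
import json, time
import ipfshttpclient
from web3 import Web3

ipfs = ipfshttpclient.connect()                        # local IPFS daemon
w3 = Web3(Web3.HTTPProvider("http://localhost:8545"))  # local Ethereum node

reading = {"sensor_id": "drone-17", "ph": 7.2, "conductivity_us_cm": 430,
           "timestamp": int(time.time())}

cid = ipfs.add_json(reading)        # content hash: changes if the data changes

tx_hash = w3.eth.send_transaction({
    "from": w3.eth.accounts[0],     # assumes an unlocked local node account
    "to": w3.eth.accounts[0],       # self-send; only the data field matters here
    "value": 0,
    "data": Web3.to_hex(text=json.dumps({"sensor_id": reading["sensor_id"],
                                         "timestamp": reading["timestamp"],
                                         "ipfs_cid": cid})),
})
print("reading pinned at", cid, "anchored in tx", tx_hash.hex())
```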
“The team pairs old technology with the latest in computing. Using a submersible digital holographic microscope, they take a 2D image. They then use a machine learning system known as a neural network to convert the 2D image into a representation of the microbiome present in the 3D environment. “Using a machine learning network, you can take a 2D image and reconstruct it almost in real time to get an idea of what the microbiome looks like in a 3D space,” says Xia.
The software can be run on a small Raspberry Pi that could be attached to the holographic microscope. To figure out how to communicate these data back to the research team, Xia drew upon her master’s degree research. In that work, under the guidance of Professor Allan Adams and Professor Joseph Paradiso in the Media Lab, Xia focused on developing small underwater communication devices that can relay data about the ocean back to researchers.
Rather than the usual $4,000, these devices were designed to cost less than $100, helping lower the cost barrier for those interested in uncovering the many mysteries of our oceans. The communication devices can be used to relay data about the ocean environment from the machine learning algorithms.
By combining these low-cost communication devices along with microscopic images and machine learning, Xia hopes to design a low-cost, real-time monitoring system that can be scaled to cover entire seaweed farms.”
“The solution utilises data from microphones and installed cameras, used as IoT sensors along the road. If an approaching vehicle exceeds the pre-determined noise threshold, the street-deployed microphones and cameras begin recording.
Nokia Scene Analytics adds intelligence to the event data transmitted from the sensors using a decibel-powered algorithm for audio analysis and automated number plate recognition (ANPR). This information is sent to authorities who receive quantified observations and orientations in order to make informed decisions on ‘if’ and ’how’ they will address the issue.”
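The control flow, a decibel threshold that triggers recording followed by number-plate recognition, is easy to sketch. The placeholder sensor and ANPR functions below stand in for real hardware and for Nokia's analytics; none of this is the vendor's implementation.

```python
# Control-flow sketch of threshold-triggered noise enforcement. The sensor and
# ANPR functions are placeholders returning dummy values, not real integrations.
import random, time

NOISE_THRESHOLD_DB = 85.0   # illustrative limit, not a regulatory value

def read_decibels() -> float:
    """Placeholder for a roadside microphone reading."""
    return random.uniform(60.0, 100.0)

def capture_clip(seconds: int) -> bytes:
    """Placeholder for a camera capture triggered by the noise event."""
    return b"<video-bytes>"

def read_plate(clip: bytes) -> str | None:
    """Placeholder for automated number plate recognition (ANPR)."""
    return "AB12 CDE"

def monitor(samples: int = 100) -> None:
    for _ in range(samples):
        level = read_decibels()
        if level > NOISE_THRESHOLD_DB:       # vehicle exceeds the noise threshold
            plate = read_plate(capture_clip(seconds=5))
            if plate:
                print({"plate": plate, "decibels": round(level, 1),
                       "time": time.time()})  # observation forwarded to authorities
        time.sleep(0.01)

monitor()
```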
“Once they’ve purchased a setup, users receive a cryptocurrency called $PLANETS in exchange for running the sensors in their homes. The coin, which currently trades for roughly 34 cents each with a $51 million market cap, can be used to purchase more sensors, or resold on crypto marketplaces as a stream of passive income. For Type 4 sensors (the cheapest), the maximum reward to the user is 23 tokens per day (amounting to roughly $8 per day at the current rate), but for more expensive Type 1 sensors, that rises to 166 ($45 per day). However, a cap on daily total token distribution, as well as a formula of declining rewards based on density of nearby sensors, means that the more users join, the smaller the rewards.”
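The reward rule is only described qualitatively in the excerpt (a per-type base reward, a network-wide daily cap, and rewards that decline with nearby sensor density). One plausible shape for such a rule is sketched below; the decay function and constants are invented for illustration and are not the project's actual formula.

```python
# One plausible shape for the reward rule described above: a per-type base
# reward scaled down as nearby sensor density grows. The decay form and the
# density_scale constant are invented; the network-wide daily cap mentioned
# in the excerpt is not modelled here.
import math

BASE_DAILY_TOKENS = {"type_1": 166, "type_4": 23}   # maxima quoted in the excerpt

def daily_reward(sensor_type: str, nearby_sensors: int, density_scale: float = 10.0) -> float:
    """Rewards fall off exponentially as more sensors operate nearby."""
    base = BASE_DAILY_TOKENS[sensor_type]
    return base * math.exp(-nearby_sensors / density_scale)

for n in (0, 5, 20):
    print(n, "nearby sensors ->", round(daily_reward("type_4", n), 1), "tokens/day")
```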
“In a paper published on October 17th in Nature Communications, a group of researchers led by Jörg Müller, an ecologist at the University of Würzburg, describe a better way: have a computer do the job. Smartphone apps already exist that will identify birds, bats or mammals simply by listening to the sounds they make. Their idea was to apply the principle to conservation work.“
“The groundbreaking study, led by Global Fishing Watch, uses machine learning and satellite imagery to create the first global map of large vessel traffic and offshore infrastructure, finding a remarkable amount of activity that was previously “dark” to public monitoring systems.
The analysis reveals that about 75 percent of the world’s industrial fishing vessels are not publicly tracked, with much of that fishing taking place around Africa and south Asia. More than 25 percent of transport and energy vessel activity is also missing from public tracking systems.”
“Artificial intelligence will be used for the first time to track hedgehog populations as part of a pioneering project aimed at understanding how many of them are left in the UK and why they have suffered a decline.
Images of the prickly mammals snuffling around urban parks, private gardens, woodlands and farmland will be captured by cameras and filtered by AI trained to differentiate between wildlife and humans.
The images will then be sent to human “spotters” who will pick out those featuring hedgehogs and send them to analysts, who will record the numbers and locations.”