

The European Commission has unveiled its 'blacklist' of 23 jurisdictions where there is a significant risk of money laundering.

The list will require all traders in the EU doing business in goods or services worth more than €10,000 with jurisdictions on the blacklist to undertake enhanced due diligence on transactions.

February Satuccino

It has developed techniques to process vast quantities of multi-modal sensor data in near real time for infrastructure monitoring and autonomous systems, and is developing processes to overcome the environmental limitations of sensors, such as those operating in poor visibility caused by fog, heavy rain, and snow.

whether that’s balancing a power budget, the cost of current power systems, or the limitations of current payloads. Earthscope enables the agricultural sector to monitor crops and predict yield using satellite data and machine learning combined with unique ground-truth data.

If your business is applying Artificial Intelligence in the Space sector, or you’d like more information about how to get involved, please contact her. Teloconsulting is an independent firm that advises on the impact of strategic finance, regulatory, and commercial issues in the digital communications, radio spectrum, and space sectors.

If you are an SME and would like to find out more, please get in touch with Martine: or take a look at the website: Space Store aims to be the retailer for the Space Industry, bringing the space experience to everyone, everywhere, every day. We’ll be launching our new e-commerce site for space organisations in a few weeks’ time; check out the draft site at: If you’d like to partner with us, please get in touch. The UK Space Agency’s Local Growth Team is responsible for ensuring every region of the UK realises the potential of our growing space economy.

The report also reveals an increase in the number of organisations involved in space-related activities right across the UK. We are committed to building on this success, and the survey will help us and our partner organisations support growth in the space economy and realise the potential of the sector across the whole UK.

The full report can be found here: You can contact the local growth team: MULTIPLY enables space companies to scale fast, reaching new markets and ultimately creating a thriving space industry globally.

MULTIPLY provides three key services, built on over 50 years of experience, that see companies transform not only their balance sheets but also their industry. MULTIPLY is growing fast: with a broad cross-industry team including MOD, corporate, and start-up expertise, we are offering companies a free 20-minute video-conference consultation, until the end of March 2019, on what scaling could look like for your company.

University of Tasmania

The Research Project: This project is part of a recently funded Australian Research Council Discovery Project (DP190102020) awarded to the supervisors and two international experts, Dr Sue Vandewoude (Colorado State University) and Dr Meggan Craft (University of Minnesota). DP190102020 addresses a crucial issue in understanding the spread of infectious disease in wildlife: that we simply cannot observe an entire population. We will investigate ways to make better epidemiological predictions in wildlife (such as Tasmanian devils, wombats, and American mountain lions and bobcats).

We model contact networks with weighted networks (graphs), whose vertices are individual animals, and whose edges are epidemiologically significant contacts between them (i.e., contacts where disease transmission could occur).

We will enumerate the true networks that could contain given (observed) CNs, and assign each a likelihood, using a model we have already developed. This will form part of a Markov chain Monte Carlo (MCMC) sampling process to infer characteristics of the most likely true networks for a given observed sub-network.
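As a minimal sketch (not the project's actual model), the weighted-network representation and a single Metropolis-style MCMC update might look like this, with a toy squared-error likelihood standing in for the likelihood model the project has developed; all edge names and numbers are illustrative assumptions:

```python
import math
import random

# Toy weighted contact network: edge -> weight (contact frequency).
observed = {("devil_1", "devil_2"): 5, ("devil_2", "devil_3"): 2}

def log_likelihood(network, observed):
    # Stand-in likelihood: penalise deviation from the observed sub-network.
    return -sum((network.get(e, 0) - w) ** 2 for e, w in observed.items())

def mcmc_step(network, observed, rng):
    # Propose nudging one edge weight; accept with the Metropolis criterion.
    edge = rng.choice(sorted(observed))
    proposal = dict(network)
    proposal[edge] = max(0, proposal[edge] + rng.choice([-1, 1]))
    delta = log_likelihood(proposal, observed) - log_likelihood(network, observed)
    return proposal if math.log(rng.random()) < delta else network

rng = random.Random(42)
net = {e: 0 for e in observed}
for _ in range(500):
    net = mcmc_step(net, observed, rng)
print(net)  # edge weights drift toward the observed values
```

A real application would propose changes to unobserved vertices and edges as well, so that the sampler explores whole candidate "true" networks rather than just reweighting the observed sub-network.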

Global catastrophic risk

A global catastrophic risk is a hypothetical future event which could damage human well-being on a global scale.[2]

Potential global catastrophic risks include anthropogenic risks, caused by humans (technology, governance, climate change), and non-anthropogenic or external risks.[3]

Insufficient or malign global governance creates risks in the social and political domain, such as global war (including nuclear holocaust), bioterrorism using genetically modified organisms, or cyberterrorism destroying critical infrastructure like the electrical grid.

Problems and risks in the domain of earth-system governance include global warming, environmental degradation (including extinction of species), famine as a result of non-equitable resource distribution, human overpopulation, crop failures, and non-sustainable agriculture.

Examples of non-anthropogenic risks are an asteroid impact event, a supervolcanic eruption, a lethal gamma-ray burst, a geomagnetic storm destroying electronic equipment, natural long-term climate change, hostile extraterrestrial life, or the predictable Sun transforming into a red giant star engulfing the Earth.

An existential risk, on the other hand, is one that either destroys humanity (and, presumably, all but the most rudimentary species of non-human lifeforms and/or plant life) entirely or at least prevents any chance of civilization recovering.[6]

In addition, it is one thing to estimate the likelihood of an event taking place, another to assess how likely an event is to cause extinction if it does occur, and most difficult of all, to assess the risk posed by synergistic effects of multiple events taking place simultaneously.[citation needed]

The conference report cautions that the results should be taken 'with a grain of salt': they were not meant to capture all large risks, did not include things like climate change, and likely reflect many cognitive biases of the conference participants.

Unlike with most events, the failure of a complete extinction event to occur in the past is not evidence against their likelihood in the future, because every world that has experienced such an extinction event has no observers, so regardless of their frequency, no civilization observes existential risks in its history.[8]

These anthropic issues can be avoided by looking at evidence that does not have such selection effects, such as asteroid impact craters on the Moon, or directly evaluating the likely impact of new technology.[5]

Martin Weitzman argues that most of the expected economic damage from climate change may come from the small chance that warming greatly exceeds the mid-range expectations, resulting in catastrophic damage.[22]
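A toy expected-value calculation (with assumed, illustrative numbers, not Weitzman's) shows how a thin catastrophic tail can dominate expected damage:

```python
# Assumed damage scenarios: (probability, damage in trillions of dollars).
scenarios = [
    (0.90, 1),    # mid-range warming
    (0.09, 10),   # high warming
    (0.01, 500),  # catastrophic tail
]
expected = sum(p * d for p, d in scenarios)
tail_share = scenarios[-1][0] * scenarios[-1][1] / expected
print(round(expected, 2), round(tail_share, 2))  # 6.8 0.74
```

Here a scenario with only a 1% probability contributes roughly three-quarters of the expected damage, which is the shape of argument Weitzman makes.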

Numerous cognitive biases can influence people's judgment of the importance of existential risks, including scope insensitivity, hyperbolic discounting, availability heuristic, the conjunction fallacy, the affect heuristic, and the overconfidence effect.[24]

For example, when people are motivated to donate money to altruistic causes, the quantity they are willing to give does not increase linearly with the magnitude of the issue: people are roughly as concerned about 200,000 birds getting stuck in oil as they are about 2,000.[25]

Furthermore, the vast majority of the benefits may be enjoyed by far future generations, and though these quadrillions of future people would in theory perhaps be willing to pay massive sums for existential risk reduction, no mechanism for such a transaction exists.[5]

It has been suggested that learning computers that rapidly become superintelligent may take unforeseen actions, or that robots would out-compete humanity (one technological singularity scenario).[29]

Because of its exceptional scheduling and organizational capability and the range of novel technologies it could develop, it is possible that the first Earth superintelligence to emerge could rapidly become matchless and unrivaled: conceivably it would be able to bring about almost any possible outcome, and be able to foil virtually any attempt that threatened to prevent it achieving its objectives.[30]

In 2009, the Association for the Advancement of Artificial Intelligence (AAAI) hosted a conference to discuss whether computers and robots might be able to acquire any sort of autonomy, and how much these abilities might pose a threat or hazard.

They noted that some robots have acquired various forms of semi-autonomy, including being able to find power sources on their own and being able to independently choose targets to attack with weapons.

Various media sources and scientific groups have noted separate trends in differing areas which might together result in greater robotic functionalities and autonomy, and which pose some inherent concerns.[34][35]

A survey of AI experts estimated that the chance of human-level machine learning having an 'extremely bad (e.g., human extinction)' long-term effect on humanity is 5%.[36]

A biotechnology catastrophe may be caused by accidentally releasing a genetically engineered organism from a controlled environment, by the planned release of such an organism which then turns out to have unforeseen and catastrophic interactions with essential natural or agro-ecosystems, or by the intentional use of biological agents in biological warfare or bioterrorism attacks.[38]

They argue that risks from biological warfare and bioterrorism are distinct from nuclear and chemical threats because biological pathogens are easier to mass-produce and their production is hard to control (especially as the technological capabilities are becoming available even to individual users).[38]

Nouri and Chyba propose three categories of measures to reduce risks from biotechnology and natural pandemics: regulation or prevention of potentially dangerous research, improved recognition of outbreaks, and developing facilities to mitigate disease outbreaks (e.g.

Examples include a scarcity of water that could leave approximately half of the Earth's population without safe drinking water, pollinator decline, overfishing, massive deforestation, desertification, climate change, and massive water-pollution episodes.

An October 2017 report published in The Lancet stated that toxic air, water, soils, and workplaces were collectively responsible for 9 million deaths worldwide in 2015, particularly from air pollution which was linked to deaths by increasing susceptibility to non-infectious diseases, such as heart disease, stroke, and lung cancer.[46]

Biotechnology could lead to the creation of a pandemic, chemical warfare could be taken to an extreme, nanotechnology could lead to grey goo in which out-of-control self-replicating robots consume all living matter on earth while building more of themselves—in both cases, either deliberately or by accident.[55]

Projections of future climate change suggest further global warming, sea level rise, and an increase in the frequency and severity of some extreme weather events and weather-related disasters.

Effects of global warming include loss of biodiversity, stresses to existing food-producing systems, increased spread of known infectious diseases such as malaria, and rapid mutation of microorganisms.

In November 2017, a statement by 15,364 scientists from 184 countries indicated that increasing levels of greenhouse gases from use of fossil fuels, human population growth, deforestation, and overuse of land for agricultural production, particularly by farming ruminants for meat consumption, are trending in ways that forecast an increase in human misery over coming decades.[3]

Romanian American economist Nicholas Georgescu-Roegen, a progenitor in economics and the paradigm founder of ecological economics, has argued that the carrying capacity of Earth — that is, Earth's capacity to sustain human populations and consumption levels — is bound to decrease sometime in the future as Earth's finite stock of mineral resources is presently being extracted and put to use;

and consequently, that the world economy as a whole is heading towards an inevitable future collapse, leading to the demise of human civilization itself.[56]:303f Ecological economist and steady-state theorist Herman Daly, a student of Georgescu-Roegen, has propounded the same argument by asserting that '...

Ever since Georgescu-Roegen and Daly published these views, various scholars in the field have been discussing the existential impossibility of allocating Earth's finite stock of mineral resources evenly among an unknown number of present and future generations.

In effect, any conceivable intertemporal allocation of the stock will inevitably end up with universal economic decline at some future point.[58]:253–256 [59]:165 [60]:168–171 [61]:150–153 [62]:106–109 [63]:546–549 [64]:142–145 [65]
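The claim can be illustrated with a simple numeric sketch (all numbers are assumptions for illustration): any rule that spreads a finite stock over an unbounded number of periods must drive per-period use toward zero, e.g. a geometric allocation:

```python
# Finite stock S spread geometrically over time: a_t = S*(1-r)*r**t.
# The terms sum to (almost) S over a long horizon, but each period's
# share shrinks toward zero -- the decline the text describes.
S, r = 100.0, 0.95
allocation = [S * (1 - r) * r**t for t in range(200)]
print(round(sum(allocation), 2))        # approaches 100.0
print(allocation[0] > allocation[199])  # True: per-period use declines
```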

Molecular manufacturing requires significant advances in nanotechnology, but once achieved could produce highly advanced products at low costs and in large quantities in nanofactories of desktop proportions.[66][67]

Detonating large numbers of nuclear weapons would have immediate, short-term, and long-term effects on the climate, causing cold weather and reduced sunlight and photosynthesis.[80]

However, while popular perception sometimes takes nuclear war as 'the end of the world', experts assign low probability to human extinction from nuclear war.[82][83]

Should the world's large grain-producing areas become infected, the ensuing crisis in wheat availability would lead to price spikes and shortages in other food products.[91]

In April 2018, the B612 Foundation reported: 'It's 100 per cent certain we'll be hit [by a devastating asteroid], but we're not 100 per cent sure when.'[97][98]

If our universe lies within a false vacuum, a bubble of lower-energy vacuum could come to exist by chance or otherwise in our universe, and catalyze the conversion of our universe to a lower energy state in a volume expanding at nearly the speed of light, destroying all that we know without forewarning.

Several renowned public figures such as Stephen Hawking and Elon Musk have argued against sending such messages on the grounds that extraterrestrial civilizations with technology are probably far more advanced than humanity and could pose an existential threat to humanity.[118]

The present, unprecedented scale and speed of human movement make it more difficult than ever to contain an epidemic through local quarantines, and other sources of uncertainty and the evolving nature of the risk mean natural pandemics may pose a realistic threat to human civilization.[18]

This argument has been disputed on several grounds, including the changing risk due to changing population and behavioral patterns among humans, the limited historical record, and the existence of an anthropic bias.[18]

Any pathogen with high virulence, a high transmission rate, and a long incubation time may already have caused a catastrophic pandemic before its virulence was ultimately limited through natural selection.

Additionally, a pathogen that infects humans as a secondary host and primarily infects another species (a zoonosis) has no constraints on its virulence in people, since the accidental secondary infections do not affect its evolution.[122]

Experts have concluded that 'Developments in science and technology could significantly ease the development and use of high consequence biological weapons,' and these 'highly virulent and highly transmissible [bio-engineered pathogens] represent new potential pandemic threats.'[125]

These global climatic changes occurred slowly, prior to the rise of human civilization about 10,000 years ago, near the end of the last major ice age, when the climate became more stable.

A massive volcanic eruption would eject extraordinary volumes of volcanic dust and toxic and greenhouse gases into the atmosphere, with serious effects on the global climate: towards extreme global cooling (a volcanic winter if short-term, an ice age if long-term) or global warming (if greenhouse gases were to prevail).

According to a recent study, if the Yellowstone caldera erupted again as a supervolcano, an ash layer one to three millimeters thick could be deposited as far away as New York, enough to 'reduce traction on roads and runways, short out electrical transformers and cause respiratory problems'.

The main long-term effect is through global climate change, which reduces temperatures globally by about 5–15 degrees C for a decade, together with the direct effects of ash deposits on crops.

A large supervolcano like Toba would deposit one to two meters of ash over an area of several million square kilometers. (1,000 cubic kilometers is equivalent to a one-meter thickness of ash spread over a million square kilometers.)
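The parenthetical unit conversion is easy to verify:

```python
# Verify: 1000 km^3 of ash spread over 1,000,000 km^2 gives a 1 m layer.
volume_km3 = 1000
area_km2 = 1_000_000
thickness_m = volume_km3 * 1000 / area_km2  # km -> m
print(thickness_m)  # 1.0
```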

Research published in 2011 finds evidence that massive volcanic eruptions caused massive coal combustion, supporting models for significant generation of greenhouse gases.

Massive eruptions can also throw enough pyroclastic debris and other material into the atmosphere to partially block out the sun and cause a volcanic winter, as happened on a smaller scale in 1816 following the eruption of Mount Tambora, the so-called Year Without a Summer.

Such an eruption might cause the immediate deaths of millions of people several hundred miles from the eruption, and perhaps billions of deaths worldwide due to the failure of the monsoons,[140] resulting in major crop failures and starvation on a profound scale.[140]

Within the scope of these approaches, the field of geoengineering encompasses the deliberate large-scale engineering and manipulation of the planetary environment to combat or counteract anthropogenic changes in atmospheric chemistry.

More speculatively, if society continues to function and if the biosphere remains habitable, calorie needs for the present human population might in theory be met during an extended absence of sunlight, given sufficient advance planning.

Conjectured solutions include growing mushrooms on the dead plant biomass left in the wake of the catastrophe, converting cellulose to sugar, or feeding natural gas to methane-digesting bacteria.[145][146]
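A back-of-envelope sketch (all numbers are assumptions for illustration, not from the cited studies) of the scale such alternative food production would need to reach:

```python
# Assumed global calorie requirement during a year without sunlight.
population = 8_000_000_000
kcal_per_person_day = 2100
total_kcal = population * kcal_per_person_day * 365
# If an alternative food supplies ~3000 kcal per kg (assumption),
# the mass needed per year, in tonnes:
tonnes_per_year = total_kcal / 3000 / 1000
print(f"{tonnes_per_year:.2e}")  # ~2.04e+09 tonnes
```

Roughly two billion tonnes per year, which is why the text stresses that sufficient advance planning would be required.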

There are concerns from governments, the private sector, and the general public about the lack of governance mechanisms to deal efficiently with risks and to negotiate and adjudicate between diverse and conflicting interests.

Beginning after 2000, a growing number of scientists, philosophers and tech billionaires created organizations devoted to studying global risks both inside and outside of academia.[151]

Shaping a 21st Century Workforce – Is AI Friend or Foe?

Visit: 0:30 - Introduction by Rui de Figueiredo; 5:04 - Main Talk - Jennifer Granholm; 51:18 - Audience Questions. Jennifer Granholm, former ..

Why you are wrong about Universal Basic Income | The power of AI within the hands of the few

Too many people seem to unite behind the idea of universal basic income with the assumption of greater freedom and more spare time. But they don't realize ...

How to make the world better. Really. With Dr. Bjorn Lomborg.

On December 7, 2018, I spoke with Dr. Bjorn Lomborg, author and President of Copenhagen Consensus Center, a singularly innovative and influential ...

Ethics of AI

Detecting people, optimising logistics, providing translations, composing art: artificial intelligence (AI) systems are not only changing what and how we are doing ...

Why growth and the environment can't coexist

Featuring: Sam Bliss Production: Daniel Penner Animation + Illustration: Amelia Bates Music: "Nincompoop (No Vocals)" by Josh Woodward ...

The Cognitive Era: Artificial Intelligence and Convolutional Neural Networks

Jim Hogan, Raik Brinkmann, James Gambale Jr., Chris Rowen and Drew Wingard discuss artificial intelligence and convolutional neural networks at the Diaz ...

Can Technology Improve Quality of Life in Cities | Talks at Google

Real estate and urban technology leaders convene at Google to discuss how new technology may be used to improve urban living. Panelists include Craig ...

Jordan Peterson | Oxford Union Address | On Political Correctness, Hierarchy, Meaning, and More

Original video: The community was not given permission to add subtitles to the original video, so Korean subtitles were created and provided..

Paul Mason: "PostCapitalism" | Talks at Google

Paul Mason joined us in London to talk about his book PostCapitalism: A Guide to our Future. Recorded in December 2015, London. About the book (press ...

HHS Artificial Intelligence Industry Day Part 1

HHS has launched a large scale "Reimagine HHS-BUYSMARTER Acquisitions" initiative to procure an AI solution that generates efficiencies & provides cost ...