AI News, New Funding Opportunities

RegTech Universe

RegTech, the “new Fintech” as we have called it, promises to disrupt the regulatory landscape by providing technologically advanced solutions to the ever-increasing demands of compliance within the financial industry.

You can follow us on LinkedIn and Twitter to be sure to catch our next updates, and we would of course be more than happy to hear your take on RegTech, so please do not hesitate to get in touch with us.

Four Fields of Study in Canada

With ten distinct provinces and three territories, spanning an area of 9.98 million square kilometres, Canada is the second largest country in the world by total area.

According to an OECD Better Life Index study, “When asked to rate their general satisfaction with life on a scale from 0 to 10, Canadians gave it a 7.4 grade on average, higher than the OECD average of 6.5.”

Scholarships are available, but perhaps the most attractive draw for some of the top ten degree fields in Canada is the easier access to a residence permit that students and PhD graduates get through Canada’s Express Entry program.

You’ll also find programs with a more affordable average annual tuition and living-expenses price tag than in Canada’s neighbor, the United States, and some degree programs are also cheaper than their counterparts in the United Kingdom and Australia.

Students of AI, a computer science field of study, learn to develop systems and experiment with machine intelligence, which, in this case, is privileged over human intelligence.

As a multidisciplinary field of study, you will get the chance to immerse yourself in many facets of this field that will give you the tools to potentially help solve the most pressing environmental problems our species and the world are facing today.

Rest assured, the government of Canada prioritizes this field of study, writing, “Canada assesses all of its development assistance activities for potential risks and opportunities with respect to environmental sustainability and works with its partner countries to ensure that they have the capacity to do the same.”

According to the HuffPost, the average salary for a nurse in 2017 was $84,510, and, “Job markets in health care are predicted to remain stable, especially given Canada's aging population.”

Jasmine Alami, award-winning nursing graduate from the Ingram School of Nursing at McGill University, said, “I’ve always known that I enjoy working in collaboration with people, but it’s while I was volunteering at the hospital during my bachelor’s at McGill that I uncovered the amazing work nurses do for patients.”

Don’t forget to pay attention to the prerequisites specific to programs in Canada, and try to do a campus visit, if you can afford it, to get a sense of the facilities and the student body.

Frontiers in Neuroscience

In this article, we argue for Biomimetic Research for Energy-efficient AI Designs (BREAD) as AI moves toward edge computing in remote environments far away from conventional energy sources, and as energy consumption becomes increasingly expensive.

For example, Google projected in 2013 that people searching by voice for three minutes a day using speech recognition deep neural networks would double their datacenters' computation demands.

Highly efficient data factories, known as hyperscale facilities, use an organized, uniform computing architecture that scales up to hundreds of thousands of servers.

While these hyperscale centers can be optimized for high computing efficiency, there are limits to growth due to a variety of constraints that also affect other electrical grid consumers.

However, the shift to hyperscale facilities is a current trend, and if 80% of the servers in conventional US data centers were moved to hyperscale facilities, energy consumption would drop by 25%, according to a 2016 Lawrence Berkeley National Laboratory report.
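
To get a feel for how such a migration scenario translates into savings, a back-of-the-envelope model helps. The sketch below is purely illustrative: the fleet size, per-server power, and PUE values are assumptions for demonstration, not figures from the Berkeley Lab report.

```python
# Illustrative-only estimate of datacenter energy savings from hyperscale migration.
# The server count, per-server power, and PUE values below are assumed placeholders,
# NOT figures from the Lawrence Berkeley National Laboratory report.

SERVERS = 10_000_000          # assumed US conventional-datacenter server fleet
WATTS_PER_SERVER = 250        # assumed average IT load per server (W)
PUE_CONVENTIONAL = 1.8        # assumed power usage effectiveness, conventional facility
PUE_HYPERSCALE = 1.1          # assumed PUE for an optimized hyperscale facility

def annual_twh(servers, watts, pue):
    """Total facility energy in TWh/year for a given fleet, IT load, and PUE."""
    return servers * watts * pue * 8760 / 1e12  # W * hours/year -> TWh

baseline = annual_twh(SERVERS, WATTS_PER_SERVER, PUE_CONVENTIONAL)

# Scenario: 80% of conventional servers migrate to hyperscale facilities.
migrated = annual_twh(0.8 * SERVERS, WATTS_PER_SERVER, PUE_HYPERSCALE)
remaining = annual_twh(0.2 * SERVERS, WATTS_PER_SERVER, PUE_CONVENTIONAL)
scenario = migrated + remaining

print(f"baseline:  {baseline:.1f} TWh/yr")
print(f"scenario:  {scenario:.1f} TWh/yr")
print(f"reduction: {100 * (1 - scenario / baseline):.0f}%")
```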

For example, Google employs a cloud-based AI to collect information about the data center cooling system from thousands of physical sensors prior to feeding this information into deep neural networks.

Although hyperscale centers and smart cooling strategies can lower energy consumption, these solutions do not address applications where AI is operating at the edge or when AI is deployed in extreme conditions far away from convenient power supplies.

There is a growing research effort toward developing efficient machine learning methods for embedded and neuromorphic processors (Esser et al., 2015).

There are proposals to use nuclear fission power generation to enable deep-sea battery recharging stations for military AUVs (autonomous underwater vehicles), though these remain at the development stage and have similar safety considerations to those mentioned above for space (Hambling, 2017).

Operating in such domains will carry the additional challenge of energetic constraints, because solar power may not always be readily available in environments such as Earth's Moon, solar-system planets with weather, and deep space (including interstellar space).

These constraints include the essential role of glucose as fuel under non-starvation conditions, the continuous demand for approximately 20% of the human body's total energy utilization, and, among other factors, the lack of any effective energy reserve (Sokoloff, 1960).
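
To put that roughly 20% share into concrete units, a quick conversion shows the brain's continuous power draw is on the order of 20 W; the 2,000 kcal/day intake used below is an assumed typical value, included only for illustration.

```python
# Rough conversion of the brain's ~20% share of whole-body energy use into watts.
# The 2,000 kcal/day intake is an assumed typical value, used only for illustration.

KCAL_PER_DAY = 2000
BRAIN_SHARE = 0.20
JOULES_PER_KCAL = 4184
SECONDS_PER_DAY = 86400

brain_watts = KCAL_PER_DAY * BRAIN_SHARE * JOULES_PER_KCAL / SECONDS_PER_DAY
print(f"Approximate brain power draw: {brain_watts:.1f} W")  # ~19.4 W
```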

While being severely metabolically constrained is at one level a disadvantage, evolution has optimized brains in ways that lead to incredibly efficient representations of important environmental features that stand distinct from those employed in current digital computers.

Neural activity (i.e., the generation of an action potential, the return to the resting state, and synaptic processing) is energetically very costly, and this can drive the minimization of the number of spikes necessary to encode either an engram or the neural representation of a new stimulus (Levy and Baxter, 1996).

Because brains face strict constraints on metabolic cost (Lennie, 2003) and anatomical bottlenecks (Ganguli and Sompolinsky, 2012), which often force the information stored in a large number of neurons to be compressed into an order-of-magnitude smaller population of downstream neurons (e.g., storing information from 100 million photoreceptors in 1 million optic nerve fibers), reducing the number of variables required to represent a particular stimulus space figures prominently in current efficient-coding theories of brain function (Linsker, 1990).
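
A toy analogue of this kind of anatomical bottleneck is ordinary dimensionality reduction: projecting many redundant input channels onto far fewer output channels while keeping most of the variance. The sketch below uses PCA on synthetic correlated data purely for illustration; it is not a model of retinal coding.

```python
import numpy as np

# Toy illustration of an anatomical bottleneck: compress many correlated input
# channels ("photoreceptors") into far fewer output channels ("optic nerve fibers")
# while retaining most of the signal variance. Synthetic data, illustration only.

rng = np.random.default_rng(0)
n_inputs, n_outputs, n_samples = 1000, 10, 2000

# Correlated inputs: a few latent sources mixed into many channels, plus noise.
latents = rng.normal(size=(n_samples, n_outputs))
mixing = rng.normal(size=(n_outputs, n_inputs))
X = latents @ mixing + 0.1 * rng.normal(size=(n_samples, n_inputs))

# PCA via SVD: keep only the top n_outputs principal components.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
var_kept = (S[:n_outputs] ** 2).sum() / (S ** 2).sum()

print(f"{n_inputs} channels -> {n_outputs} channels, "
      f"variance retained: {100 * var_kept:.1f}%")
```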

For example, the nervous system can respond quickly to perturbations by shifting the precise timing of spikes rather than increasing their absolute number (Malyshev et al., 2013).
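
To make the timing idea concrete, a minimal first-spike latency code shows how the same number of spikes can carry different information purely through when they occur. This is a schematic illustration, not a reconstruction of the cited experiments.

```python
import numpy as np

# Schematic first-spike latency code: each neuron emits exactly one spike,
# and stimulus intensity is read out from spike time, not spike count.
# Purely illustrative; not a model of the experiments cited in the text.

def encode_latency(intensity, t_max=50.0):
    """Stronger stimuli produce earlier spikes (latency in ms)."""
    return t_max * (1.0 - np.clip(intensity, 0.0, 1.0))

def decode_latency(latency, t_max=50.0):
    """Invert the encoding to recover intensity."""
    return 1.0 - latency / t_max

stimuli = np.array([0.2, 0.5, 0.9])
latencies = encode_latency(stimuli)
recovered = decode_latency(latencies)

for s, t, r in zip(stimuli, latencies, recovered):
    print(f"intensity {s:.1f} -> one spike at {t:4.1f} ms -> decoded {r:.1f}")
```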

In these ways, the overall energy utilization of the human brain stays relatively constant, while the local rate of energy consumption varies widely and is dependent upon functional neuronal activity and the balance between excitatory and inhibitory neurons (Olds et al., 1994).

At a macroscopic scale, the brain saves energy by minimizing the wiring between neurons and brain regions (i.e., number of axons), yet still communicates information at a high-level of performance (Laughlin and Sejnowski, 2003).

White matter, which consists of myelinated axons that transmit information over long distances in the nervous system, makes up about half the human brain but uses less energy than gray matter (neuronal somata and dendrites) because of the scarcity of ion channels along these axons (Harris and Attwell, 2012).

The idea of minimizing free energy has close ties to many existing brain theories, such as the Bayesian brain, predictive coding, cell assemblies, and Infomax, as well as an evolutionary-inspired theory called Neural Darwinism or neuronal group selection (Friston, 2010).
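
For reference, the quantity minimized in this framework is the variational free energy, which in its standard form (as in Friston, 2010) upper-bounds the surprise of sensory observations:

```latex
F = \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right]
  = D_{\mathrm{KL}}\!\left[q(s)\,\|\,p(s \mid o)\right] - \ln p(o)
  \;\ge\; -\ln p(o)
```

Here q(s) is the agent's approximate posterior over hidden states s, p(o, s) is its generative model, and -ln p(o) is the surprise of observation o; because the KL divergence is non-negative, minimizing F both improves the internal model and keeps observations unsurprising.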

For field robotics, a predictive controller could allow the robot to reduce unplanned actions (e.g., obstacle avoidance) and produce more efficient behaviors (e.g., optimal foraging or route planning).
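
As a sketch of what such a controller might look like, the hypothetical receding-horizon planner below chooses, at each step, the action sequence that minimizes predicted actuation energy plus a penalty for missing a goal; the dynamics, energy model, and cost weights are all assumptions made for illustration.

```python
import itertools

# Hypothetical receding-horizon (model-predictive) controller for a 1-D robot.
# Dynamics, energy model, and cost weights are assumed purely for illustration.

ACTIONS = [-1.0, 0.0, 1.0]          # decelerate / coast / accelerate
HORIZON = 4

def step(pos, vel, a, dt=1.0):
    """Simple assumed dynamics: double integrator."""
    return pos + vel * dt, vel + a * dt

def energy(a):
    """Assumed energy model: coasting is free, actuation costs |a|."""
    return abs(a)

def plan(pos, vel, goal):
    """Search all action sequences over the horizon; return the best first action."""
    best_cost, best_action = float("inf"), 0.0
    for seq in itertools.product(ACTIONS, repeat=HORIZON):
        p, v, cost = pos, vel, 0.0
        for a in seq:
            p, v = step(p, v, a)
            cost += energy(a)
        cost += 10.0 * abs(goal - p) + abs(v)   # penalty for missing the goal
        if cost < best_cost:
            best_cost, best_action = cost, seq[0]
    return best_action

pos, vel, goal = 0.0, 0.0, 5.0
for t in range(10):
    a = plan(pos, vel, goal)
    pos, vel = step(pos, vel, a)
    print(f"t={t} action={a:+.0f} pos={pos:4.1f} vel={vel:+.1f}")
```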

In summary, the brain represents an important existence proof that extraordinarily efficient natural intelligence can compute in very complex ways within harsh, dynamic environments.

A key component in pursuing brain- and neural-inspired computing, coding, and neuromorphic algorithms lies in the currently shifting landscape of computing architectures.

For modeling biological neural systems, the performance improvements may already be considerable (e.g., the Neurogrid platform claims a five-orders-of-magnitude efficiency improvement compared to a personal computer; Benjamin et al., 2014).

Low-resolution (e.g., 32 × 32 pixel) benchmark tasks on TrueNorth resulted in approximately 6,000 to 10,000 frames/second/watt, whereas the Nvidia Jetson TX1 (an embedded GPU platform) can process between 5 and 200 (ImageNet) frames per second at approximately 10–14 watts net power consumption (Canziani et al., 2016).
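
Normalizing the GPU figures to the same frames-per-second-per-watt metric makes the gap easier to see; the short calculation below uses only the ranges quoted above and should be read as a rough comparison, since the benchmarks differ.

```python
# Normalize the figures quoted above to a common frames/second/watt metric.
# Values are the ranges given in the text; this is a rough comparison, and the
# benchmarks (32x32 images vs. ImageNet) are not directly comparable.

truenorth_fps_per_watt = (6000, 10000)

jetson_fps = (5, 200)        # ImageNet frames per second
jetson_watts = (10, 14)      # net power consumption (W)

jetson_low = jetson_fps[0] / jetson_watts[1]    # worst case: 5 fps at 14 W
jetson_high = jetson_fps[1] / jetson_watts[0]   # best case: 200 fps at 10 W

print(f"Jetson TX1: {jetson_low:.1f} - {jetson_high:.1f} frames/s/W")
print(f"TrueNorth:  {truenorth_fps_per_watt[0]} - {truenorth_fps_per_watt[1]} frames/s/W")
print(f"Ratio (best cases): ~{truenorth_fps_per_watt[1] / jetson_high:.0f}x")
```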

We note, however, that it is difficult to achieve fair network and dataset parity across platforms, and that neuromorphic systems supporting even millions of neurons may still be too small-scale for application-level machine learning problems.

This event-driven, distributed, processor-in-memory approach provides robust, low-power processing compatible with many neural-inspired machine learning and artificial intelligence applications (Neftci, 2018).

Hence, size, weight and power (SWaP) constrained environments, such as edge and IoT devices, can leverage increased effective remote computation capabilities and provide real-time, low-latency intelligent and adaptive behavior.
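
A leaky integrate-and-fire neuron driven only by incoming events gives a minimal sense of why this event-driven style is cheap: between events, no work is done at all. The parameters below are arbitrary, and the sketch is not tied to any particular neuromorphic platform.

```python
import math

# Minimal event-driven leaky integrate-and-fire (LIF) neuron: state is only
# updated when an input event arrives, so idle periods cost no computation.
# Parameters are arbitrary; this is not tied to any particular neuromorphic chip.

TAU = 20.0        # membrane time constant (ms)
THRESHOLD = 1.0   # firing threshold
WEIGHT = 0.4      # synaptic weight per input event

def run(event_times_ms):
    v, last_t, spikes = 0.0, 0.0, []
    for t in event_times_ms:
        v *= math.exp(-(t - last_t) / TAU)   # decay over the silent interval
        v += WEIGHT                          # integrate the incoming event
        if v >= THRESHOLD:
            spikes.append(t)
            v = 0.0                          # reset after firing
        last_t = t
    return spikes

# Dense input early on, then a long silent gap (which costs nothing to "simulate").
events = [1, 3, 5, 7, 9, 200, 202, 204, 206]
print("output spikes at (ms):", run(events))
```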

Moreover, the often noisy nature of learned artificial intelligence systems (some incorporate noise by design; Srivastava et al., 2014) may lead to more robust computation in extreme environments such as space.

Heterogeneous platforms already pair conventional cores with accelerators such as Google's tensor processing units (Jouppi et al., 2017), and new neuromorphic architectures, such as Intel's Loihi, are themselves heterogeneous, which improves communication between neural and conventional cores (Davies et al., 2018).

Emerging neural computing platforms may benefit traditional large-scale computation both indirectly [e.g., system health (Das et al., 2018), failure prediction (Bouguerra et al., 2013)] and directly (e.g., meshing, surrogate models; Melo et al., 2014), and recent work indicates that neuromorphic processors may be useful for direct computation due to their high-communication, highly parallel nature (Lagorce and Benosman, 2015).

Moreover, some of the themes from the “Existence proof: human brains as efficient energy consumers” section (e.g., minimizing wiring, keeping firing rates constant, using sparse and reduced representations) could be incorporated into neuromorphic designs.

However, spiking sensors are innately compatible with spiking neuromorphic processors, and combining neuromorphic sensors with a neuromorphic processor can avoid the costly conversion between binary data formats and spikes.
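
To see what that conversion amounts to, the sketch below emulates an event-style readout from conventional frames: only pixels whose brightness changes beyond a threshold emit ON or OFF events, which is the sparse, spike-like output a neuromorphic sensor would hand directly to a spiking processor. The frames are synthetic and the threshold is an assumption; this is not a model of any specific sensor.

```python
import numpy as np

# Schematic frame-to-event conversion: emit ON/OFF events only where brightness
# changes beyond a threshold. A true spiking sensor produces such events natively,
# so a spiking processor can consume them without any format conversion.
# Synthetic frames; not a model of any specific event camera.

rng = np.random.default_rng(1)
H, W, THRESH = 32, 32, 0.2

prev = rng.random((H, W))
curr = prev.copy()
curr[10:14, 10:14] += 0.5          # a small bright patch changes between frames

diff = curr - prev
on_events = np.argwhere(diff > THRESH)    # brightness increased
off_events = np.argwhere(diff < -THRESH)  # brightness decreased

total_pixels = H * W
n_events = len(on_events) + len(off_events)
print(f"frame pixels: {total_pixels}")
print(f"events emitted: {n_events} ({100 * n_events / total_pixels:.1f}% of pixels)")
```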

These learning devices, extensions of current trends in smart devices (e.g., digital assistants, smart home control, wearables), will be enhanced with personalized online learning and enabled with adaptive, intelligent and context-dependent perception and behaviors.

Both designing algorithms that mimic the brain's behavior and building new computer hardware that mimics neural dynamics can lead to energy efficiency (see Figure 3) (Calimera et al., 2013).

However, many of the brain's energy-efficiency strategies, such as minimizing wiring, maintaining constant overall activity, and predicting outcomes, are not implemented in current neuromorphic architectures and should be explored in the future.

This selectionist approach, which was inspired by the immune system, led to an influential brain theory in which synaptic selection takes place during neural development and through experience-dependent synaptic plasticity (Edelman, 1987, 1993).

As edge computing and mobile sensing devices become ubiquitous, efficient mobility, whether on land, in the air, or in water, will become increasingly important.

Figure 4 shows examples of how biological organisms have evolved to leverage their environment, and this morphological computation can lead to efficient movement and information processing (Pfeifer et al., 2014).

For example, swarm intelligence (see Figure 4, panel 2), which is inspired by social insects, can solve a number of problems with a collection of low power simple agents.
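
As a concrete instance of the kind of computation a swarm of simple agents can perform, the sketch below runs a small particle swarm optimization, a standard swarm-intelligence algorithm, on a toy objective; the hyperparameters are common textbook defaults chosen for illustration rather than values from the cited work.

```python
import numpy as np

# Minimal particle swarm optimization (PSO) on a toy 2-D objective: many simple
# agents share only their best-known positions, yet the swarm finds the minimum.
# Hyperparameters are common textbook defaults, assumed for illustration.

rng = np.random.default_rng(2)

def objective(x):                     # simple bowl with minimum at (3, -2)
    return (x[:, 0] - 3.0) ** 2 + (x[:, 1] + 2.0) ** 2

n_particles, n_steps = 20, 100
w, c1, c2 = 0.7, 1.5, 1.5             # inertia, cognitive, social coefficients

pos = rng.uniform(-10, 10, size=(n_particles, 2))
vel = np.zeros_like(pos)
pbest = pos.copy()
pbest_val = objective(pos)
gbest = pbest[pbest_val.argmin()].copy()

for _ in range(n_steps):
    r1, r2 = rng.random((n_particles, 1)), rng.random((n_particles, 1))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = objective(pos)
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("swarm's best solution:", np.round(gbest, 3))   # should approach (3, -2)
```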

Bipedal walking is somewhat of a controlled fall, where energy is conserved by allowing gravity to take over after the swing phase of a step (see Figure 4, panel 3).

Birds of prey and long-range migrating birds take advantage of thermal plumes to reduce energy usage during flight (see Figure 4, panel 1) (Akos et al., 2010).

Some fish species and flying insects alter their environment (i.e., the water or air vortices) to create additional thrust (Triantafyllou et al., 2002).

Taken together, future AI systems that take inspiration from biology and other energy harvesting approaches will have a distinct advantage for long-term operation in harsh or remote environments.

Even without the development of General Artificial Intelligence, the trend is toward human-machine partnerships that collectively will have the ability to substantially extend the reach of humans in multiple domains (e.g., space, cyber, deep sea, nano).

To coordinate investments and channel knowledge from the life sciences and AI energetics into a holistic AI design, we advocate the launch of a global technological innovation initiative, which we call Biomimetic Research for Energy-efficient AI Designs (BREAD).

We believe that a more fruitful approach to AI design is to leverage the solutions evolved by biology (nervous system, metabolism, morphology) in future AI designs, following what we call “biomimetic strategies.”

Since the initial costs of integrating biomimetic solutions into AI are likely to be front-loaded, we recommend that industry stakeholders partner with governments to establish a pre-competitive research laboratory that preserves intellectual property, similar to IMEC in Belgium.

Upgrading the Particle Physics Toolkit: The Future Circular Collider - Harry Cliff, John Womersley

When the LHC reaches the limits of its discovery potential in 2035, what happens next? John Womersley and Harry Cliff discuss the next mega-collider - the ...

Let's Make America Smart Again with Cory Booker | StarTalk Live! with Neil deGrasse Tyson

Get your "Let's Make America Smart Again" sticker now! StarTalk Radio is continuing its mission to Make America Smart Again

Vinod Khosla - CDL Super Session 2019

CDL Super Session 2019 was where a global community of ambitious, imaginative people striving to transform the way research is commercialized converged.

2019 David J Rose Lecture

A conversation with Vinod Khosla (Khosla Ventures) hosted by Dennis Whyte (Head, Nuclear Science and Engineering & Director, Plasma Science and Fusion ...

The Secret Behind 5G Technology

What is 5G? 5G networks are the next generation of mobile internet connectivity, offering faster speeds and more reliable connections on smartphones and other ...

Alex Epstein: "The Moral Case for Fossil Fuels" | Talks at Google

Energy philosopher Alex Epstein, author of The Moral Case for Fossil Fuels, challenges conventional wisdom about the fossil fuel industry and argues that if we ...

Barnes Lecture 2019 - Jon Kabat-Zinn

In his talk, "The Public Health Roots of Mindfulness-Based Stress Reduction," Dr. Jon Kabat-Zinn described the core elements of MBSR, its roots in public health, ...

Healthcare in the Cloud: Avoid Risk and Address Security and Compliance with GCP (Cloud Next '19)

Healthcare organizations have to meet rigorous security, privacy, and compliance requirements. Many of these organizations are unsure about how to map and ...

Google Cloud Customer Innovation Series - Tuesday (Cloud Next '19)

Sometimes the best learning is customer to customer, learning and sharing best practices. In the Google Cloud Customer Innovation Series, you'll meet leading ...