AI News, Event Preview: Artificial Intelligence and the Fourth Industrial Revolution
Artificial Intelligence and the Fourth Industrial Revolution — A conversation with Eric Schmidt
As Former Executive Chairman of Alphabet, Google’s parent company, Eric Schmidt has a unique perspective on the changes that AI, machine learning and neural nets might wreak or bestow on the economy and society.
He’s responsible for developing technology that could reshape professions and infrastructure, which means that he’s thought about questions of social change and AI ethics with a sense of urgency few of us have.
This was partly because computers weren’t fast enough at that time, but there were also procedural errors involved in the development of AI.” Then a series of people including Geoff Hinton and others invented deep neural networks, which allow you to classify through many layers.
Today, we’re at a point where computer vision is better than human vision, and language translation is nearing equivalence with human ability.
Machine learning can achieve some aspects of that, but AI is really about imagining systems that seem to have some aspects of intelligence.” Many of the short-term gains that you’re about to see will involve relatively routine and repetitive operations.
I would pay dermatologists a dollar to categorise pictures of skin as ‘cancerous, non-cancerous, mole, etc’
We had incredibly smart folks working on tuning and designing data centres to achieve that efficiency.
When we used the algorithm I mentioned to you, TensorFlow, we saw a fifteen percent improvement over the best human tuner.
This was seriously humbling for those Google people — the engineers who built the AI system are incredibly proud of themselves.
There was a similar argument in America during the 1990s about Enterprise Resource Planning and Material Requirements Planning Systems and so forth — the computer people were going to take all the jobs.
Over the last twenty years, the American economy has produced an enormous number of jobs, setting aside the 2008 financial crisis, and America has its lowest level of joblessness in 30 years.
I would argue that they’re more likely to get a job, and more likely to get a higher-paid job.” The manufacturing transition in the last thirty years has involved the replacement of crippling human jobs by non-crippled robots.
We decry the loss of those jobs, but presumably the robots are a lot healthier than the humans whose backs were broken in steel mills over the last 100 years.
I’m not suggesting that there isn’t very serious dislocation here or that some people don’t wish to participate in this change — I understand those sentiments.
The general academic consensus is that jobs involving boring, repetitive tasks will disappear or be altered, which is consistent with what occurred in manufacturing.
I assume you have them here in Britain — we certainly do in America.” The stereotype I have of a bureaucrat is a gentleman who’s constantly frustrated, spends all day filling out the same paperwork, comes in at nine and leaves at five.
The people in the faculty are of equal or greater quality — there’s no issue of human capital.
They’ll soon produce world-class universities — and by the way, the Chinese are just as smart as us, if not smarter.” The second factor is entrepreneurship.
Machine learning applies readily to problems with large data sets, a clear optimisation function and ample opportunity for training.
Optimists argue that if you apply machine learning to concepts and planning, you’ll eventually invent some form of imagination.
Get involved with the only AI business event powered by Industry Leaders
- “A must for a senior technology exec: get out of the day-to-day business and hear what is going on in the world, and the future, for AI and software engineering.”
- “An excellent event: the future explained!”
- “Exceeded my expectations. The AI Summit is the space for like-minded business people to share current successes, for others to experience what is possible today, and to meet vendors who can work with them immediately to take AI forward.”
- “The best event in AI: the right people, inspirational technological developments and many AI initiatives in other industries.”
- “This is the biggest gathering of AI in the enterprise, providing a sense of where the market is today, the key challenges, and best practices for tackling AI in a business context.”
- “The most insightful AI gathering of the year.”
- “The only event for aligning business and AI!”
The fourth industrial revolution: a primer on Artificial Intelligence (AI)
There are more than 15 approaches to machine learning, each of which uses a different algorithmic structure to optimise predictions based on the data received.
Some of the most effective machine learning algorithms beyond deep learning include random forests, Bayesian networks and support vector machines. Each approach has its advantages and disadvantages, and combinations may be used (an ‘ensemble’ approach).
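As an illustration of the ‘ensemble’ idea, the sketch below (using scikit-learn, with a synthetic data set and illustrative parameters, not anything from this article) combines the three families of algorithm mentioned in the text and lets them vote on each prediction:

```python
# A minimal ensemble sketch: a random forest, a Bayesian classifier and a
# support vector machine vote on each prediction. Data are synthetic.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[
        ("forest", RandomForestClassifier(n_estimators=50, random_state=0)),
        ("bayes", GaussianNB()),
        ("svm", SVC(probability=True, random_state=0)),
    ],
    voting="soft",  # average the predicted class probabilities
)
ensemble.fit(X_train, y_train)
print(f"ensemble accuracy: {ensemble.score(X_test, y_test):.2f}")
```

Soft voting averages each model’s predicted probabilities, so the ensemble can outperform any single member when their errors are uncorrelated.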
With the right data we can build algorithms for myriad purposes including: suggesting the products a person will like based on their prior purchases;
Even with general machine learning — random forests, Bayesian networks, support vector machines and more — it’s difficult to write programs that perform certain tasks well, from understanding speech to recognising objects in images.
If we want to write a computer program that identifies images of cars, for example, we can’t specify the features of a car for an algorithm to process that will enable correct identification in all circumstances.
Deep learning is useful because it avoids the programmer having to undertake the tasks of feature specification (defining the features to analyse from the data) or optimisation (how to weigh the data to deliver an accurate prediction) — the algorithm does both.
But by freeing programmers from complex feature specification, deep learning has delivered successful prediction engines for a range of important problems.
At its output layer, based on its training, the neural network will deliver a probability that the picture is of the specified type (for example, ‘human face: 97%’).
Steps include structuring the network for a particular application, providing a suitable training set of data, adjusting the structure of the network according to progress, and combining multiple approaches.
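The steps above can be sketched in code. The example below (a hedged illustration using scikit-learn’s MLPClassifier as a small stand-in for a deep network; the layer sizes are invented) shows the cycle of structuring a network, supplying a training set and checking progress:

```python
# Step 1: structure the network for the application (here, two hidden layers).
# Step 2: provide a suitable training set.
# Step 3: inspect progress; if accuracy is poor, adjust the structure and retrain.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)  # a small image-classification data set
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=0)
net.fit(X_train, y_train)
print(f"held-out accuracy: {net.score(X_test, y_test):.2f}")
```

The held-out score is the signal used in step 3: if it stalls, one would widen or deepen the hidden layers, gather more data, or combine approaches as the text suggests.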
AI is important because it tackles profoundly difficult problems, and the solutions to those problems can be applied to sectors important to human wellbeing — ranging from health, education and commerce to transport, utilities and entertainment.
Since the 1950s, AI research has focused on five fields of enquiry: AI is valuable because in many contexts, progress in these capabilities offers revolutionary, rather than evolutionary, capabilities.
Considering a single corporate function — for example, human resource (HR) activity within a company — illustrates the range of processes to which machine learning will be applied: Over time we expect the adoption of machine learning to become normalised.
The effectiveness of AI has been transformed in recent years due to the development of new algorithms, greater availability of data to inform them, better hardware to train them and cloud-based services to catalyse their adoption among developers.
While deep learning is not new — the specification for the first effective, multi-layer neural network was published in 1965 — evolutions in deep learning algorithms during the last decade have transformed results.
With additional connections and memory cells, RNNs ‘remember’ the data they saw thousands of steps ago and use this to inform their interpretation of what follows — valuable for speech recognition where interpretation of the next word will be informed by the words that preceded it.
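A toy illustration of that recurrent ‘memory’ (plain NumPy, untrained random weights, nothing from the article): the hidden state carries a summary of every earlier input forward, so changing an early input still shifts the network’s state many steps later.

```python
# A minimal recurrent step: the new hidden state mixes the current input
# with the previous state, so earlier inputs influence later outputs.
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(4, 3)) * 0.5   # input -> hidden weights
W_hh = rng.normal(size=(4, 4)) * 0.5   # hidden -> hidden (the 'memory' path)

def rnn_states(sequence):
    h = np.zeros(4)                    # hidden state starts empty
    states = []
    for x in sequence:
        h = np.tanh(W_xh @ x + W_hh @ h)
        states.append(h)
    return states

seq = [rng.normal(size=3) for _ in range(5)]
plain = rnn_states(seq)

# Perturb only the FIRST input: the FINAL state still changes, because the
# recurrent connection has carried that early information forward.
altered = [seq[0] + 1.0] + seq[1:]
print(np.abs(rnn_states(altered)[-1] - plain[-1]).max())
```

Practical RNN variants such as LSTMs add gating and memory cells precisely so that this influence survives thousands of steps rather than fading.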
Just six weeks ago, Microsoft engineers reported that their system reached a word error rate of 5.9% — a figure roughly matching human performance for the first time.
Artificial Intelligence: The fourth industrial revolution
AI’s roots are in the ‘expert systems’ of the ‘70s and ‘80s, computers that were programmed with a human’s ‘expert’ knowledge in order to allow decision-making based on the available facts.
Machine learning models need data… Just as we humans ‘learn’ our tacit knowledge through our experiences, by attempting a task again and again to gradually improve, ML models need to be ‘trained’.
Only once these steps of the journey have been completed can we truly progress to AI and machine learning, to gain further insight into the past and future performance of our organisations, and to help us solve business problems more efficiently.
Businesses can fuse employee and payroll data, absence records, training records, performance ratings and more to give a complete ‘picture’ of an employee’s interaction with the organisation.
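The fusion described above amounts to joining records on a shared employee identifier. A minimal sketch with pandas (all column names and figures invented for illustration):

```python
# Join employee, absence and rating records into one 'picture' per employee.
import pandas as pd

employees = pd.DataFrame({"emp_id": [1, 2, 3],
                          "dept": ["Sales", "IT", "HR"]})
absences = pd.DataFrame({"emp_id": [1, 1, 3],
                         "days_absent": [2, 3, 1]})
ratings = pd.DataFrame({"emp_id": [1, 2, 3],
                        "rating": [4.1, 3.7, 4.8]})

picture = (employees
           .merge(absences.groupby("emp_id", as_index=False)["days_absent"].sum(),
                  on="emp_id", how="left")   # total absence per employee
           .merge(ratings, on="emp_id", how="left")
           .fillna({"days_absent": 0}))     # no absence records means zero days
print(picture)
```

Left joins keep every employee in the picture even when a source (such as the absence log) has no record for them.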
The next stage is to use AI models to ‘predict’ those employees who might need some extra support or intervention – high-performers at risk of leaving, or people showing early signs of declining performance.
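One hedged sketch of that ‘predict’ stage: a logistic regression trained on historical leaver data, then used to flag current employees with high predicted attrition risk. All features, labels and figures below are synthetic stand-ins.

```python
# Train on past employees (did they leave?), then score current employees.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Features per past employee: [days_absent, performance_rating]
X_hist = rng.normal(loc=[3, 4], scale=[2, 0.5], size=(200, 2))
# Toy ground truth: more absence and a lower rating -> more likely to have left.
y_hist = (X_hist[:, 0] - 2 * X_hist[:, 1] + rng.normal(size=200) > -4).astype(int)

model = LogisticRegression().fit(X_hist, y_hist)

current = np.array([[9.0, 2.5],    # frequent absence, low rating
                    [1.0, 4.6]])   # rare absence, high rating
risk = model.predict_proba(current)[:, 1]
for emp, p in zip(["A", "B"], risk):
    print(f"employee {emp}: attrition risk {p:.0%}")
```

In practice the flagged list would trigger the human interventions the text describes, not automated decisions.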
Social media analytics solutions can be used to analyse how customers and consumers view and react to the companies and brands they’re interacting with through social media.
The next stop on the AI journey enables powerful analysis of trends and consumer behaviour over time, allowing organisations to track and forecast customer engagement in real-time.
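As a simple illustration of tracking engagement over time (synthetic daily mention counts, not real data), a rolling mean smooths the noise and the latest smoothed value serves as a naive near-term forecast:

```python
# Smooth noisy daily engagement counts and project the current trend forward.
import numpy as np
import pandas as pd

days = pd.date_range("2024-01-01", periods=30, freq="D")
rng = np.random.default_rng(1)
mentions = pd.Series(100 + np.arange(30) * 2 + rng.normal(0, 5, 30), index=days)

smoothed = mentions.rolling(window=7).mean()   # 7-day trend
forecast = smoothed.iloc[-1]                   # naive next-day forecast
print(f"7-day trend today: {smoothed.iloc[-1]:.1f}")
print(f"naive forecast for tomorrow: {forecast:.1f}")
```

Real deployments would replace the naive forecast with a proper time-series model, but the track-then-forecast pattern is the same.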
Disney is already collecting location data from wristbands at their attractions, predicting and managing queue lengths (suggesting other rides with shorter queues, or offering food/drink vouchers at busy times to reduce demand).
The ability to analyse increasingly creative and diverse data sources to unearth new insights is growing, but the ability to bring together these new, disparate data sources is key to realising their value.
Machine learning can also help predict the future – when and where crime is likely to happen, or the risk or vulnerability of individuals – allowing the police to direct limited resources as efficiently as possible.
Machine learning algorithms can be employed in a variety of ways – to automate facial recognition, to pinpoint crime hotspots, and to identify which people are more likely to reoffend.
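Hotspot identification, for instance, can be framed as spatial clustering. The sketch below (scikit-learn k-means on synthetic incident coordinates, not real crime data) groups reports into clusters whose centres can be read as hotspots:

```python
# Cluster incident locations; cluster centres approximate crime hotspots.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Synthetic incident reports scattered around three dense areas.
true_centres = np.array([[0.0, 0.0], [5.0, 5.0], [0.0, 6.0]])
incidents = np.vstack([c + rng.normal(0, 0.4, size=(50, 2))
                       for c in true_centres])

kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(incidents)
for i, c in enumerate(kmeans.cluster_centers_):
    print(f"hotspot {i}: centre ~ ({c[0]:.1f}, {c[1]:.1f})")
```

Choosing the number of clusters is itself a judgment call; in practice analysts vary it and validate the hotspots against domain knowledge.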
Building a strong information architecture, investing in intelligent data fusion and creating a solid analytics foundation is vital to the success of future endeavours in data.
- On 19 June 2021
The Future of AI and Sustainability
Artificial intelligence (AI) is powering the fourth industrial revolution. Intelligent machines are tackling new cognitive tasks at scale, leading to enormous ...
The Fourth Industrial Revolution
The fourth industrial revolution | WooJae Chung | TEDxISPP
Woo Jae talks about the fourth industrial revolution and some implications for medicine and the role of humans in society. WooJae is a student at ISPP.
The fourth industrial revolution is in your pocket | Ian Khan | TEDxMississauga
Ian Khan recounts how technology has changed our society over time. How are those little devices we carry in our pockets changing our world? Author of Cloud ...
Artificial Intelligence Can Change the future of Medical Diagnosis | Shinjini Kundu | TEDxPittsburgh
Medical diagnosis often is based on information visible from medical images. But what if machine learning and artificial intelligence could help our doctors gain ...
Innovation in the financial industry – Artificial Intelligence
AI is increasingly being touted as the next big thing in financial services. Will we see more technology experts innovating in the financial space? Will banks be ...
How A.I. Traders Will Dominate Hedge Fund Industry | Marshall Chang | TEDxBeaconStreetSalon
We've seen fully automated bots beat us in Go, one-on-one Poker and Dota 2; now what's going to happen in trading financial markets? Listen to A.I. Capital ...
Robotics and Artificial Intelligence 4Manufacturing
Juergen Maier, Chief Executive of Siemens UK describes to #KTN and #Innovate UK the drivers of innovation within the 4th Industrial Revolution.
CIESF20171025 Manufacturing: The Future is Artificial Intelligence
About the Speaker: Anna-Katrina Shedletsky, CEO and cofounder of Instrumental Inc, discusses trends in manufacturing, industry 4.0 and artificial intelligence.
Keynote: Artificial Intelligence with Geo
Joseph Sirosh, Corporate Vice President in the Artificial Intelligence and Research group at Microsoft discusses how artificial intelligence combined with ...