6 Top Big Data and Data Science Trends 2017

We have recently stepped into 2017, so it is time to draw conclusions about 2016.

This process is driven by increased collaboration and flexibility, as well as by reduced complexity in administering and configuring computing resources. Most of the top cloud providers have developed their own machine learning services in the cloud.

This allows organizations to leverage machine learning technology without massive investments or the need to employ large data science teams.

Here are the main examples of such machine learning and AI as a service (MLaaS and AIaaS) providers.

Those who work with data know very well that data is useless if it is not efficiently analyzed and turned into insights that support the decision-making process.

According to recent studies, the percentage of users running Spark on the public cloud (61%) was higher than the percentage using Hadoop YARN (36%), and this trend will continue in 2017.

One of the recent trends in security is the increased use of machine learning algorithms, including deep learning, for detecting anomalies and for other applications of data science to security in various business domains.
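
As a minimal sketch of what this looks like in practice (assuming scikit-learn; the "traffic" features are invented for illustration), an Isolation Forest can flag unusual records without any labeled attack data:

```python
# Minimal anomaly-detection sketch with scikit-learn's IsolationForest.
# The "network traffic" features here are synthetic and purely illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Normal traffic: clustered feature values (e.g., bytes sent, request rate).
normal = rng.normal(loc=[500, 20], scale=[50, 5], size=(1000, 2))
# A handful of outliers simulating anomalous sessions.
anomalies = rng.uniform(low=[2000, 200], high=[5000, 500], size=(10, 2))

X = np.vstack([normal, anomalies])

# contamination is the expected fraction of anomalies in the data.
model = IsolationForest(contamination=0.01, random_state=42).fit(X)
labels = model.predict(X)  # +1 = normal, -1 = anomaly

print(f"Flagged {np.sum(labels == -1)} suspicious sessions out of {len(X)}")
```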

In the future, many new types of attacks may emerge, so cyber security requirements are becoming more complicated, and security specialists will need to adapt to the new threats.

Deep learning got a lot of attention in 2016, as many notable results were achieved by applying it to important applications such as machine translation and other forms of language processing, automatic image caption generation, object classification and detection in images, facial recognition, and automatic game playing.

Although bots have existed for a long time, AI development has only now reached a level where it is possible to create truly advanced products, many of which utilize machine learning.

Prominent examples of conversational AI can be seen in products such as Google Assistant and Siri for iOS, which have become almost indispensable for smartphone users and have already gone far beyond being just fun applications or novelties.

This year is going to be “The Year of Intelligence”: AI and machine learning applications are going mainstream, contributing to every part of organizations and business areas, and becoming one of the key competitive advantages for companies that integrate machine learning into their operations.

We do not claim this to be an ultimate list, as so many things are evolving quickly in the technology realm, so we encourage you to share your vision of the main trends in the data and data science field in the comments section below.

Roundup Of Machine Learning Forecasts And Market Estimates, 2018

These and many other fascinating insights are from the latest series of machine learning market forecasts, market estimates, and projections.

Machine learning’s potential impact across many of the world’s most data-prolific industries continues to fuel venture capital investment, private equity (PE) funding, mergers, and acquisitions, all focused on winning the race for Intellectual Property (IP) and patents in this field.

Deloitte Global is predicting up to 800K machine learning chips will be in use across global data centers this year.

And while the methodologies all vary across the many sources of forecasts, market estimates, and projections, all reflect how machine learning is improving the acuity and insights of companies on how to grow faster and more profitably.

Key takeaways from the collection of machine learning market forecasts, market estimates, and projections include the following:

Sources of Market Data on Machine Learning:

2018 Outlook: Machine Learning and Artificial Intelligence, A Survey of 1,600+ Data Professionals, November 27, 2017 (110 pp., PDF, no opt-in)

Artificial intelligence and machine learning in financial services: Market developments and financial stability implications, Financial Stability Board.

Gartner’s Top 10 Strategic Technology Trends for 2017

AI and machine learning have reached a critical tipping point and will increasingly augment and extend virtually every technology-enabled service, thing, or application. Creating intelligent systems that learn, adapt, and potentially act autonomously, rather than simply executing predefined instructions, is the primary battleground for technology vendors through at least 2020.

However, intelligent apps are not limited to new digital assistants: every existing software category, from security tooling to enterprise applications such as marketing or ERP, will be infused with AI-enabled capabilities. Using AI, technology providers will focus on three areas: advanced analytics; AI-powered and increasingly autonomous business processes; and AI-powered immersive, conversational, and continuous interfaces.

The lines between the digital and physical worlds continue to blur, creating new opportunities for digital businesses. Look for the digital world to become an increasingly detailed reflection of the physical world, and for the digital to appear as part of the physical, creating fertile ground for new business models and digitally enabled ecosystems.

AR, which enables a blending of the real and virtual worlds, means businesses can overlay graphics onto real-world objects, such as hidden wires on the image of a wall. Immersive experiences with AR and VR are reaching tipping points in terms of price and capability but will not replace other interface models. Over time, AR and VR will expand beyond visual immersion to include all human senses. Enterprises should look for targeted applications of VR and AR through 2020.

Digital twins of physical assets, combined with digital representations of facilities and environments as well as people, businesses, and processes, will enable an increasingly detailed digital representation of the real world for simulation, analysis, and control. Their proliferation will require a cultural change, as those who understand the maintenance of real-world things collaborate with data scientists and IT professionals.

Blockchain is a type of distributed ledger in which value-exchange transactions (in bitcoin or another token) are sequentially grouped into blocks. Blockchain and distributed-ledger concepts are gaining traction because they hold the promise of transforming operating models in industries such as music distribution, identity verification, and title registry.

They promise a model to add trust to untrusted environments and to reduce business friction by providing transparent access to the information in the chain. While there is a great deal of interest, the majority of blockchain initiatives are still in alpha or beta phases, and significant technology challenges exist.
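
To make "sequentially grouped into blocks" concrete, here is a toy sketch of the underlying data structure in Python; it illustrates only the hash-linking of blocks, with no network, consensus, or mining:

```python
# Toy illustration of a blockchain's core data structure: blocks of
# transactions linked by cryptographic hashes. No networking or consensus.
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    # Hash the block's contents deterministically.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(transactions: list, prev_hash: str) -> dict:
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,  # links this block to its predecessor
    }

# Build a small chain: each block commits to the hash of the one before it.
genesis = make_block([], prev_hash="0" * 64)
chain = [genesis]
for txs in (["alice->bob: 5"], ["bob->carol: 2", "carol->alice: 1"]):
    chain.append(make_block(txs, prev_hash=block_hash(chain[-1])))

# Tampering with any earlier block breaks every later prev_hash link.
assert all(
    chain[i]["prev_hash"] == block_hash(chain[i - 1]) for i in range(1, len(chain))
)
print("chain of", len(chain), "blocks verified")
```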

The mesh app and service architecture (MASA) is a multichannel solution architecture that leverages cloud and serverless computing, containers, and microservices, as well as APIs and events, to deliver modular, flexible, and dynamic solutions. Solutions ultimately support multiple users in multiple roles, using multiple devices and communicating over multiple networks.
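
As a rough illustration of the API-centric style MASA composes, the sketch below exposes one tiny, independently deployable service over HTTP (Flask and the endpoint are assumptions made for the example, not part of Gartner's definition):

```python
# A tiny, independently deployable service of the kind a mesh app and
# service architecture composes. Flask and the endpoint are illustrative.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/api/orders/<int:order_id>")
def get_order(order_id: int):
    # In a real MASA solution, many such small services would be combined
    # behind APIs and event streams to serve multiple devices and channels.
    return jsonify({"id": order_id, "status": "shipped"})

if __name__ == "__main__":
    app.run(port=8080)
```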

Machine Learning

Supervised learning algorithms are trained using labeled examples: inputs for which the desired output is known.

The learning algorithm receives a set of inputs along with the corresponding correct outputs, and it learns by comparing its actual output with the correct outputs to find errors.

Through methods like classification, regression, prediction and gradient boosting, supervised learning uses patterns to predict the values of the label on additional unlabeled data.
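
As a minimal sketch of that workflow (using scikit-learn and its bundled iris dataset purely for illustration):

```python
# Supervised learning in miniature: fit on labeled examples, then check
# predictions against known correct outputs on held-out data.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)  # inputs X with known labels y
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Compare actual outputs with correct outputs to measure error.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```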

Unsupervised learning, by contrast, works on data that has no historical labels. Popular techniques include self-organizing maps, nearest-neighbor mapping, k-means clustering, and singular value decomposition.
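
A correspondingly minimal unsupervised sketch, clustering the same data with the labels withheld (the choice of three clusters is an assumption made for the example):

```python
# Unsupervised learning in miniature: k-means groups unlabeled points.
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris

X, _ = load_iris(return_X_y=True)  # labels deliberately ignored

# k=3 is an assumption; in practice k is chosen via diagnostics
# such as the elbow method or silhouette scores.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", [list(kmeans.labels_).count(c) for c in range(3)])
```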

Machine Learning and Data Are Fueling a New Kind of Car

Intel’s proposed $15.3 billion acquisition of Mobileye, an Israeli company that supplies carmakers with computer-vision technology and advanced driver assistance systems, offers a chance to measure the scale of this rebuild.

While the price tag might seem steep, especially with so many players in automated driving today, Mobileye has some key technological strengths and strategic advantages.

Mobileye uses a single camera, together with a proprietary computer chip and some clever software, to provide various advanced driver assistance features.

David Keith, a professor at MIT’s Sloan School of Management who studies technology adoption in the automotive industry, says besides offering a simple, low-cost solution, Mobileye has amassed a huge amount of data—something that is vital to the machine learning that underpins automated driving today.

Meanwhile, Intel has seen its position of dominance eroded in recent years as desktop and laptop computers fade in importance, and as different types of computer chips have become more popular.

Keith adds that Intel will aim to use its hardware expertise to develop the increasingly sophisticated fusion systems (combining cameras, radar, and possibly laser sensing, or lidar) needed to bring fully automated vehicles to market.

If your car is capable of identifying a road sign or a pedestrian on the road ahead, there’s a good chance it already uses one of Mobileye’s chips for the task.

Mobileye researchers have explained how the company is now using reinforcement learning, a technique inspired by the way animals learn through experience, to teach computers how to drive safely in complex and subtle situations (see “10 Breakthrough Technologies 2017: Reinforcement Learning”).
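
To give a flavor of the technique (a generic tabular Q-learning toy, not Mobileye's system), the agent below learns to traverse a tiny corridor purely from trial-and-error rewards:

```python
# Generic tabular Q-learning toy (not Mobileye's system): an agent learns
# to walk right along a 1-D corridor purely from rewarded experience.
import random

N_STATES = 5            # positions 0..4; reaching state 4 ends the episode
ACTIONS = (-1, +1)      # step left or right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}

alpha, gamma, epsilon = 0.5, 0.9, 0.1
random.seed(0)

for _ in range(500):
    s = 0
    while s != N_STATES - 1:
        # Epsilon-greedy: mostly exploit learned values, sometimes explore.
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: q[(s, act)])
        s_next = min(max(s + a, 0), N_STATES - 1)
        reward = 1.0 if s_next == N_STATES - 1 else -0.01
        # Standard Q-learning update: move toward reward plus discounted
        # value of the best action available in the next state.
        best_next = max(q[(s_next, b)] for b in ACTIONS)
        q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
        s = s_next

print("learned action at each state:",
      [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)])
```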

Machine learning and the five vectors of progress

Though nearly every industry is finding applications for machine learning—the artificial intelligence technology that feeds on data to automatically discover patterns and anomalies and make predictions—most companies are not yet taking advantage.

The reality is that as much as 80 percent of the work on which data scientists spend their time can be fully or partially automated.[12] This work might include data wrangling: preprocessing and normalizing data, filling in missing values, for instance, or determining whether to interpret the data in a column as a number or a date.
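
For a concrete sense of those wrangling steps, here is a small pandas sketch (the sample data is invented):

```python
# The kinds of repetitive wrangling steps that automation targets,
# shown manually with pandas. The sample data is invented.
import pandas as pd

raw = pd.DataFrame({
    "signup": ["2017-01-03", "2017-02-14", "not a date", "2017-03-01"],
    "spend": ["120.5", None, "98.0", "143.25"],
})

# Decide whether a column should be parsed as a date or a number;
# unparseable entries become NaT/NaN instead of raising errors.
raw["signup"] = pd.to_datetime(raw["signup"], errors="coerce")
raw["spend"] = pd.to_numeric(raw["spend"], errors="coerce")

# Fill missing values and normalize the numeric column to [0, 1].
raw["spend"] = raw["spend"].fillna(raw["spend"].median())
raw["spend_norm"] = (raw["spend"] - raw["spend"].min()) / (
    raw["spend"].max() - raw["spend"].min()
)

print(raw)
```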

A growing number of tools and techniques for data science automation, some offered by established companies and others by venture-backed start-ups, can help reduce the time required to execute a machine learning proof of concept from months to days.[14] And automating data science means augmenting data scientists’ productivity, so even in the face of severe talent shortages, enterprises that employ data science automation technologies should be able to significantly expand their machine learning activities.

But now, semiconductor and computer manufacturers, both established companies and start-ups, are developing specialized processors such as graphics processing units (GPUs), field-programmable gate arrays (FPGAs), and application-specific integrated circuits (ASICs) to slash the time required to train machine learning models, by accelerating the calculations and by speeding the transfer of data within the chip.

Had the team used only CPUs instead, according to one of the researchers, it would have taken five years.[20] Google stated that its own AI chip, the Tensor Processing Unit (TPU), incorporated into a computing system that also includes CPUs and GPUs, provided such a performance boost that it helped avoid the cost of building a dozen extra data centers.[21] Early adopters of these specialized AI chips include major technology vendors and research institutions in data science and machine learning, but adoption is spreading to sectors such as retail, financial services, and telecom.
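
From the developer's side, tapping that acceleration is often a one-line change; the generic PyTorch sketch below (unrelated to the specific systems cited above) runs the same training step on a GPU when one is available:

```python
# Minimal sketch of offloading training math to an accelerator with PyTorch.
# The model and data are trivial placeholders; the point is the .to(device).
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1000, 10).to(device)      # weights live on the GPU
x = torch.randn(512, 1000, device=device)         # batch allocated on the GPU
y = torch.randint(0, 10, (512,), device=device)

optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = torch.nn.CrossEntropyLoss()

loss = loss_fn(model(x), y)   # forward pass runs on the accelerator
loss.backward()               # so does the gradient computation
optimizer.step()
print(f"one training step on {device}: loss={loss.item():.3f}")
```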

For example, the US banking industry adheres to SR 11-7, guidance published by the Federal Reserve, which among other things requires that model behavior be explained.[23] But techniques are emerging that help shine light inside the black box of certain machine learning models, making them more interpretable and accurate.

MIT researchers, for instance, have demonstrated a method of training a neural network that delivers both accurate predictions and the rationales for those predictions.[24] Some of these techniques are already appearing in commercial data science products.[25] As it becomes possible to build interpretable machine learning models, companies in highly regulated industries such as financial services, life sciences, and health care will find attractive opportunities to use machine learning.
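
One widely available, model-agnostic example of such a technique is permutation importance, sketched below with scikit-learn (the dataset and model are illustrative, and this is not the MIT rationale method cited above):

```python
# Permutation importance: a simple, model-agnostic way to peek inside a
# "black box" model. Not the MIT rationale technique cited above.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

data = load_breast_cancer()
X_train, X_test, y_train, y_test = train_test_split(
    data.data, data.target, random_state=0
)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Shuffle each feature in turn and measure how much accuracy drops:
# large drops indicate features the model genuinely relies on.
result = permutation_importance(model, X_test, y_test, n_repeats=10,
                                random_state=0)
top = result.importances_mean.argsort()[::-1][:3]
for i in top:
    print(f"{data.feature_names[i]}: {result.importances_mean[i]:.3f}")
```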

Advances in both software and hardware are making it increasingly viable to use the technology on mobile devices and smart sensors.[27] On the software side, technology vendors such as Apple Inc., Facebook, Google, and Microsoft are creating compact machine learning models that require relatively little memory but can still handle tasks such as image recognition and language translation on mobile devices.[28] Microsoft Research Lab’s compression efforts resulted in models that were 10 to 100 times smaller.[29] On the hardware end, semiconductor vendors such as Intel, Nvidia, and Qualcomm, as well as Google and Microsoft, have developed or are developing their own power-efficient AI chips to bring machine learning to mobile devices.[30] The emergence of mobile devices as a machine learning platform is expanding the number of potential applications of the technology and inducing companies to develop applications in areas such as smart homes and cities, autonomous vehicles, wearable technology, and the industrial Internet of Things.
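
On the software side, this workflow usually ends with converting a trained model into a compact on-device format; here is a sketch using TensorFlow Lite, with a tiny Keras model standing in for a real image or language model:

```python
# Sketch of shrinking a trained model for on-device use with TensorFlow Lite.
# The tiny Keras model is a stand-in for a real image or language model.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(224, 224, 3)),
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation="softmax"),
])

converter = tf.lite.TFLiteConverter.from_keras_model(model)
# Default optimizations enable post-training quantization, trading a little
# accuracy for a much smaller, faster model suited to phones and sensors.
converter.optimizations = [tf.lite.Optimize.DEFAULT]
tflite_model = converter.convert()

with open("model.tflite", "wb") as f:
    f.write(tflite_model)
print(f"on-device model size: {len(tflite_model)} bytes")
```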

What is machine learning and how to learn it?

Machine learning means giving training data to a program so that it produces better results on complex problems. It is very close to data ...

Why Machine Learning is The Future? | Sundar Pichai Talks About Machine Learning

Sundar Pichai: How machine learning & deep learning improved technologies

Machine learning is a type of artificial intelligence (AI) that provides computers with the ability to learn without being explicitly programmed. Machine learning ...

10 Machine Learning based Products You MUST See

Cozmo - NVIDIA AI Car - Moley Robot - Sawyer Robot

Best Laptop for Machine Learning

What kind of laptop should you get if you want to do machine learning? There are a lot of options out there, and in this video I'll describe the components of an ...

Machine Learning APIs by Example (Google I/O '17)

Find out how you can make use of Google's machine learning expertise to power your applications. Google Cloud Platform (GCP) offers five APIs that provide ...
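
For instance, the Cloud Vision API can be called in a few lines of Python; the sketch below assumes the google-cloud-vision client library is installed and GCP credentials are configured:

```python
# Calling one of GCP's machine learning APIs (Cloud Vision) from Python.
# Assumes `pip install google-cloud-vision` and configured credentials.
from google.cloud import vision

client = vision.ImageAnnotatorClient()

with open("photo.jpg", "rb") as f:
    image = vision.Image(content=f.read())

# Ask the pre-trained model to label what it sees in the image.
response = client.label_detection(image=image)
for label in response.label_annotations:
    print(f"{label.description}: {label.score:.2f}")
```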

Transform Retail with Machine Learning: Find & Recommend products (Google Cloud Next '17)

Businesses today are realizing that they can use machine learning to improve customer experience. With the most recent models, you can simplify product ...

Data Science, Machine Learning and AI - Learn Differences and Career Options

On-device machine learning: TensorFlow on Android (Google Cloud Next '17)

In this video, Yufeng Guo applies deep learning models to local prediction on mobile devices. Yufeng shows you how to use TensorFlow to implement a ...

Artificial Intelligence Vs Machine Learning Vs Data science Vs Deep learning
