Leveraging Machine Learning and Artificial Intelligence for 5G
The heterogeneous nature of future wireless networks, comprising multiple access networks, frequency bands, and cells, all with overlapping coverage areas, presents wireless operators with network planning and deployment challenges.
ML and AI can assist in finding the best beam by considering the instantaneous values of the relevant parameters, updated at each UE measurement. Once the UE identifies the best beam, it can start the random-access procedure to connect to the beam using timing and angular information.
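As an intuition-building sketch (not any 3GPP-specified algorithm), best-beam selection can be reduced to picking the beam with the strongest reported signal; the beam IDs and RSRP values below are hypothetical:

```python
# Toy sketch of best-beam selection from UE measurements. The beam IDs and
# RSRP values (reference signal received power, in dBm) are made up, not
# taken from any real UE measurement report.

def select_best_beam(measurements):
    """Return the beam ID with the highest reported RSRP."""
    return max(measurements, key=measurements.get)

# Hypothetical per-beam RSRP measurements reported by one UE:
ue_report = {"beam_0": -95.2, "beam_1": -88.7, "beam_2": -101.4}
best = select_best_beam(ue_report)  # "beam_1"
```

A learned policy would replace the bare `max` with a model that also weighs measurement history and UE mobility, but the selection step has this shape.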
Massive simply refers to the large number of antennas (32 or more logical antenna ports) in the base station antenna array. Massive MIMO enhances user experience by significantly increasing throughput, network capacity, and coverage while reducing interference. The weights applied to the antenna elements of a massive MIMO 5G cell site are critical for maximizing the beamforming effect.
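For intuition about what these weights do, here is a sketch of one simple weighting scheme, conjugate (matched-filter) beamforming; the 4-element channel vector is invented, and real massive MIMO arrays use 32+ logical ports with estimated channel state information:

```python
import numpy as np

# Illustrative sketch of conjugate (matched-filter) beamforming weights.
# The 4-element channel vector below is hypothetical; a real massive MIMO
# array has 32+ logical antenna ports and estimates the channel per user.

def conjugate_beamforming_weights(h):
    """Unit-norm weights that phase-align the array with the user's channel h."""
    return np.conj(h) / np.linalg.norm(h)

h = np.array([1 + 1j, 0.5 - 0.2j, -0.3 + 0.8j, 0.9 + 0.1j])  # hypothetical channel
w = conjugate_beamforming_weights(h)
gain = abs(w @ h)  # coherent combining gain; equals ||h|| for a matched filter
```

Matched-filter weights maximize received power for a single user; multi-user schemes such as zero-forcing trade some of that gain for interference suppression between users.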
ML and AI can collect real-time information for multidimensional analysis and construct a panoramic data map of each network slice. Future heterogeneous wireless networks will be implemented with varied technologies addressing different use cases, providing connectivity to millions of users simultaneously, requiring customization per slice and per service, and involving large numbers of KPIs to maintain. ML and AI will therefore be an essential methodology for wireless operators to adopt in the near future.
All of them address low-latency use cases where the sensing and processing of data is time-sensitive. These use cases include self-driving autonomous vehicles, time-critical industrial automation, and remote healthcare. 5G offers ultra-reliable low-latency communication, with latency roughly one-tenth that of 4G. However, to achieve even lower latencies and to enable event-driven analysis, real-time processing, and decision making, a paradigm shift is needed from the current centralized, virtualized cloud-based AI toward a distributed AI architecture in which decision-making intelligence sits closer to the edge of 5G networks.
The 5G mm-wave small cells require dense fiber networks, and the cable industry is ideally placed to backhaul these small cells because its existing fiber infrastructure already penetrates deep into the access network, close to end-user premises.
It delivers specific answers to your queries while also serving up the entire document and supporting links, allowing your employees and customers to make informed decisions with confidence.
Traditional search tools can’t understand the nuances of phrases and acronyms in your industry or accurately search through your complex documents in a timely manner. Watson Discovery’s cloud search solves these challenges.
Artificial intelligence in healthcare
Artificial intelligence (AI) in healthcare is the use of complex algorithms and software to emulate human cognition in the analysis of complicated medical data.
What distinguishes AI technology from traditional technologies in health care is the ability to gain information, process it and give a well-defined output to the end-user.
AI algorithms behave differently from humans in two ways: (1) algorithms are literal: once you set a goal, the algorithm cannot adjust itself and understands only what it has been told explicitly; and (2) algorithms are black boxes.
AI programs have been developed and applied to practices such as diagnosis processes, treatment protocol development, drug development, personalized medicine, and patient monitoring and care.
AI has also been applied to support operational initiatives that increase cost savings, improve patient satisfaction, and satisfy staffing and workforce needs, and to tools that help healthcare managers improve business operations by increasing utilization, decreasing patient boarding, reducing length of stay, and optimizing staffing levels.
During this time, there was a recognition by researchers and developers that AI systems in healthcare must be designed to accommodate the absence of perfect data and build on the expertise of physicians.
The ability of AI to interpret radiology imaging results may aid clinicians in detecting minute changes in an image that a clinician might otherwise miss.
A study at Stanford created an algorithm that could detect pneumonia, at that specific site and in those patients involved, with a better average F1 score (a statistical metric based on precision and recall) than the radiologists involved in the trial.
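For reference, the F1 score is the harmonic mean of precision and recall; the confusion-matrix counts in this sketch are illustrative, not data from the Stanford study:

```python
# F1 = harmonic mean of precision and recall, computed from illustrative
# confusion-matrix counts (true positives, false positives, false negatives).

def f1_score(tp, fp, fn):
    precision = tp / (tp + fp)  # fraction of positive calls that were correct
    recall = tp / (tp + fn)     # fraction of actual cases that were found
    return 2 * precision * recall / (precision + recall)

# e.g. 80 true positives, 20 false positives, 10 false negatives:
print(round(f1_score(80, 20, 10), 3))  # 0.842
```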
The emergence of AI technology in radiology is perceived as a threat by some specialists, because the technology can outperform specialists on certain statistical metrics in isolated cases.
Recent advances have suggested the use of AI to describe and evaluate the outcome of maxillo-facial surgery or the assessment of cleft palate therapy in regard to facial attractiveness or age appearance.
In 2018, a paper published in the journal Annals of Oncology mentioned that skin cancer could be detected more accurately by an artificial intelligence system (which used a deep learning convolutional neural network) than by dermatologists.
On average, the human dermatologists accurately detected 86.6% of skin cancers from the images, compared with 95% for the CNN.
One study conducted by the Centerstone research institute found that predictive modeling of EHR data has achieved 70–72% accuracy in predicting individualized treatment response at baseline.
To address the difficulty of tracking all known or suspected drug-drug interactions, machine learning algorithms have been created to extract information on interacting drugs and their possible effects from medical literature.
Efforts were consolidated in 2013 in the DDIExtraction Challenge, in which a team of researchers at Carlos III University assembled a corpus of literature on drug-drug interactions to form a standardized test for such algorithms.
Other algorithms identify drug-drug interactions from patterns in user-generated content, especially electronic health records and/or adverse event reports.
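As a toy sketch in the spirit of such extraction systems (real DDIExtraction entries use trained sequence models, not a single regex; the sentences and the verb pattern here are invented):

```python
import re

# Toy literature-mining sketch: a hand-written pattern pulls candidate
# (drug, drug) pairs from sentences. The corpus and interaction verbs are
# made up; real extraction systems learn these patterns from labeled data.

PATTERN = re.compile(r"(\w+) (?:interacts with|potentiates|inhibits) (\w+)")

def extract_interactions(sentences):
    """Return (drug_a, drug_b) pairs matched in the lowercased sentences."""
    pairs = []
    for s in sentences:
        pairs.extend(PATTERN.findall(s.lower()))
    return pairs

corpus = [
    "Warfarin interacts with aspirin, raising bleeding risk.",
    "Ritonavir inhibits midazolam metabolism.",
]
print(extract_interactions(corpus))  # [('warfarin', 'aspirin'), ('ritonavir', 'midazolam')]
```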
One motive for large health companies merging with other health companies is greater health data accessibility.
A second project with the NHS involves analysis of medical images collected from NHS patients to develop computer vision algorithms to detect cancerous tissues.
Intel's venture capital arm Intel Capital recently invested in startup Lumiata which uses AI to identify at-risk patients and develop care options.
A team associated with the University of Arizona and backed by BPU Holdings began collaborating on a practical tool to monitor anxiety and delirium in hospital patients, particularly those with dementia.
The AI utilized in the new technology – Senior's Virtual Assistant – goes a step beyond and is programmed to simulate and understand human emotions (artificial emotional intelligence).
Doctors working on the project have suggested that in addition to judging emotional states, the application can be used to provide companionship to patients in the form of small talk, soothing music, and even lighting adjustments to control anxiety.
Virtual nursing assistants are predicted to become more common; these will use AI to answer patients’ questions and help reduce unnecessary hospital visits.
Overall, as Quan-Haase (2018) says, technology “extends to the accomplishment of societal goals, including higher levels of security, better means of communication over time and space, improved health care, and increased autonomy” (p. 43).
While research on the use of AI in healthcare aims to validate its efficacy in improving patient outcomes before broader adoption, its use may nonetheless introduce new types of risk for patients and healthcare providers, such as algorithmic bias, do-not-resuscitate implications, and other machine-morality issues.
“We already have some scientists who know artificial intelligence and machine learning, but we want complementary people who can look forward and see how this technology will evolve.”
As of November 2018, eight use cases are being benchmarked, including assessing breast cancer risk from histopathological imagery, guiding anti-venom selection from snake images, and diagnosing skin lesions.
Over two days, the Healthcare Track at the Ai4 2020 Conference brings together business leaders and data practitioners to facilitate the adoption of artificial intelligence and machine learning technologies.
Everyday Examples of Artificial Intelligence and Machine Learning
With all the excitement and hype about AI that’s “just around the corner”—self-driving cars, instant machine translation, etc.—it can be difficult to see how AI is affecting the lives of regular people from moment to moment. What are examples of artificial intelligence that you’re already using—right now?
You’ve also likely used AI on your way to work, communicating online with friends, searching on the web, and making online purchases. We distinguish between AI and machine learning (ML) throughout this article when appropriate.
According to a 2015 report by the Texas Transportation Institute at Texas A&M University, commute times in the US have been steadily climbing year-over-year, resulting in 42 hours of rush-hour traffic delay per commuter in 2014—more than a full work week per year, with an estimated $160 billion in lost productivity.
driving to a train station, riding the train to the optimal stop, and then walking or using a ride-share service from that stop to the final destination), not to mention the expected and the unexpected: construction;
Jeff Schneider, engineering lead for Uber ATC, discussed in an NPR interview how the company uses ML to predict rider demand to ensure that “surge pricing” (short periods of sharp price increases to decrease rider demand and increase driver supply) will soon no longer be necessary.
Glimpse into the future
In the future, AI will shorten your commute even further via self-driving cars that result in up to 90% fewer accidents, more efficient ride sharing to reduce the number of cars on the road by up to 75%, and smart traffic lights that reduce wait times by 40% and overall travel time by 26% in a pilot study.
Simple rules-based filters (e.g., “filter out messages with the words ‘online pharmacy’ and ‘Nigerian prince’ that come from unknown addresses”) aren’t effective against spam, because spammers can quickly update their messages to work around them.
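The contrast with a learned filter can be sketched in a few lines: token statistics are re-estimated from labeled examples, so the filter adapts when spammers change wording. The training messages and the crude log-odds-style score below are invented for illustration, not Gmail's method:

```python
from collections import Counter

# Minimal sketch of a learned spam filter, in contrast to static rules:
# per-class token counts come from labeled examples, so retraining on new
# spam adapts the filter automatically. Training data is made up.

def train(messages):
    """messages: list of (text, is_spam). Returns per-class token counts."""
    counts = {True: Counter(), False: Counter()}
    for text, is_spam in messages:
        counts[is_spam].update(text.lower().split())
    return counts

def spam_score(counts, text):
    """Crude per-token score: positive means more spam-like overall."""
    score = 0.0
    for tok in text.lower().split():
        spam = counts[True][tok] + 1   # +1 is Laplace smoothing
        ham = counts[False][tok] + 1
        score += (spam - ham) / (spam + ham)
    return score

counts = train([
    ("cheap pills online pharmacy", True),
    ("prince needs your bank details", True),
    ("lunch meeting moved to noon", False),
    ("quarterly report attached", False),
])
```

With this tiny corpus, "online pharmacy deals" scores positive and "meeting report" scores negative; a production filter would use a proper probabilistic model over far richer features.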
In a research paper titled, “The Learning Behind Gmail Priority Inbox”, Google outlines its machine learning approach and notes “a huge variation between user preferences for volume of important mail…Thus, we need some manual intervention from users to tune their threshold.”
The researchers tested the effectiveness of Priority Inbox on Google employees and found that those with Priority Inbox “spent 6% less time reading email overall, and 13% less time reading unimportant email.”

Glimpse into the future
Can your inbox reply to emails for you?
Smart Reply uses machine learning to automatically suggest three different brief (but customized) responses to answer the email. As of early 2016, 10% of mobile Inbox users’ emails were sent via Smart Reply.
A brute force search comparing every string of text to every other string of text in a document database will have a high accuracy, but be far too computationally expensive to use in practice. One MIT paper highlights the possibility of using machine learning to optimize this algorithm.
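The cost the brute-force approach pays can be sketched directly: every document is scored against the query, so work grows linearly with the database, and all-pairs matching grows quadratically. The documents and the word-set similarity below are invented; the learned approach in the MIT paper would instead narrow the candidate set with a model:

```python
# Brute-force text search sketch: score the query against every document.
# Cost is O(n) per query and O(n^2) for all-pairs matching, which is the
# expense the ML optimization described above aims to avoid. Docs are made up.

def jaccard(a, b):
    """Word-set overlap between two strings (1.0 = identical vocabularies)."""
    sa, sb = set(a.lower().split()), set(b.lower().split())
    return len(sa & sb) / len(sa | sb)

def brute_force_search(query, docs):
    """Exhaustive linear scan over every document in the database."""
    return max(docs, key=lambda d: jaccard(query, d))

docs = ["the cat sat on the mat", "stock prices fell sharply", "a cat on a mat"]
print(brute_force_search("cat mat", docs))  # "a cat on a mat"
```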
– Credit Decisions
Whenever you apply for a loan or credit card, the financial institution must quickly determine whether to accept your application and, if so, what specific terms (interest rate, credit line amount, etc.) to offer. FICO uses ML both in developing your FICO score, which most banks use to make credit decisions, and in determining the specific risk assessment for individual customers.
In early 2016, Wealthfront announced it was taking an AI-first approach, promising “an advice engine rooted in artificial intelligence and modern APIs, an engine that we believe will deliver more relevant and personalized advice than ever before.” While there is no data on the long-term performance of robo-advisors (Betterment was founded in 2008, Wealthfront in 2011), they will become the norm for regular people looking to invest their savings.
In a short video highlighting their AI research (below), Facebook discusses the use of artificial neural networks—ML algorithms that mimic the structure of the human brain—to power facial recognition software.
In June 2016, Facebook announced a new AI initiative: DeepText, a text understanding engine that, the company claims “can understand with near-human accuracy the textual content of several thousand posts per second, spanning more than 20 languages.” DeepText is used in Facebook Messenger to detect intent—for instance, by allowing you to hail an Uber from within the app when you message “I need a ride” but not when you say, “I like to ride donkeys.” DeepText is also used for automating the removal of spam, helping popular public figures sort through the millions of comments on their posts to see those most relevant, identify for sale posts automatically and extract relevant information, and identify and surface content in which you might be interested.
– Pinterest
Pinterest uses computer vision, an application of AI in which computers are taught to “see,” to automatically identify objects in images (or “pins”) and then recommend visually similar pins. Other applications of machine learning at Pinterest include spam prevention, search and discovery, ad performance and monetization, and email marketing.
– Instagram
Instagram, which Facebook acquired in 2012, uses machine learning to identify the contextual meaning of emoji, which have been steadily replacing slang (for instance, a laughing emoji could replace “lol”).
This may seem like a trivial application of AI, but Instagram has seen a massive increase in emoji use among all demographics, and being able to interpret and analyze it at large scale via this emoji-to-text translation sets the basis for further analysis on how people use Instagram.
A few months later, it opened its messenger platform to developers, allowing anyone to build a chatbot and integrate Wit.ai’s bot training capability to more easily create conversational bots.
– Recommendations
You see recommendations for products you’re interested in as “customers who viewed this item also viewed” and “customers who bought this item also bought”, as well as via personalized recommendations on the home page, at the bottom of item pages, and through email.
While Amazon doesn’t reveal what proportion of its sales come from recommendations, research has shown that recommenders increase sales (in this linked study, by 5.9%, but in other studies recommenders have shown up to a 30% increase in sales) and that a product recommendation carries the same sales weight as a two-star increase in average rating (on a five-star scale).
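The item-to-item idea behind "customers who bought this also bought" can be sketched as co-occurrence counting over purchase baskets. The baskets below are made up, and Amazon's production recommender is far more sophisticated than raw pair counts:

```python
from collections import defaultdict
from itertools import combinations

# Toy item-to-item recommender: count how often pairs of items appear in
# the same purchase basket, then recommend the most frequent co-purchases.
# Baskets are invented; real recommenders also weigh recency, ratings, etc.

def build_cooccurrence(baskets):
    """Symmetric pair counts: co[a][b] = number of baskets containing both."""
    co = defaultdict(lambda: defaultdict(int))
    for basket in baskets:
        for a, b in combinations(set(basket), 2):
            co[a][b] += 1
            co[b][a] += 1
    return co

def also_bought(co, item, k=2):
    """Top-k items most often purchased together with `item`."""
    return sorted(co[item], key=co[item].get, reverse=True)[:k]

baskets = [
    ["camera", "sd_card", "tripod"],
    ["camera", "sd_card"],
    ["camera", "sd_card"],
    ["laptop", "mouse"],
]
print(also_bought(build_cooccurrence(baskets), "camera"))  # ['sd_card', 'tripod']
```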
Square, a credit card processor popular among small businesses, charges 2.75% for card-present transactions, compared to 3.5% + 15 cents for card-absent transactions.
By utilizing AI that can learn your purchasing habits, credit card processors minimize the probability of falsely declining your card while maximizing the probability of preventing somebody else from fraudulently charging it.
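One crude version of "learning your purchasing habits" is an outlier test against your own spending history. The amounts and the 3-sigma threshold below are illustrative only; real processors model far richer features (merchant, location, timing) than transaction size:

```python
import statistics

# Minimal habit-based fraud-flagging sketch: flag a charge that sits far
# outside the cardholder's historical spending distribution. Amounts and
# the 3-standard-deviation threshold are illustrative, not a real system.

def is_suspicious(history, amount, threshold=3.0):
    """Flag a charge more than `threshold` standard deviations above the mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    return (amount - mean) / stdev > threshold

history = [12.50, 8.99, 23.40, 15.00, 9.75, 18.20]  # hypothetical past charges
print(is_suspicious(history, 14.00))   # False: a typical purchase
print(is_suspicious(history, 950.00))  # True: far outside normal habits
```

Thresholding on a single statistic trades false declines against missed fraud exactly as the paragraph above describes; learned models tune that trade-off per cardholder.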
We may soon see retailers take it one step further and design your entire experience individually for you. Google already does this with search, even with users who are logged out, so this is well within the realm of possibility for retailers.
However, a month later Amazon’s press release boasted a 9x increase in Echo family sales over the previous year’s holiday sales, suggesting that 5 million sold is a significant underestimate.
For example, casual chess players regularly use AI powered chess engines to analyze their games and practice tactics, and bloggers often use mailing-list services that use ML to optimize reader engagement and open-rates.
- On 17 October 2021
AI in Healthcare: Real-World Machine Learning Use Cases
Levi Thatcher, PhD, VP of Data Science at Health Catalyst will share practical AI use cases and distill the lessons into a framework you can use when evaluating ...
Machine Learning and its Use Cases | HackerEarth Webinar | Machine Learning Case Study
HackerEarth is pleased to announce its next webinar on Machine Learning, to help you learn from the best programmers and domain experts from all over the ...
Top 10 Applications of Machine Learning | Machine Learning Application Examples | Edureka
Machine Learning Training with Python: This "Top 10 Applications of Machine Learning" ..
Machine Learning at Uber (Natural Language Processing Use Cases)
At Uber, we are using natural language processing and conversational AI to improve the user experience. In my talk I will be delving into 2 use cases. In the first ...
How AI is changing Business: A look at the limitless potential of AI | ANIRUDH KALA | TEDxIITBHU
Now a household name in the Indian computer science scene, Anirudh Kala offers us a sneak peek into the mind-boggling commercial potential of Artificial ...
How to Solve Car Parking with Artificial Intelligence | Use Case | Blockchain AI | Fetch.ai
Lead research scientist Marcin Abram explains how Fetch.AI agents detect the availability of parking spaces, enabling drivers to reserve a place to park before ...
Top 10 Applications Of Artificial Intelligence | Artificial Intelligence Applications | Edureka
Machine Learning Masters Program: This Edureka session on Applications Of ..
Practical Use Cases for AI & Machine Learning in Healthcare Organizations
Practical Use Cases for AI & Machine Learning in Healthcare Organizations Artificial intelligence (AI) and machine learning (ML) are effective tools for managing ...
WHAT'S NEXT. | Artificial Intelligence - Use Cases
Marijn Markus and Ron Tolido talk about use cases for AI. Missed the first video of the WHAT'S NEXT. Artificial Intelligence series? ➜
Machine Learning in UiPath RPA || AI apps in UiPath tutorials
This video explains how to develop a machine learning application in UiPath. I have developed a language detector application using Aylien APIs. This is an ...