Covid has exposed the limitations of machine learning

Context: The U.S. Labor Department reported that the economy unexpectedly added 2.5 million jobs in May 2020, surprising the many economists and analysts whose Machine Learning (ML) models had forecast millions more job losses.

Background:

  • This is not the first time that ML-based forecasting has failed. 
  • In 2016, sophisticated ML algorithms failed to predict the outcomes of both the Brexit vote and the US presidential election.

About Machine Learning:

  • It represents the idea that a computer, when fed with enough raw data, can begin on its own to see patterns and rules in those numbers. 
  • It can also learn to recognize and categorize new data as it arrives, fitting it into the patterns and rules the program has already created. 
  • As more data is received, the “intelligence” of the computer grows: its patterns and rules become ever more refined and reliable (a minimal sketch of this idea follows below).
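The sketch below illustrates this learn-patterns-from-data idea in a few lines of Python using scikit-learn. The feature names and numbers are invented for illustration; they are not from the article.

```python
# A minimal sketch (not from the article) of the "learn patterns from raw
# data" idea, using scikit-learn. All feature names and numbers are made up.
from sklearn.tree import DecisionTreeClassifier

# Raw data: [hours_of_daylight, temperature_c] -> season label (toy example)
X = [[9, 2], [10, 5], [14, 24], [15, 28], [12, 15], [13, 20]]
y = ["winter", "winter", "summer", "summer", "spring", "summer"]

model = DecisionTreeClassifier(random_state=0)
model.fit(X, y)                   # the program derives its own rules from data

# New, unseen data is slotted into the patterns already learned
print(model.predict([[14, 26]]))  # -> ['summer']

# "More data refines the rules": retraining on an enlarged set updates them
X.append([11, 10]); y.append("spring")
model.fit(X, y)
```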

Rationale behind its popularity in recent years:

  • Massive computational power - It is now available at low cost and can be provisioned in the cloud very quickly. 
    • Improvements in Graphics Processing Unit (GPU) design (now with thousands of cores, ideally suited to parallel workloads) have increased the training speed of deep-learning algorithms.
  • Big Data – There has been an explosion in the amount of data we all create, coupled with near limitless storage capacity. 
    • Large and diverse data sets provide better training material for the algorithms.
  • Algorithms - They are now better at finding patterns in the mountains of data, and AI and machine learning platforms from players such as Google, IBM, and Microsoft are making it much easier to develop applications. 
  • Investment – This is also growing fast, especially with respect to machine learning and deep learning.

Applications of Machine Learning:

  • Load forecasting: It could be used to forecast supply and demand in real time and optimize economic load dispatch (a toy sketch follows this list). 
    • In the UK, Google’s DeepMind has teamed up with National Grid to predict supply and demand peaks and hopes to reduce national energy usage by 10%.
  • Predictive maintenance: ML can be bolstered with drones for asset inspections, replacing time-intensive and risky manual inspections. 
    • The drones are trained using deep learning algorithms to automatically identify defects and predict failures without interrupting operations.
  • Virtual Personal Assistants: Machine learning is an important part of these assistants, as they collect and refine information based on your previous interactions with them. 
    • This data is later used to tailor results to your preferences.
    • Siri, Alexa, and Google Now are popular examples of virtual personal assistants. 
  • Predictions while Commuting: We all use GPS navigation services, but accurately estimating congestion would require a GPS unit in every car, which is not the case. 
    • In such a scenario, machine learning estimates where congestion is likely based on patterns in daily commute data.
  • Social Media Services: From personalizing your news feed to better ad targeting, social media platforms use machine learning for their own benefit and that of users.
    • “People you may know” suggestions and face recognition are two examples.
  • Online Fraud Detection - Machine learning is proving its potential to make cyberspace more secure, and tracking online monetary fraud is one example (a generic sketch follows this list). 
    • For example, PayPal uses ML for protection against money laundering. 
    • The company uses a set of tools that compares millions of transactions and distinguishes legitimate from illegitimate transactions between buyers and sellers.
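For the load-forecasting item above, here is a hedged toy sketch. It is emphatically not the DeepMind/National Grid system; it is a simple linear model on synthetic hourly demand, with every number invented, showing only the general shape of the task (predict demand from time-of-day features).

```python
# Toy load-forecasting sketch (synthetic data; not the DeepMind/National Grid
# system): predict hourly electricity demand from time-of-day features.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
hours = np.arange(24 * 30)            # 30 days of hourly readings
hour_of_day = hours % 24
# Synthetic demand: a daily cycle plus noise (illustrative numbers only)
demand = 50 + 20 * np.sin(2 * np.pi * hour_of_day / 24) + rng.normal(0, 2, hours.size)

# Encode the daily cycle so a linear model can capture it
X = np.column_stack([np.sin(2 * np.pi * hour_of_day / 24),
                     np.cos(2 * np.pi * hour_of_day / 24)])
model = LinearRegression().fit(X, demand)

# Forecast demand for 6 pm
h = 18
x_new = [[np.sin(2 * np.pi * h / 24), np.cos(2 * np.pi * h / 24)]]
print(f"predicted load at {h}:00 -> {model.predict(x_new)[0]:.1f} (arbitrary units)")
```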
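And for the fraud-detection item: PayPal's actual tooling is proprietary, so the sketch below only shows the generic anomaly-detection pattern such systems commonly rest on, fitting a model to normal transaction behaviour and flagging outliers. The features and thresholds are assumptions for illustration.

```python
# Generic anomaly-detection sketch for flagging suspicious transactions.
# PayPal's actual tooling is proprietary; this shows only the common pattern
# of modelling normal behaviour and flagging departures from it.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)
# Features per transaction: [amount_usd, transactions_in_last_hour] (made up)
legit = rng.normal(loc=[50.0, 2.0], scale=[20.0, 1.0], size=(500, 2))
detector = IsolationForest(contamination=0.01, random_state=0).fit(legit)

# Score new transactions: -1 flags an outlier for manual review, 1 passes
new = np.array([[60.0, 2.0],       # ordinary purchase
                [4000.0, 40.0]])   # unusually large and rapid-fire
print(detector.predict(new))       # expected: [ 1 -1 ]
```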

Concerns:

  • Despite the great advances in computing, it is still very difficult to teach computers both human context and basic common sense. 
    • The brute-force approach of Artificial Intelligence (AI) behemoths does not rely on well-codified, common-sense rules. 
    • It relies instead on the raw computing power of machines to sift through thousands upon thousands of potential combinations before selecting the best answer by pattern-matching. 
    • These same algorithms have been guiding business decisions for a while now, especially strategic and other shifts in corporate direction based on consumer behaviour. 
    • In a world where corporations make binary choices (either path X or path Y, but not both), these algorithms still fall short.
  • The pandemic has further exposed their insufficiency, especially in the ML systems of e-commerce retailers that were programmed to make sense of our online behaviour. 
    • During the pandemic, our online behaviour turned volatile in ways ML did not predict, leaving retailers scrambling to stock one item one week and a different item the next.
  • Another issue is the stationarity assumption that ML software makes when predicting the future (illustrated in the sketch after this list). 
    • The paradox is that finding patterns and then using them to make useful predictions is what ML is all about in the first place. 
    • But static assumptions have meant that the data sets used to train ML models included nothing beyond elementary “worst case” information. 
    • They did not anticipate a pandemic.
  • Bias in ML software is another roadblock. 
    • Bias enters through the manner in which an ML solution is framed, the presence of “unknown unknowns” in data sets, and how the data is prepared before it is fed into a computer. 
    • These problems are further compounded by an “echo chamber” created by the finely targeted algorithms these companies use. 
    • This chamber bombards the user with an overload of information that reinforces what the algorithm thinks the searcher needs to know. 
    • For instance, if I search for a particular type of phone on an e-commerce site, future searches are likely to auto-complete with that phone showing up even before I key in my entire search string.
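The sketch below illustrates the stationarity problem referenced above, with entirely synthetic numbers: a model fitted to a stable pre-pandemic trend is blind-sided when the underlying distribution shifts.

```python
# Sketch of the stationarity problem (all numbers synthetic): a model fitted
# on a stable pre-pandemic trend is blind-sided when the distribution shifts.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
weeks = np.arange(100).reshape(-1, 1)                  # "pre-pandemic" weeks
demand = 100 + 0.5 * weeks.ravel() + rng.normal(0, 3, 100)
model = LinearRegression().fit(weeks, demand)          # learns the stable trend

# Shock: demand collapses in week 100, violating the stationarity assumption
actual_week_100 = 40.0                                 # hypothetical observed value
predicted = model.predict([[100]])[0]
print(f"predicted {predicted:.0f}, actual {actual_week_100:.0f}")
# The large gap is exactly the failure mode described above; the model needs
# retraining (and manual supervision) on post-shock data.
```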

Conclusion:

The situation brought about by the covid pandemic is still volatile and fluid. The training data sets, and the computer code produced from them to adjust predictive ML algorithms, are unequal to that volatility. 

They need constant manual supervision and tweaking so that they do not throw themselves and other sophisticated downstream automated processes out of gear. 

Key Terms:

Artificial intelligence: It is the broadest term, having been coined as early as 1955. It refers to the ability of machines to exhibit human-like intelligence.

  • AI encompasses several different technologies and systems of which machine learning is one. 
  • Others include natural language processing, computer vision, and speech recognition.

Machine learning: It refers to the practice of using algorithms to parse large volumes of data, learn from it, detect patterns, and make decisions or predictions based on those patterns.

Deep learning: It is a subset of machine learning, based on neural networks, and is a technique for implementing machine learning that has recently proved highly successful. Again, it depends on massive datasets to “train” itself (a minimal sketch follows).
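As a hedged illustration of deep learning sitting inside machine learning, the sketch below trains a small neural network. scikit-learn's modest MLPClassifier stands in for the far larger networks used in practice, and the dataset is a synthetic toy rather than the massive datasets the definition refers to.

```python
# Minimal neural-network sketch showing "deep learning as a subset of ML".
# scikit-learn's small MLPClassifier stands in for the much larger networks
# used in practice; the dataset is a synthetic toy, not "massive".
from sklearn.datasets import make_moons
from sklearn.neural_network import MLPClassifier

X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
net = MLPClassifier(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
net.fit(X, y)                     # "training" = iteratively adjusting weights
print(f"training accuracy: {net.score(X, y):.2f}")
```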


Source:

https://www.livemint.com/opinion/columns/covid-has-exposed-the-limitations-of-machine-learning-11591633809674.html
