So what is Machine Learning?

Machine Learning (ML) is everywhere today! If you use any of the apps or websites of Amazon, Google, Uber, Netflix, Waze, or Facebook in your busy daily life, they are all supported internally by machine learning algorithms. Most top companies in every sector, from aviation to oil and gas, banking to retail, e-commerce to transportation, have products and projects using machine learning tools and technologies. Machine learning has given us self-driving cars, practical speech recognition, effective web search, and a vastly improved understanding of the human genome. It is so pervasive today that you probably use it dozens of times a day without knowing it.

Wikipedia defines machine learning as “a subfield of computer science (CS) and artificial intelligence (AI) that deals with the construction and study of systems that can learn from data, rather than follow only explicitly programmed instructions.” According to IBM, machine learning is the science of how computers make sense of data using algorithms and analytic models. According to SAS, machine learning is a method of data analysis that automates analytical model building. Using algorithms that iteratively learn from data, machine learning allows computers to find hidden insights without being explicitly programmed where to look.

According to Stanford, machine learning is the science of getting computers to act without being explicitly programmed. It enables cognitive systems to learn, reason and engage with us in a more natural and personalized way. The most powerful form of machine learning in use today, called “deep learning”, builds a complex mathematical structure called a neural network based on vast quantities of data. Designed to be analogous to how a human brain works, neural networks themselves were first described in the 1940s. But it’s only in the last three or four years that computers have become powerful enough to use them effectively. If deep learning turns out to be as big as the internet, it’s time for everyone to start looking closely at it.

Machine Learning (ML) is a form of AI that gives a computer the ability to learn and essentially teach itself to evolve as it is exposed to new and ever-changing data. The main components of ML software are statistical analysis and predictive analysis. ML algorithms can spot patterns and find hidden insights in data from previous computations without being programmed where to look. They learn from every experience and interaction.

Some common ML techniques are Linear Regression, K-means, Decision Trees, Random Forest, PCA, SVM and, finally, Artificial Neural Networks (ANN). Artificial Neural Networks are where the field of Deep Learning had its genesis.
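To give a flavor of what one of these techniques looks like in practice, here is a minimal sketch of K-means clustering in plain Python. The toy one-dimensional data and the choice of k=2 are illustrative assumptions, not from the text; real projects would typically use a library implementation instead.

```python
import random

def kmeans(points, k, iterations=20):
    """Cluster 1-D points into k groups by alternating assignment and update."""
    centroids = random.sample(points, k)
    for _ in range(iterations):
        # Assignment step: each point joins the cluster of its nearest centroid
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return sorted(centroids)

# Two obvious groups of one-dimensional data
data = [1.0, 1.2, 0.8, 9.8, 10.1, 10.3]
print(kmeans(data, k=2))  # centroids settle near 1.0 and 10.07
```

The same assign-then-update loop generalizes to higher dimensions by swapping the absolute-value distance for a Euclidean one.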

The Internet of Things (IoT) will produce billions and billions of data points from billions of connected devices by 2020. Machine learning can help companies take the billions of data points they have and boil them down to what’s really meaningful. Realizing IoT’s promise depends on machine learning to find the patterns, correlations and anomalies that can enable improvements in almost every facet of our daily lives.
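To make the “boiling down” concrete, here is a minimal sketch of flagging anomalies in a stream of IoT sensor readings. The z-score rule, the 2.5-standard-deviation threshold, and the simulated temperature values are all illustrative assumptions, not anything prescribed by the text.

```python
from statistics import mean, stdev

def find_anomalies(readings, threshold=2.5):
    """Flag readings more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(readings), stdev(readings)
    return [r for r in readings if abs(r - mu) / sigma > threshold]

# Simulated temperature readings from a connected sensor, with one spike
readings = [21.0, 21.2, 20.9, 21.1, 21.0, 35.5, 21.2, 20.8, 21.1, 21.0]
print(find_anomalies(readings))  # → [35.5]
```

Billions of routine readings collapse to the handful of points that actually warrant attention; production systems would use more robust statistics or learned models, but the principle is the same.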

As per HBR, machine learning has tremendous potential to transform companies. Executives who want to get the most out of their companies’ data should understand what it is, what it can do, and what to watch out for when using it. It’s time to let the machines point out where the opportunities truly are! In our next blog, we will explore various use cases and implementations of ML across different industries.

Share with me your use cases of ML that you want to get included in my next blog!


Cognitive 101

“This era will redefine the relationship between man and machine.” – Ginni Rometty, IBM CEO

Cognitive computing is one of the most exciting developments in software technology in the past few years. Conceptually, cognitive computing focuses on enabling software models that simulate the human thought process. Cognitive computing offers fundamental differences in how systems are built and interact with humans. Cognitive-based systems, such as IBM Watson, are able to build knowledge and learn, understand natural language, and reason and interact more naturally with human beings than traditional systems. They are also able to put content into context with confidence-weighted responses and supporting evidence. More specifically, cognitive computing enables capabilities that simulate functions of the human brain such as voice, speech, and vision analysis. From this perspective, cognitive computing is becoming an essential element to enable the next wave of data intelligence for mobile and IoT solutions. Text, vision, and speech are common sources of data used by mobile and IoT solutions. Cognitive systems can quickly identify new patterns and insights. Over time, they will simulate even more closely how the brain actually works. In doing so, they could help us solve the world’s most complex problems by penetrating the complexity of big data and exploiting the power of natural language processing and machine learning.

IBM points to a survey of more than 5,000 C-suite executives by its Institute for Business Value (IBV), which found the following:

  • Sixty-five percent of insurance industry CXOs are pursuing some form of business model innovation, but nearly 30 percent feel the quality, accuracy and completeness of their data is insufficient.
  • Sixty percent of retail executives do not believe their company is equipped to deliver the level of individual experiences consumers demand, and 95 percent say they will invest in cognitive in the next five years.
  • The healthcare industry forecasts a 13 million person gap in qualified healthcare workers by 2035, and more than half of healthcare industry CXOs report that current constraints on their ability to use all available information limit their confidence in making strategic business decisions. Eighty-four percent of these leaders believe cognitive will be a disruptive force in healthcare and 95 percent plan to invest in it over the next five years.

The story was similar across all industries: executives surveyed by the IBV cited scarcity of skills and technical expertise — rather than security, privacy or maturity of the technology — as the primary barriers to cognitive adoption.

The most popular cognitive computing platform in the market, IBM Watson provides a diverse set of APIs to enable capabilities such as vision, speech, text, and data analysis. Watson is now available to developers as part of the Watson Developer Cloud included in Bluemix distributions.


To learn more about Cognitive Computing, IBM’s site is a good resource.

The Big Data Institute – Top Ten 2016 Predictions


Here are The Big Data Institute’s predictions for the trends we see shaping your business in 2016:

1. Customer Digital Experience will take center stage for companies competing to win mindshare and share of wallet. The majority of customer transactions will be initiated through mobile platforms: smartphones, tablets, phablets, wearable devices, etc.

2. Analytics will become a secret weapon for many companies, with Big Data and Internet of Things projects on a rapid rise. Data Management and Advanced Analytics will become more sophisticated yet more business-user friendly.

3. Privacy and Security will continue to be top priorities for companies, as consumer behavior and new laws drive corporations toward compliance requirements.

4. Consumers will start monetizing their own data through various mediums.

5. Leadership ranks and roles will be transformed, with Chief Data Officer, Chief Analytics Officer and Chief Intelligence Officer becoming dominant roles driving IT and business.

6. Artificial Intelligence, Robotics and Cognitive Computing will sit at the top of the hype curve as companies start exploring and piloting AI and cognitive solutions, while Big Data and IoT move toward the plateau.

7. M&A will be on the rise, with several acquisitions in IoT, Digital/Mobile and Security solutions.

8. Cloud will become the new standard and, in many cases, the first-choice platform.

9. Business units making IT purchase decisions (solution, software, and project) will be on the rise. In some companies, IT may be decentralized to address speed and agility requirements.

10. The industry will move more toward pre-packaged/prebuilt solutions, including point solutions that build toward a transformation project, instead of large in-house builds.

Cognitive Computing: From hype and pilots to mainstream implementation

Cognitive Computing started in the late 1960s with initial innovation around supercomputers. However, the decision-making capability, simulation and self-learning intelligence of those systems were limited. With major advances in computing power, cognitive behavioral science and computational intelligence, researchers in Cognitive Computing made significant progress. With advances in medical science and neuropsychology, computer scientists were able to study the mechanics of the human brain, which allowed them to build computational models patterned after the human mind. These models allow cognitive systems to associate various attributes and parameters from past experiences. Cognitive computing and Decision Science were born as scientists developed computers that operated at a higher rate of speed and accuracy than the human brain.

As per IBM, cognitive computing systems learn and interact naturally with people to extend what either humans or machines could do on their own. They help human experts make better decisions by penetrating the complexity of Big Data. Big Data growth is accelerating as more of the world’s activity is expressed digitally. Not only is it increasing in volume, but also in speed, variety and uncertainty. Most data now comes in unstructured forms such as video, images, symbols and natural language, so a new computing model is needed for businesses to process and make sense of it, and to enhance and extend the expertise of humans. Rather than being programmed to anticipate every possible answer or action needed to perform a function or set of tasks, cognitive computing systems are trained using artificial intelligence (AI) and machine learning algorithms to sense, predict, infer and, in some ways, think.

Cognitive computing reassesses the nature of relationships between multiple variables and environmental factors. It can help assess the relationship between people and today’s increasingly pervasive digital environment. Cognitive systems may play the role of assistant or coach for the user, or they may act virtually autonomously in many problem-solving situations. The boundaries of the processes and domains these systems will affect are still elastic and emergent. Such systems:

  • must learn as attributes and variables change, and as questions and decisions evolve;
  • must resolve ambiguity and tolerate unpredictability;
  • must be engineered to feed on dynamic data in real time, or near real time;
  • must interact easily with users so that those users can define their needs comfortably, and may also interact with other processors, devices and cloud services, as well as with people;
  • must aid in defining a problem by asking questions or finding additional source input if a problem statement is ambiguous or incomplete;
  • must “remember” previous interactions in a process and return information that is suitable for the specific application at that point in time;
  • must understand, identify and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, user’s profile, process, task and goal;
  • may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory or sensor-provided).

The output from cognitive systems may be prescriptive, suggestive, instructive or simply entertaining.

Cognitive informatics is a cutting-edge and multi-disciplinary research field that tackles the fundamental problems shared by modern informatics, computation, software engineering, AI, computational intelligence, cybernetics, cognitive science, neuropsychology, medical science, systems science, philosophy, linguistics, economics, management science, and life sciences. The development of, and cross-fertilization between, these science and engineering disciplines have led to a whole range of emerging research areas known collectively as cognitive informatics.

Sample Cognitive Computing Use Cases

  1. Shopping: Cognitive computing systems’ ability to evaluate and generate hypotheses will help the retail industry find patterns, correlations and insights in mountains of unstructured and structured data. Watson’s app development platform is already moving into this physical-virtual space. The startup Fluid has layered Watson on top of its Expert Personal Shopper app for retail brands. Watson will be your personal shopping assistant. Store associates will also have similar intelligent tech providing them instant product information, customer loyalty data, sales histories, user reviews, blogs and magazines, so that when you do need to talk with another human, they know exactly how to help.
  2. Medical: Evidence-based learning, hypothesis generation and natural-language skills could help medical professionals make key decisions in patient diagnosis and treatment. The objective is to give doctors and surgeons a quick way to access diagnostic and treatment options based on the latest research.
  3. Banking: In fraud detection, financial institutions could have cognitive tools that enable them to go beyond analyses of cardholders’ credit transaction histories; cognitive computing might provide them with new “associational” intelligence, such as when an individual is most likely to make purchases, what he or she is likely to buy, and under what circumstances.
  4. Finance: Financial advisors, including individuals who manage their own portfolios, will also see benefits from Cognitive Computing, as the technology brings together relevant, current and personalized information and remembers questions and preferences.
  5. Weather Forecasting and Planning: Weather forecasting can benefit from Cognitive Computing and big data analytics. For instance, IBM’s Deep Thunder, a research project that creates precise, local weather forecasts, can predict severe storms in a specific area up to three days before the event. This early-warning system gives local authorities and residents enough time to make preparations.
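The banking use case above can be sketched as a toy “associational” check: scoring a new transaction against a cardholder’s observed habits. The profile fields, thresholds, and sample history below are illustrative assumptions only; real fraud systems use far richer features and learned models.

```python
from statistics import mean, stdev

def fraud_score(history, txn):
    """Toy associational check: how unusual is this transaction compared
    with the cardholder's past amounts and typical purchase hours?"""
    amounts = [t["amount"] for t in history]
    usual_hours = {t["hour"] for t in history}
    score = 0
    # Unusual amount: far above this cardholder's typical spend
    if txn["amount"] > mean(amounts) + 2 * stdev(amounts):
        score += 1
    # Unusual time: an hour this cardholder has never purchased in
    if txn["hour"] not in usual_hours:
        score += 1
    return score  # 0 = looks normal, 2 = doubly unusual

history = [{"amount": 40, "hour": 12}, {"amount": 55, "hour": 18},
           {"amount": 35, "hour": 13}, {"amount": 60, "hour": 19}]
print(fraud_score(history, {"amount": 900, "hour": 3}))  # → 2
```

The point is the shift in question: not just “is this amount large?” but “is this amount, at this time, unusual for this person?”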

As per Deloitte University Press – Cognitive analytics can help address some key challenges. It can improve prediction accuracy, provide augmentation and scale to human cognition, and allow tasks to be performed more efficiently (and automatically) via context-based suggestions. For organizations that want to improve their ability to sense and respond, cognitive analytics offers a powerful way to bridge the gap between the promise of big data and the reality of practical decision making.

IBM has calculated that the market size for cognitive computing services in the private sector is in the neighborhood of $50 billion. At present, there are very few vendors in the field. While IBM has announced the creation of the Watson Group to commercialize cognitive computing and Google has acquired the AI startup DeepMind, there are few companies in the space. Much of the work is still happening at the university level or within research organizations. Cognitive computing is still early from a commercialization perspective; industry-wide adoption, and its impact on a wide range of companies, is likely another three to five years away. For a while at least, cognitive computing will fill a niche in industries where highly complex decision making is the norm, such as healthcare and financial markets. Eventually, it will become a normal tool in every corporate toolbox, helping to augment human decision making.
