Top 14 Machine Learning Tools for Data Scientists

IBM, Google, Microsoft, Amazon and several other companies are offering Machine Learning tools and APIs as part of their cloud offerings. Last year, Amazon announced open-sourcing its deep learning library, the Deep Scalable Sparse Tensor Network Engine (DSSTNE), which is now available on GitHub. Google followed by opening up its SyntaxNet neural network framework for developers to build applications that can process human language. In October 2016, IBM renamed its Predictive Analytics Services 'Watson Machine Learning', with a focus on providing deeper and more sophisticated self-learning capabilities as well as enhanced model management and deployment functionality within the service.

Machine learning has taken center stage in many advanced analytical and predictive models. If you are a user of Amazon.com or Netflix, you are consuming machine learning recommendations. Similarly, most websites customize and personalize themselves to your taste, behavior and style using machine learning that learns from previous patterns and enables a cognitive system to learn, reason and engage with you in a natural and personalized way.

Machine Learning has entered every industry, from retail, manufacturing, banking, cable, games and sports to media/entertainment and many more. Today's machine learning uses analytic models and algorithms that iteratively learn from data, thus allowing computers to find hidden insights without being explicitly programmed where to look. This means data analysts and scientists can teach computers to solve problems without having to recode rules each time a new data set is presented. Using algorithms that learn by looking at hundreds or thousands of data samples, computers can make predictions based on these learned experiences to solve the same problem in new situations. And they're doing it with a level of accuracy that is beginning to mimic human intelligence.

Here are some of the top Machine Learning tools:

  1. IBM Machine Learning – https://console.ng.bluemix.net/catalog/services/ibm-watsonmachinelearning/
  2. Microsoft Azure Machine Learning – https://azure.microsoft.com/en-us/services/machinelearning/
  3. Google Machine Learning – https://cloud.google.com/products/machinelearning/
  4. Amazon Machine Learning – https://aws.amazon.com/machinelearning/
  5. Scikit-learn – https://github.com/scikit-learn/scikit-learn
  6. Shogun – https://github.com/shogun-toolbox/shogun
  7. Mahout – https://mahout.apache.org/
  8. Apache Spark MLlib – https://spark.apache.org/mllib/
  9. Weka – http://www.cs.waikato.ac.nz/ml/weka/
  10. Cloudera – http://www.cloudera.com/training/courses/intro-machine-learning.html
  11. BigML – https://bigml.com/
  12. TensorFlow – https://www.tensorflow.org/
  13. H2O – http://www.h2o.ai/
  14. Veles – https://velesnet.ml/

Here's the most recent Machine Intelligence Landscape, published in early 2016 by Shivon Zilis (shivonzilis.com).

Tell us about your ML experience with any of the tools above and any new tools that you want to share with other readers!


AI vs. Machine Learning vs. Deep Learning

Artificial Intelligence, Machine Learning and Deep Learning were among the big buzzwords of 2016 and continue to be going into 2017. These techniques have existed for decades, but their application to the business world has only recently been explored in the mainstream. Machine learning and deep learning are both fundamentally forms of artificial intelligence. While the concepts of machine learning and deep learning have been around since as early as the 1950s, they have evolved and diverged from each other in every decade with new technology and experiments.

As per Stanford University, AI is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable.

So where exactly did AI start? Hmmm…After WWII, a number of people independently started to work on intelligent machines. The English mathematician Alan Turing may have been the first. He gave a lecture on it in 1947. He also may have been the first to decide that AI was best researched by programming computers rather than by building machines. By the late 1950s, there were many researchers on AI, and most of them were basing their work on programming computers.

AI has several branches. Some of them are Search, Pattern Recognition, Logical AI, Heuristics, Genetic Programming, Epistemology and Ontology. A few of the common applications are games, speech recognition, vision, expert systems, natural language processing and heuristic classification.

Machine Learning (ML) is a form of AI that facilitates a computer’s ability to learn and essentially teach itself to evolve as it becomes exposed to new and ever-changing data. The main components of ML software are statistical analysis and predictive analysis. These algorithms can spot patterns and find hidden insights based on observed data from previous computations without being programmed on where to look. They learn from every experience and interaction.
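To make the idea of learning from observed data concrete, here is a minimal sketch using scikit-learn (tool #5 in the list above). The tiny dataset and its features are invented purely for illustration; it is a sketch of the fit-then-predict pattern, not a production recipe.

```python
# Minimal "learn from observed data, predict on new data" sketch with scikit-learn.
# The dataset is invented: each row is [hours_on_site, pages_viewed] and the
# label is 1 if that visitor ended up making a purchase.
from sklearn.linear_model import LogisticRegression

X_observed = [[0.5, 2], [1.0, 3], [3.5, 12], [4.0, 15], [0.2, 1], [5.0, 20]]
y_observed = [0, 0, 1, 1, 0, 1]

model = LogisticRegression()
model.fit(X_observed, y_observed)            # spot the pattern in past observations

# Apply the learned experience to a new, unseen situation.
print(model.predict([[2.8, 10]]))            # e.g. [1] -> likely to purchase
```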

If you use Netflix, you will notice that the suggested movies change and are personalized to your taste based on your previous views. While machine learning has become an integral part of processing data, one of the main differences compared to deep learning is that it requires manual intervention in selecting which features to process, whereas deep learning learns those features automatically.

Some of the common ML techniques are Linear Regression, K-means, Decision Trees, Random Forest, PCA, SVM and, finally, Artificial Neural Networks (ANN). Artificial Neural Networks are where the field of Deep Learning had its genesis.
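Two of the techniques named above, PCA and K-means, can be sketched in a few lines with scikit-learn; the random data here is purely illustrative.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.RandomState(0)
X = rng.rand(200, 5)                         # 200 toy samples with 5 features

# PCA: compress the 5 features down to 2 principal components.
X_2d = PCA(n_components=2).fit_transform(X)

# K-means: group the samples into 3 clusters without using any labels.
labels = KMeans(n_clusters=3, random_state=0).fit_predict(X_2d)
print(labels[:10])
```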

Deep Learning (DL) is an advanced, sophisticated branch of AI with predictive capabilities that is inspired by the brain's ability to learn. Andrew Ng, co-founder of Coursera and Chief Scientist at Baidu Research, founded the Google Brain project, which eventually resulted in the productization of deep learning technologies across a large number of Google services. Just as the human brain can identify an object in milliseconds, deep learning can mirror this instinct with nearly the same speed and precision. Deep learning has the nimble ability to assess an object, properly digest the information and adapt to different variants.

Deep Learning is used by Google in its voice and image recognition algorithms, by Netflix and Amazon to decide what you want to watch or buy next, and by researchers at MIT to predict the future. Extending deep learning into applications beyond speech and image recognition will require more conceptual and software breakthroughs, not to mention many more advances in processing power.
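As a sense of scale for what image recognition with deep learning looks like in code, here is a minimal sketch using TensorFlow (tool #12 above) and its built-in handwritten-digit dataset. The layer sizes and the single training epoch are arbitrary choices for illustration, not a recommended configuration.

```python
import tensorflow as tf

# Load a small, well-known image dataset (28x28 grayscale handwritten digits).
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0     # scale pixel values to [0, 1]

# A tiny fully connected network; real vision systems use much deeper convolutional stacks.
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

model.fit(x_train, y_train, epochs=1)                  # learn from labeled examples
print(model.evaluate(x_test, y_test, verbose=0))       # [loss, accuracy] on unseen images
```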

Interestingly, in Gartner's August 2016 Hype Cycle, Deep Learning isn't even mentioned on the slide.

Google had two deep-learning projects underway in 2012. Today it is pursuing more than 1,000, according to a spokesperson, across all its major product sectors, including search, Android, Gmail, translation, maps, YouTube, and self-driving cars. IBM's Watson system used AI, but not deep learning, when it beat two Jeopardy champions in 2011. Now, though, almost all of Watson's 30 component services have been augmented by deep learning. Most of the deep-learning applications that have been commercially deployed so far involve companies like Google, Microsoft, Facebook, Baidu, and Amazon, the companies with the vast stores of data needed for deep-learning computations.

One startup consulting company, 7F Consulting, promises to leverage Machine Learning and Deep Learning in its Strategy Advisory practice for advising the C-Suite on strategy decisions. Their framework is called 'Rananiti', meaning war strategy in Sanskrit. Rananiti uses a proprietary solution and framework that provides significant access to market insights and advises executives on critical business decisions with probabilistic recommendations.

Now, other companies have started experimenting and integrating deep learning into their own day-to-day processes. Are you the next one?

AI to the rescue of C-Suite

Artificial Intelligence (AI) has been discussed for the past several decades as science fiction and as experiments in research labs. In the past 2-3 years, however, AI has come to the forefront of discussion. It has started gaining adoption in the C-Suite and the business world. Many industries, including healthcare, retail, banking, consumer products, insurance, manufacturing and auto, are now experimenting with AI tools and technologies.

A 2015 Tech Pro Research survey indicated that 24 percent of businesses across industries were already using AI or had plans to do so within the year. There are three forms of AI: Assisted, Augmented and Autonomous. You can see these scenarios in the auto industry, which is moving cars from Assisted AI to Augmented AI, with a goal of having Autonomous AI-driven cars within the next year or so.

Now, how do we apply AI to business strategy and the decision-making process? Despite several advancements in cognitive technologies in the past year, a fully autonomous AI for business strategy is far from reality. However, one leading startup management consulting firm, 7F Consulting, claims to have a solution framework that incorporates market news, industry regulations, economic indicators, and social and competitive intelligence to act as an AI-assisted strategy advisor. Imagine a solution like this, integrated with Alexa or Siri, that can run a hypothesis through a decision tree for you and, using the data, come back with a probabilistic recommendation for you to consider. Over the course of multiple interactions, the system learns what insights you need, what actions you take and what drives results, and delivers tailored results using a management framework.
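To be clear, 7F Consulting's Rananiti framework is proprietary and its internals are not described here; the snippet below is only a toy sketch of the general idea of running a hypothesis through a decision tree and getting back a probabilistic recommendation, with entirely invented features and data.

```python
# Toy sketch only: NOT 7F Consulting's framework. Illustrates "decision tree +
# probabilistic recommendation" with invented features:
# [market_growth_pct, competitor_count, regulatory_risk_score]
from sklearn.tree import DecisionTreeClassifier

X_history = [[8, 2, 1], [1, 9, 7], [6, 3, 2], [0, 8, 9], [7, 4, 3], [2, 7, 8]]
y_history = [1, 0, 1, 0, 1, 0]              # 1 = past decision paid off, 0 = it did not

advisor = DecisionTreeClassifier(max_depth=3).fit(X_history, y_history)

new_scenario = [[5, 5, 4]]                   # the hypothesis an executive wants to test
print(advisor.predict_proba(new_scenario))   # probability of each outcome for the scenario
```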

The potential of Artificial Intelligence for organizations is enormous. If the projections turn out to be true and the AI market grows into a multi-billion dollar market in the coming years, doing business could take on a whole new meaning, requiring fewer employees while significantly improving bottom-line results. We are all aware of how jobs that once were the exclusive domain of humans, such as facial recognition, sarcastic comment analysis, automobile operation, and language translation, are now being done with software. If your organization is not already doing so, you should encourage it to undertake pilot projects involving AI to gain experience and better understand its capabilities and, perhaps more importantly, its limitations.

Blockchain: Basics and Hacks

A Blockchain is a digital platform that hosts a digital ledger of transactions and shares it among a distributed network of computers. The cryptography technology allows each participant on the network to manipulate the ledger in a secure way without the need for a central authority. Once a block of data is recorded on the Blockchain ledger, it’s extremely difficult to change or remove. When someone wants to add to it, participants in the network — all of which have copies of the existing Blockchain — run algorithms to evaluate and verify the proposed transaction. If a majority of nodes agree that the transaction looks valid — that is, identifying information matches the Blockchain’s history — then the new transaction will be approved and a new block added to the chain.
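A stripped-down sketch of the data structure described above, hash-linked blocks plus a validity check, might look like the following in Python. It deliberately omits networking, consensus voting and mining, so treat it as a conceptual illustration rather than a working Blockchain.

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's contents (which include the previous block's hash).
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, transactions):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "transactions": transactions, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)          # fingerprint computed over the block body
    chain.append(block)

def chain_is_valid(chain):
    # Recompute every hash; editing an earlier block breaks every later link.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, [{"from": "alice", "to": "bob", "amount": 5}])
add_block(ledger, [{"from": "bob", "to": "carol", "amount": 2}])
print(chain_is_valid(ledger))                  # True

ledger[0]["transactions"][0]["amount"] = 500   # tamper with recorded history...
print(chain_is_valid(ledger))                  # False: the tampering is detected
```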

[Image: Blockchain overview. Source: Financial Times]

A report from financial technology consultant Aite estimated that banks spent $75 million last year on blockchain. And Silicon Valley venture capitalists are also queuing up to back it.


Bitcoin's Blockchain is often touted as a revolutionary step forward for network security. But August's theft of nearly $68 million of customers' bitcoins from a Hong Kong-based exchange demonstrated that the currency is still a big risk.

The very fact that all Bitcoin transactions are permanent and cannot be undone gives hackers a free hand to steal bitcoins and get away with it. There are deliberate mechanisms built into the Bitcoin system so that altering a ledger entry in the blockchain invalidates all subsequent entries. It is therefore practically impossible to undo payments, in this case stolen bitcoins, unless the hacker himself agrees to return them. There are basically two ways a hacker could attack the Bitcoin system to steal bitcoins: he can either obtain the wallet keys of a user or group of users and use them to transfer all the bitcoins from those wallets to his own anonymous wallet, or he can hijack a Bitcoin mining pool and redirect all of its computing power to mine bitcoins for himself.

Kaspersky Labs and INTERPOL have presented research showing how blockchain-based cryptocurrencies can potentially be abused by disseminating arbitrary data through their public, decentralized databases. An attack on "The DAO" took place on 17th June 2016. Believe it or not, the developers knew of the vulnerability before that date (on the 12th of June). One of the DAO's creators, Stephan Tual, released a blog post in which he explained that even though a recursive-call bug existed in a similar smart contract framework (MakerDAO), the DAO was not at risk. While the developers were postponing the fix, the network was compromised and 3.6 million ETH (approximately $53 million at the time) were drained from the DAO. To put it into perspective, this was a third of its resources. Security issues will likely always be present in the Bitcoin world, and users will have to rely on cybersecurity firms to constantly innovate and provide solutions.

From IBM’s perspective, industrial-grade blockchain technologies have the following characteristics:

  • A shared, permissioned ledger is the append-only system of record (SOR) and single source of truth. It is visible to all participating members of the business network.
  • A consensus protocol agreed to by all participating members of the business network ensures that the ledger is updated only with network-verified transactions.
  • Cryptography ensures tamper-proof security, authentication, and integrity of transactions.
  • Smart contracts encapsulate participant terms of agreement for the business that takes place on the network; they are stored on the validating nodes in the blockchain and are triggered by transactions.
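The smart-contract bullet can be pictured as code that lives alongside the ledger and runs automatically when a matching transaction is appended. The sketch below is a deliberately naive illustration of that idea; it is not Hyperledger code and does not represent IBM's implementation.

```python
# Naive sketch of the "smart contract" idea: rules stored with the ledger that
# fire automatically when a matching transaction is appended. Illustration only.

ledger = []          # append-only list of transactions (the shared record)
contracts = []       # "smart contracts": (condition, action) pairs

def register_contract(condition, action):
    contracts.append((condition, action))

def append_transaction(tx):
    ledger.append(tx)                         # append-only: nothing is ever rewritten
    for condition, action in contracts:
        if condition(tx):                     # contract terms encoded as a predicate
            action(tx)

# Example contract: flag any payment over 10,000 for settlement review.
register_contract(
    condition=lambda tx: tx["amount"] > 10000,
    action=lambda tx: print("Contract triggered: review payment of", tx["amount"]),
)

append_transaction({"from": "acme", "to": "globex", "amount": 250})
append_transaction({"from": "acme", "to": "globex", "amount": 50000})   # fires the contract
```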

IBM is a premier code-contributing member of the Hyperledger Project, the Linux Foundation's open source collaborative effort to create a blockchain for business-to-business (B2B) and business-to-customer (B2C) transactions. IBM has contributed 44,000 lines of blockchain code to the Hyperledger Project. IBM's contributed code helps developers explore the use of blockchain in the enterprise as they build secure decentralized ledgers to exchange assets of value between participants. IBM describes its proposed contribution as a "low-level blockchain fabric that has been designed to meet the requirements of a variety of industry-focused use cases. It extends the learning of the pioneers in this field by addressing additional requirements needed to satisfy those broader industry use cases."

Cognitive 101

"This era will redefine the relationship between man and machine." – Ginni Rometty, IBM CEO

Cognitive computing is one of the most exciting developments in software technology in the past few years. Conceptually, cognitive computing focuses on enabling software models that simulate the human thought process. Cognitive computing offers fundamental differences in how systems are built and interact with humans. Cognitive-based systems, such as IBM Watson, are able to build knowledge and learn, understand natural language, and reason and interact more naturally with human beings than traditional systems. They are also able to put content into context with confidence-weighted responses and supporting evidence. More specifically, cognitive computing enables capabilities that simulate functions of the human brain such as voice, speech, and vision analysis. From this perspective, cognitive computing is becoming an essential element to enable the next wave of data intelligence for mobile and IoT solutions. Text, vision, and speech are common sources of data used by mobile and IoT solutions. Cognitive systems can quickly identify new patterns and insights. Over time, they will simulate even more closely how the brain actually works. In doing so, they could help us solve the world’s most complex problems by penetrating the complexity of big data and exploiting the power of natural language processing and machine learning.

IBM points to a survey of more than 5,000 C-suite executives by its Institute for Business Value (IBV), which found the following:

  • Sixty-five percent of insurance industry CXOs are pursuing some form of business model innovation, but nearly 30 percent feel the quality, accuracy and completeness of their data is insufficient.
  • Sixty percent of retail executives do not believe their company is equipped to deliver the level of individual experiences consumers demand, and 95 percent say they will invest in cognitive in the next five years.
  • The healthcare industry forecasts a 13 million person gap in qualified healthcare workers by 2035, and more than half of healthcare industry CXOs report that current constraints on their ability to use all available information limits their confidence about making strategic business decisions. Eighty-four percent of these leaders believe cognitive will be a disruptive force in healthcare and 95 percent plan to invest in it over the next five years.

The story was similar across all industries: executives surveyed by the IBV cited scarcity of skills and technical expertise — rather than security, privacy or maturity of the technology — as the primary barriers to cognitive adoption.

The most popular cognitive computing platform in the market, IBM Watson, provides a diverse set of APIs to enable capabilities such as vision, speech, text, and data analysis. Watson is now available to developers as part of the Watson Developer Cloud included in Bluemix distributions.

 

Good resources to learn more about Cognitive Computing are available from IBM:

http://www.research.ibm.com/cognitive-computing/index.shtml

http://www.ibm.com/cognitive/

Cognitive Internet of Things?

The Internet of Things (IoT) represents the extension and evolution of the Internet, and it has great potential and prospects for modern intelligent services and applications. However, our deep investigation shows that the current IoT is still based on traditional, static architectures and models. It lacks sufficient intelligence and cannot keep up with increasing application performance requirements. By integrating cognition into IoT, we arrive at a new concept, the Cognitive Internet of Things (CIoT), and its corresponding intelligent architecture.

Most of the current offerings from point-solution vendors for the Internet of Things (IoT) focus on how to connect devices so they can see, hear and smell the physical world around them and report their observations. However, I would argue that connectivity and reporting alone are not enough; the key is the capability to learn, think and understand physical, social and contextual data and apply intelligence to it. This requirement drives us to develop a new model called the "Cognitive" Internet of Things. What is cognitive? It is more appropriate to refer to "cognition" as an "integrative field" rather than a "discipline", since the study of cognition integrates many fields rooted in neuroscience, cognitive science, computer science, mathematics, physics, engineering and more.

As discussed in the Cognitive 101 section above, cognitive computing enables software models that simulate the human thought process, including capabilities such as voice, speech and vision analysis, which makes it an essential element of the next wave of data intelligence for mobile and IoT solutions.

As per IEEE, the Cognitive Internet of Things is a new network paradigm in which (physical or virtual) things or objects are interconnected and behave as agents. With minimum human intervention, the things interact with each other following a context-aware perception-action cycle, use the methodology of understanding-by-building to learn from both the physical environment and social networks, store the learned semantics and/or knowledge in various kinds of databases, and adapt themselves to changes or uncertainties via resource-efficient decision-making mechanisms (a simple sketch of such a perception-action loop follows the list below), with two primary objectives in mind:

  • bridging the physical world (with objects, resources, etc) and the social world (with human demand, social behavior, etc), together with themselves to form an intelligent physical-cyber-social (iPCS) system;
  • enabling smart resource allocation, automatic network operation, and intelligent service provisioning.
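The context-aware perception-action cycle mentioned above can be pictured as a simple loop in which a "thing" senses, decides, acts and updates what it has learned. The sketch below, with its invented temperature sensor, thresholds and learning rule, is only a conceptual illustration of that cycle.

```python
import random

# Conceptual sketch of a perception-action cycle for a "cognitive" thing.
# The sensor, thresholds and update rule are invented for illustration only.

knowledge = {"comfort_temp": 22.0}           # learned set-point stored by the device

def perceive():
    return {"temperature": random.uniform(15, 30)}   # stand-in for a real sensor reading

def decide(observation):
    # Choose an action from both the observation and previously learned knowledge.
    if observation["temperature"] > knowledge["comfort_temp"] + 2:
        return "cool"
    if observation["temperature"] < knowledge["comfort_temp"] - 2:
        return "heat"
    return "idle"

def learn(observation):
    # Slowly adapt the learned set-point toward observed conditions (toy update rule).
    knowledge["comfort_temp"] += 0.05 * (observation["temperature"] - knowledge["comfort_temp"])

for _ in range(5):                           # perceive -> decide -> act -> learn
    obs = perceive()
    action = decide(obs)
    learn(obs)
    print(round(obs["temperature"], 1), "C ->", action)
```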

The development of IoT depends on dynamic technical innovations in a number of fields, from wireless sensors to nanotechnology. Without comprehensive cognitive capability, IoT is just like an awkward stegosaurus: all muscle, no brains. To fulfill its potential and deal with growing challenges, we must take the cognitive capability into consideration and empower IoT with high-level intelligence.

Blockchain for Dummies!

Many companies are accepting bitcoins and many are not. Those that do include Target, Tesla, Whole Foods, Microsoft, Home Depot, Intuit, Dell, PayPal/eBay, Sears, Bloomberg.com and many others. With many companies accepting the change and others getting ready to, bitcoin is an extremely fast-spreading currency, and crypto-currencies have multiplied in the marketplace in recent years. QR codes are the biggest help in real-world bitcoin transfers: using a smartphone and a Bitcoin wallet app, a user scans a label and presses a small button aptly named "spend."

Every transaction that happens between a buyer and seller, a transferor and transferee, or any two members on the network is verified and validated by "miners" to ensure it is secure and there is no risk of double spending. These miners are analogous to VISA, MasterCard or Amex in the credit card world, which provide a platform to exchange, validate and authorize. A miner creates a block of records which holds a copy of all the verified transactions that have occurred in the network over the past 'n' minutes. Each transaction in every block is made at a specific time and linked to the previous block of transactions. Digital records are lumped together into "blocks", then bound together cryptographically and chronologically into a "chain" using complex mathematical algorithms. This process, known as "hashing", is carried out by lots of different computers. If they all agree on the answer, each block receives a unique digital signature. The group or chain of these blocks of transactions is referred to as the Blockchain.

The Blockchain is seen as the main technological innovation of Bitcoin, since it stands as proof of all the transactions on the network. Blockchain, or distributed ledger, technology is more secure, transparent, faster and less expensive than current financial systems. The distributed nature of a Blockchain database means that it's harder for hackers to attack it: they would have to get access to every copy of the database simultaneously to be successful. It also keeps data secure and private because the hash cannot be converted back into the original data; it's a one-way process.
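The one-way nature of hashing described above is easy to see with Python's standard library: the hash below is a fixed-length fingerprint of a block of records, it cannot be converted back into those records, and altering a single character produces a completely different value. The record format is invented for illustration.

```python
import hashlib

# A "block" of records hashed into a fixed-length fingerprint (SHA-256).
records = "alice->bob:5;bob->carol:2"
print(hashlib.sha256(records.encode()).hexdigest())

# Changing even one character produces a completely different hash, and the
# original records cannot be recovered from the hash alone (a one-way process).
tampered = "alice->bob:6;bob->carol:2"
print(hashlib.sha256(tampered.encode()).hexdigest())
```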

In short, Blockchain is a method of recording data: a digital ledger of transactions, agreements, contracts, or anything that needs to be independently recorded and verified as having happened. The big difference is that this ledger isn't stored in one place; it's distributed across several, hundreds or even thousands of computers around the world. In 2015, leading financial institutions such as Visa, Goldman Sachs, Citi and other Wall Street incumbents joined venture capital firms to pour $488 million into the industry. In a World Economic Forum report released in September, "Deep Shift: Technology Tipping Points and Societal Impacts," 58% of survey respondents said they expected that by the year 2025, 10% of global gross domestic product will be stored on Blockchain technology. If banks started sharing data using a tailor-made version of Blockchain, it could remove the need for middlemen and a lot of manual processing, and speed up transactions. If banks and other financial institutions are able to speed up transactions and take costs out of the system, it should mean cheaper, more efficient services for us all.

 
