Top 14 Machine Learning Tools for Data Scientists

IBM, Google, Microsoft, Amazon and several other companies offer Machine Learning tools and APIs as part of their cloud offerings. Last year, Amazon announced it was open-sourcing its deep learning library, the Deep Scalable Sparse Tensor Network Engine (DSSTNE), which is now available on GitHub. Google followed by opening its SyntaxNet neural network framework to developers building applications that process human language. In October 2016, IBM renamed its Predictive Analytics Service ‘Watson Machine Learning’, with a focus on providing deeper, more sophisticated self-learning capabilities as well as enhanced model management and deployment functionality within the service.

Machine learning has taken center stage in many advanced analytical and predictive models. If you are a user of Amazon.com or Netflix, you are already consuming machine learning recommendations. Similarly, most websites customize and personalize themselves to your taste, behavior and style using machine learning that learns from previous patterns, enabling cognitive systems to learn, reason and engage with you in a natural and personalized way.

Machine Learning has entered virtually every industry, from Retail, Manufacturing, Banking, Cable, Games and Sports to Media/Entertainment and many more. Today’s machine learning uses analytic models and algorithms that iteratively learn from data, allowing computers to find hidden insights without being explicitly programmed where to look. This means data analysts and scientists can teach computers to solve problems without having to recode rules each time a new data set is presented. Using algorithms that learn by looking at hundreds or thousands of data samples, computers can make predictions based on these learned experiences to solve the same problem in new situations. And they’re doing it with a level of accuracy that is beginning to mimic human intelligence.

Here are some of the top Machine Learning tools (a quick scikit-learn example follows the list):

  1. IBM Machine Learning – https://console.ng.bluemix.net/catalog/services/ibm-watsonmachinelearning/
  2. Microsoft Azure Machine Learning – https://azure.microsoft.com/en-us/services/machinelearning/
  3. Google Machine Learning – https://cloud.google.com/products/machinelearning/
  4. Amazon Machine Learning – https://aws.amazon.com/machinelearning/
  5. Scikit-learn – https://github.com/scikit-learn/scikit-learn
  6. Shogun – https://github.com/shogun-toolbox/shogun
  7. Mahout – https://mahout.apache.org/
  8. Apache Spark MLlib – https://spark.apache.org/mllib/
  9. Weka – http://www.cs.waikato.ac.nz/ml/weka/
  10. Cloudera – http://www.cloudera.com/training/courses/intro-machine-learning.html
  11. BigML – https://bigml.com/
  12. TensorFlow – https://www.tensorflow.org/
  13. H2O – http://www.h2o.ai/
  14. Veles – https://velesnet.ml/
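
To give a flavor of how the open-source entries on this list are used in practice, here is a minimal, illustrative sketch with scikit-learn (item 5 above). It trains a random forest classifier on scikit-learn’s bundled Iris dataset; the dataset, model choice and parameters are arbitrary examples, not recommendations.

```python
# Minimal scikit-learn example: train and evaluate a random forest classifier.
# Requires: pip install scikit-learn
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load a small, built-in dataset (150 flower samples, 4 features, 3 classes).
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

# Fit an ensemble of decision trees and score it on held-out data.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)
print("Test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```

The cloud services at the top of the list (IBM, Azure, Google, Amazon) wrap similar model training behind managed APIs, so the same workflow applies: prepare data, train, evaluate, deploy.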

Here’s the most recent Machine Intelligence Landscape, published in early 2016 by Shivon Zilis (shivonzilis.com):

Tell us about your ML experience with any of the tools above and any new tools that you want to share with other readers!

AI vs. Machine Learning vs. Deep Learning

Artificial Intelligence, Machine Learning and Deep Learning were among the most hyped terms of 2016, and the hype is carrying into 2017. These techniques have existed for decades, but their application to the business world has only recently been explored in the mainstream. Machine learning and deep learning are both fundamentally forms of artificial intelligence. While the concepts of machine learning and deep learning have been around since the 1950s, they have evolved and diverged from each other with each decade of new technology and experiments.

As per Stanford University, AI is the science and engineering of making intelligent machines, especially intelligent computer programs. It is related to the similar task of using computers to understand human intelligence, but AI does not have to confine itself to methods that are biologically observable.

So where exactly did AI start? Hmmm… After WWII, a number of people independently started to work on intelligent machines. The English mathematician Alan Turing may have been the first; he gave a lecture on the subject in 1947. He may also have been the first to decide that AI was best researched by programming computers rather than by building machines. By the late 1950s, there were many researchers working on AI, and most of them were basing their work on programming computers.

AI has several branches. Some of them are Search, Pattern Recognition, Logical AI, Heuristics, Genetic Programming, Epistemology and Ontology. A few of the common applications are Games, Speech Recognition, Vision, Expert Systems, Natural Language Processing and Heuristic Classification.

Machine Learning (ML) is a form of AI that facilitates a computer’s ability to learn and essentially teach itself to evolve as it becomes exposed to new and ever-changing data. The main components of ML software are statistical analysis and predictive analysis. These algorithms can spot patterns and find hidden insights based on observed data from previous computations without being programmed on where to look. They learn from every experience and interaction.

If you use Netflix, you will notice that the suggested movies change and are personalized to your taste based on your previous viewing. While machine learning has become an integral part of processing data, one of the main differences compared to deep learning is that machine learning requires manual intervention in selecting which features to process, whereas deep learning learns those features on its own.

Some common ML techniques are Linear Regression, K-means, Decision Trees, Random Forest, PCA, SVM and, finally, Artificial Neural Networks (ANN). Artificial Neural Networks are where the field of Deep Learning had its genesis.
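
As a small illustration of two of the techniques named above, here is a hedged sketch using scikit-learn (the built-in Iris data is purely a stand-in): it reduces the data to two principal components with PCA and then clusters it with K-means. The parameter choices are arbitrary examples.

```python
# Illustrative sketch: PCA for dimensionality reduction, then K-means clustering.
# Requires: pip install scikit-learn
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

X, _ = load_iris(return_X_y=True)

# Project the 4-dimensional measurements onto 2 principal components.
pca = PCA(n_components=2)
X_2d = pca.fit_transform(X)
print("Explained variance ratio:", pca.explained_variance_ratio_)

# Group the projected points into 3 clusters without using any labels.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(X_2d)
print("Cluster sizes:", [int((labels == k).sum()) for k in range(3)])
```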

Deep Learning (DL) is an advanced, sophisticated branch of AI with predictive capabilities that is inspired by the brain’s ability to learn. Andrew Ng, co-founder of Coursera and Chief Scientist at Baidu Research, founded the Google Brain project, which eventually resulted in the productization of deep learning technologies across a large number of Google services. Just as the human brain can identify an object in milliseconds, deep learning can mirror this instinct with nearly the same speed and precision. Deep learning has the nimble ability to assess an object, properly digest the information and adapt to different variants.

Deep Learning is used by Google in its voice and image recognition algorithms, by Netflix and Amazon to decide what you want to watch or buy next, and by researchers at MIT to predict the future. Extending deep learning into applications beyond speech and image recognition will require more conceptual and software breakthroughs, not to mention many more advances in processing power.
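
To make the contrast with classic ML concrete, here is a minimal, illustrative deep learning sketch using TensorFlow’s Keras API (TensorFlow is item 12 in the tool list above). It stacks a few dense layers on synthetic placeholder data, so the architecture, data and hyperparameters are assumptions for demonstration only, not a production recipe.

```python
# Tiny deep learning sketch with TensorFlow/Keras on synthetic data.
# Requires: pip install tensorflow
import numpy as np
import tensorflow as tf

# Synthetic stand-in data: 1,000 samples, 20 features, binary labels.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20)).astype("float32")
y = (X[:, 0] + X[:, 1] > 0).astype("float32")

# A small multi-layer ("deep") network of dense layers.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(20,)),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(X, y, epochs=5, batch_size=32, verbose=0)
print("Training accuracy:", model.evaluate(X, y, verbose=0)[1])
```

The key difference from the earlier scikit-learn example is that here the stacked layers learn their own intermediate representations of the input rather than relying on hand-selected features.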

Here is Gartner’s August 2016 Hype Cycle and Deep Learning isn’t even mentioned on the slide:

Google had two deep-learning projects under way in 2012. Today it is pursuing more than 1,000, according to a spokesperson, across all its major product areas, including search, Android, Gmail, translation, maps, YouTube, and self-driving cars. IBM’s Watson system used AI, but not deep learning, when it beat two Jeopardy champions in 2011. Now, though, almost all of Watson’s 30 component services have been augmented by deep learning. Most of the deep-learning applications that have been commercially deployed so far involve companies like Google, Microsoft, Facebook, Baidu, and Amazon, the companies with the vast stores of data needed for deep-learning computations.

One startup consulting company, 7F Consulting, promises to leverage Machine Learning and Deep Learning in its Strategy Advisory practice to advise the C-suite on strategy decisions. Its framework is called ‘Rananiti’, meaning war strategy in Sanskrit. Rananiti uses a proprietary solution and framework that provides access to market insights and advises executives on critical business decisions with probabilistic recommendations.

Now, other companies have started experimenting and integrating deep learning into their own day-to-day processes. Are you the next one?

AI to the rescue of C-Suite

Artificial Intelligence (AI) has been discussed for the past several decades as science fiction and as experiments in research labs. In the past 2-3 years, however, AI has come to the forefront of the discussion. It has started gaining adoption in the C-suite and the business world. Many industries, including Healthcare, Retail, Banking, Consumer Products, Insurance, Manufacturing and Auto, are now experimenting with AI tools and technologies.

A 2015 Tech Pro Research survey indicated that 24 percent of businesses across industries were already using AI or had plans to do so within the year. There are three forms of AI: Assisted, Augmented and Autonomous. You can see all three in the auto industry, which is moving cars from Assisted AI to Augmented AI, with the goal of fully Autonomous AI-driven cars within the next year or so.

Now, how do we apply AI to business strategy and the decision-making process? Despite several advancements in cognitive technologies in the past year, a fully autonomous AI for business strategy is far from reality. However, one leading startup management consulting firm, 7F Consulting, claims to have a solution framework that takes in market news, industry regulations, economic indicators, and social and competitive intelligence to act as an AI-assisted strategy advisor. Imagine a solution like this integrated with Alexa or Siri: it could run a hypothesis through a decision tree for you and, using the data, come back with a probabilistic recommendation for you to consider. Over the course of multiple interactions, the system learns what insights you need, what actions you take and what drives results, and uses a management framework to deliver tailored results.
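
Nothing about 7F Consulting’s proprietary Rananiti framework is public, but the general idea of “running a hypothesis through a decision tree and returning a probabilistic recommendation” can be sketched in a few lines. The scenario, feature names and data below are entirely made up for illustration.

```python
# Toy sketch of a "probabilistic decision" from a decision tree (illustrative only).
# Requires: pip install scikit-learn
from sklearn.tree import DecisionTreeClassifier

# Hypothetical historical scenarios: [market_growth_%, competitor_moves, cash_ratio]
X = [[5, 1, 0.8], [1, 3, 0.4], [7, 0, 0.9], [2, 2, 0.3], [6, 1, 0.7], [0, 4, 0.2]]
y = ["expand", "hold", "expand", "hold", "expand", "hold"]  # decisions taken in the past

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# A new scenario: modest growth, some competitor activity, mid cash position.
scenario = [[3, 2, 0.5]]
proba = tree.predict_proba(scenario)[0]
for label, p in zip(tree.classes_, proba):
    print(f"P({label}) = {p:.2f}")   # a probabilistic recommendation, not a verdict
```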

The potential of Artificial Intelligence for organizations is enormous, and if the projections turn out to be true and the AI market grows into a multi-billion-dollar market in the coming years, doing business can take on a whole new meaning, requiring fewer employees while significantly improving bottom-line results. We are all aware of how jobs that were once the exclusive domain of humans, such as facial recognition, sarcastic-comment analysis, automobile operation and language translation, are now being done with software. If your organization is not already doing so, you should encourage it to undertake pilot projects involving AI to gain experience and better understand its capabilities and, perhaps more important, its limitations.

Blockchain: Basics and Hacks

A Blockchain is a digital platform that hosts a digital ledger of transactions and shares it among a distributed network of computers. The cryptography technology allows each participant on the network to manipulate the ledger in a secure way without the need for a central authority. Once a block of data is recorded on the Blockchain ledger, it’s extremely difficult to change or remove. When someone wants to add to it, participants in the network — all of which have copies of the existing Blockchain — run algorithms to evaluate and verify the proposed transaction. If a majority of nodes agree that the transaction looks valid — that is, identifying information matches the Blockchain’s history — then the new transaction will be approved and a new block added to the chain.

[Image: blockchain diagram (source: Financial Times)]
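
As a concrete (and deliberately simplified) illustration of the description above, here is a toy hash-chained ledger in Python: each block stores the hash of the previous block, so tampering with an old record is immediately detectable. Real blockchains add consensus, proof-of-work and peer-to-peer networking on top of this basic idea; this sketch is for intuition only.

```python
# Toy blockchain: each block's hash covers its data plus the previous block's hash.
import hashlib
import json

def block_hash(block):
    # Hash a block's contents deterministically with SHA-256.
    payload = json.dumps({k: block[k] for k in ("index", "data", "prev_hash")},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain, data):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    chain.append(block)

def is_valid(chain):
    # The chain is valid only if every block's stored hash and back-link check out.
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(is_valid(chain))                    # True
chain[0]["data"] = "Alice pays Bob 500"   # tamper with an old record
print(is_valid(chain))                    # False: the tampering is detected
```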

A report from financial technology consultant Aite estimated that banks spent $75 million last year on blockchain. And Silicon Valley venture capitalists are also queuing up to back it.


Bitcoin’s Blockchain is often touted as a revolutionary step forward for network security. But the theft in August of nearly $68 million worth of customers’ bitcoins from a Hong Kong-based exchange demonstrated that the currency still carries big risks.

The very fact that all Bitcoin transactions are permanent and cannot be undone gives hackers an opening to steal bitcoins and get away with it. The Bitcoin system is deliberately designed so that altering a ledger entry in the blockchain invalidates all subsequent entries, so it is practically impossible to undo payments (in this case, stolen bitcoins) unless the hacker agrees to return them. There are basically two ways a hacker could attack the Bitcoin system to steal bitcoins: either obtain the wallet keys of a user or group of users and use them to transfer all bitcoins from the victims’ wallets to an anonymous wallet, or hijack a Bitcoin mining pool and redirect all of its computing power to mine bitcoins for himself.

Kaspersky Labs and INTERPOL have presented research showing how blockchain-based cryptocurrencies can potentially be abused to disseminate arbitrary data through their public decentralized databases. The attack on “The DAO” took place on 17 June 2016. Believe it or not, the developers knew of the vulnerability before that date (on 12 June). One of the DAO’s creators, Stephan Tual, published a blog post explaining that even though a recursive-call bug existed in a similar smart contract framework (MakerDAO), the DAO was not at risk. While the developers were postponing the fix, the network was compromised and 3.6 million ETH (approximately $53 million at the time) were drained from the DAO, roughly a third of its resources. Security issues will likely always be present in the Bitcoin world, and users will have to rely on cybersecurity firms to constantly innovate and provide solutions.

From IBM’s perspective, industrial-grade blockchain technologies have the following characteristics:

  • A shared, permissioned ledger is the append-only system of record (SOR) and single source of truth. It is visible to all participating members of the business network.
  • A consensus protocol agreed to by all participating members of the business network ensures that the ledger is updated only with network-verified transactions (see the toy consensus sketch after this list).
  • Cryptography ensures tamper-proof security, authentication, and integrity of transactions.
  • Smart contracts encapsulate the terms of agreement between participants for the business that takes place on the network; they are stored on the validating nodes in the blockchain and triggered by transactions.
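
The consensus bullet above is what keeps the ledger honest. Here is a deliberately naive sketch of the idea: a transaction is appended only if a majority of validating nodes approve it. Real consensus protocols (PBFT, Raft, proof-of-work and others) are far more involved; the node logic below is a made-up placeholder.

```python
# Naive majority-consensus sketch (illustration only, not a real protocol).
from typing import Callable, List

def reaches_consensus(tx: dict, validators: List[Callable[[dict], bool]]) -> bool:
    # Each validator independently checks the proposed transaction.
    approvals = sum(1 for validate in validators if validate(tx))
    return approvals > len(validators) / 2   # strict majority required

# Placeholder validation rule that every node happens to share in this toy example.
def simple_rule(tx: dict) -> bool:
    return tx.get("amount", 0) > 0 and tx.get("from") != tx.get("to")

nodes = [simple_rule] * 5   # five validating nodes
tx = {"from": "Alice", "to": "Bob", "amount": 3}
ledger = []
if reaches_consensus(tx, nodes):
    ledger.append(tx)       # only network-verified transactions reach the ledger
print(ledger)
```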

IBM is a premier code-contributing member of the Hyperledger Project, the Linux Foundation’s open source collaborative effort to create a blockchain for business-to-business (B2B) and business-to-customer (B2C) transactions. IBM has contributed 44,000 lines of blockchain code to the Hyperledger Project. The contributed code helps developers explore the use of blockchain in the enterprise as they build secure decentralized ledgers to exchange assets of value between participants. IBM describes its proposed contribution as a “low-level blockchain fabric that has been designed to meet the requirements of a variety of industry-focused use cases. It extends the learning of the pioneers in this field by addressing additional requirements needed to satisfy those broader industry use cases.”

Cognitive 101

“This era will redefine the relationship between man and machine.” – Ginni Rometty, IBM CEO

Cognitive computing is one of the most exciting developments in software technology in the past few years. Conceptually, cognitive computing focuses on enabling software models that simulate the human thought process. Cognitive computing offers fundamental differences in how systems are built and interact with humans. Cognitive-based systems, such as IBM Watson, are able to build knowledge and learn, understand natural language, and reason and interact more naturally with human beings than traditional systems. They are also able to put content into context with confidence-weighted responses and supporting evidence. More specifically, cognitive computing enables capabilities that simulate functions of the human brain such as voice, speech, and vision analysis. From this perspective, cognitive computing is becoming an essential element to enable the next wave of data intelligence for mobile and IoT solutions. Text, vision, and speech are common sources of data used by mobile and IoT solutions. Cognitive systems can quickly identify new patterns and insights. Over time, they will simulate even more closely how the brain actually works. In doing so, they could help us solve the world’s most complex problems by penetrating the complexity of big data and exploiting the power of natural language processing and machine learning.

IBM points to a survey of more than 5,000 C-suite executives by its Institute for Business Value (IBV), which found the following:

  • Sixty-five percent of insurance industry CXOs are pursuing some form of business model innovation, but nearly 30 percent feel the quality, accuracy and completeness of their data is insufficient.
  • Sixty percent of retail executives do not believe their company is equipped to deliver the level of individual experiences consumers demand, and 95 percent say they will invest in cognitive in the next five years.
  • The healthcare industry forecasts a 13 million-person gap in qualified healthcare workers by 2035, and more than half of healthcare industry CXOs report that current constraints on their ability to use all available information limit their confidence about making strategic business decisions. Eighty-four percent of these leaders believe cognitive will be a disruptive force in healthcare, and 95 percent plan to invest in it over the next five years.

The story was similar across all industries: executives surveyed by the IBV cited scarcity of skills and technical expertise — rather than security, privacy or maturity of the technology — as the primary barriers to cognitive adoption.

The most popular cognitive computing platform on the market, IBM Watson provides a diverse set of APIs that enable capabilities such as vision, speech, text, and data analysis. Watson is now available to developers as part of the Watson Developer Cloud included in Bluemix.
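
The Watson services on Bluemix are exposed as REST APIs with language SDKs. The exact endpoints, service names and credentials vary by service and are documented in the Bluemix catalog, so the URL, request parameters and response fields below are placeholders, not the real Watson API; the sketch only shows the general call-a-cognitive-service-over-HTTP pattern.

```python
# Hedged sketch of calling a cognitive text-analysis REST API (placeholder endpoint).
# Requires: pip install requests
import requests

API_URL = "https://example.com/v1/analyze"   # placeholder, not a real Watson URL
API_KEY = "YOUR_API_KEY"                     # credentials come from your service instance

def analyze_text(text: str) -> dict:
    # Send the text and ask the (hypothetical) service for sentiment and keywords.
    response = requests.post(
        API_URL,
        auth=("apikey", API_KEY),
        json={"text": text, "features": ["sentiment", "keywords"]},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(analyze_text("Cognitive systems understand natural language."))
```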


To learn more about Cognitive Computing, these IBM resources are a good place to start:

http://www.research.ibm.com/cognitive-computing/index.shtml

http://www.ibm.com/cognitive/

Cognitive Internet of Things?

The Internet of Things (IoT) represents the extension and evolution of the Internet, with great potential and prospects for modern intelligent services and applications. However, based on our investigation, the current IoT is still built on traditional, static architectures and models. It lacks sufficient intelligence and cannot keep up with increasing application performance requirements. By integrating cognition into IoT, we present a new concept, the Cognitive Internet of Things (CIoT), and a corresponding intelligent architecture.

Most of the current offerings from point-solution vendors for the Internet of Things (IoT) focus on how to connect devices so they can see, hear and smell the physical world around them and report those observations. I would argue that connectivity and reporting alone are not enough; the key is the capability to learn, think and understand physical, social and contextual data and to apply intelligence to it. This requirement drives us to develop a new model: the “Cognitive” Internet of Things. What is cognitive? It is more appropriate to refer to “cognition” as an “integrative field” rather than a “discipline”, since the study of cognition integrates many fields rooted in neuroscience, cognitive science, computer science, mathematics, physics, engineering and more.

As per IEEE, the Cognitive Internet of Things is a new network paradigm in which (physical or virtual) things or objects are interconnected and behave as agents with minimum human intervention. The things interact with each other following a context-aware perception-action cycle, use the methodology of understanding-by-building to learn from both the physical environment and social networks, store the learned semantics and/or knowledge in various kinds of databases, and adapt themselves to changes or uncertainties via resource-efficient decision-making mechanisms, with two primary objectives in mind:

  • bridging the physical world (with objects, resources, etc.) and the social world (with human demand, social behavior, etc.), together with themselves, to form an intelligent physical-cyber-social (iPCS) system;
  • enabling smart resource allocation, automatic network operation, and intelligent service provisioning.

The development of IoT depends on dynamic technical innovations in a number of fields, from wireless sensors to nanotechnology. Without comprehensive cognitive capability, IoT is just like an awkward stegosaurus: all muscle, no brains. To fulfill its potential and deal with growing challenges, we must take the cognitive capability into consideration and empower IoT with high-level intelligence.
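
To make the “context-aware perception-action cycle” in the IEEE definition less abstract, here is a toy sketch of a single cognitive “thing”: a thermostat-like agent that perceives a reading, combines it with learned context, acts, and adjusts what it has learned. The sensor, thresholds and learning rule are all invented for illustration.

```python
# Toy perception-action cycle for a single "cognitive thing" (illustrative only).
import random

class CognitiveThermostat:
    def __init__(self):
        self.setpoint = 21.0          # learned comfort temperature (made-up start)

    def perceive(self) -> float:
        # Stand-in for a real sensor reading.
        return random.uniform(15.0, 28.0)

    def act(self, reading: float) -> str:
        # Decide using current context (the learned setpoint).
        if reading < self.setpoint - 1:
            return "heat"
        if reading > self.setpoint + 1:
            return "cool"
        return "idle"

    def learn(self, feedback: float):
        # Nudge the setpoint toward observed user preference (toy learning rule).
        self.setpoint += 0.1 * (feedback - self.setpoint)

agent = CognitiveThermostat()
for step in range(5):
    reading = agent.perceive()
    action = agent.act(reading)
    agent.learn(feedback=22.0)        # pretend the user prefers 22 degrees
    print(f"step {step}: read {reading:.1f} -> {action}")
```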

Blockchain for Dummies!

Many companies are accepting bitcoins and many are not. The list of those that do includes Target, Tesla, Whole Foods, Microsoft, Home Depot, Intuit, Dell, PayPal/eBay, Sears, Bloomberg.com and many others. With many companies accepting the change and others getting ready to, bitcoin is an extremely fast-spreading currency, and cryptocurrencies have multiplied in the marketplace in recent years. QR codes are the biggest help in real-world bitcoin transfers: using a smartphone and a Bitcoin wallet app, a user scans a label and presses a small button aptly named “spend.”

Every transaction that happens between a buyer and seller, a transferor and transferee, or any two members on the network is verified and validated by “miners” to ensure it is secure and there is no risk of double spending. These miners play a role similar to Visa, MasterCard or Amex in the credit card world, providing a platform to exchange, validate and authorize. A miner creates a block of records which holds a copy of all the verified transactions that have occurred in the network over the past ‘n’ minutes. Each transaction in every block is made at a specific time and linked to the previous block of transactions. Digital records are lumped together into “blocks”, then bound together cryptographically and chronologically into a “chain” using complex mathematical algorithms. This encryption process, known as “hashing”, is carried out by lots of different computers; if they all agree on the answer, each block receives a unique digital signature. The chain of these blocks of transactions is referred to as the Blockchain. The Blockchain is seen as the main technological innovation of Bitcoin, since it stands as proof of all the transactions on the network. Blockchain, or distributed ledger, technology is more secure, transparent, faster and less expensive than current financial systems. The distributed nature of a Blockchain database means that it is harder for hackers to attack: they would have to get access to every copy of the database simultaneously to succeed. It also keeps data secure and private because the hash cannot be converted back into the original data; hashing is a one-way process.
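
The “one-way” property of hashing mentioned above is easy to see directly: SHA-256 turns any input into a fixed-length digest, a single-character change produces a completely different digest, and there is no practical way to run the function backwards. A short demonstration using Python’s standard hashlib (no assumptions beyond that):

```python
# Demonstrating the one-way, tamper-evident nature of cryptographic hashing.
import hashlib

original = "Alice pays Bob 5 BTC"
tampered = "Alice pays Bob 6 BTC"   # a single character changed

h1 = hashlib.sha256(original.encode()).hexdigest()
h2 = hashlib.sha256(tampered.encode()).hexdigest()

print(h1)
print(h2)
# The digests share essentially nothing, and neither reveals the original text.
matching = sum(a == b for a, b in zip(h1, h2))
print(f"matching hex characters: {matching} of {len(h1)}")
```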

In short, Blockchain is a method of recording data: a digital ledger of transactions, agreements, contracts, anything that needs to be independently recorded and verified as having happened. The big difference is that this ledger isn’t stored in one place; it’s distributed across several, hundreds or even thousands of computers around the world. In 2015, leading financial institutions such as Visa, Goldman Sachs, Citi and other Wall Street incumbents joined venture capital firms to pour $488 million into the industry. In a World Economic Forum report released in September, “Deep Shift: Technology Tipping Points and Societal Impacts,” 58% of survey respondents said they expected that by the year 2025, 10% of global gross domestic product will be stored on Blockchain technology. If banks started sharing data using a tailor-made version of Blockchain, it could remove the need for middlemen and a lot of manual processing, and speed up transactions. If banks and other financial institutions are able to speed up transactions and take costs out of the system, it should mean cheaper, more efficient services for us.


The Big Data Institute – Top Ten 2016 Predictions

[Image: customer experience predictions]

Here are our predictions for 2016 that The Big Data Institute sees shaping your businesses:

1. Customer Digital Experience will take center stage for companies competing to win mindshare and share of wallet. The majority of customer transactions will be initiated through mobile platforms: smartphones, tablets, phablets, wearable devices, etc.

2. Analytics will become a secret weapon for many companies, with Big Data and Internet of Things projects on the rapid rise. Data Management and Advanced Analytics will become more sophisticated yet more business-user friendly.

3. Privacy and Security will continue to be a top priority for companies as consumer behavior and new laws drive corporations toward compliance requirements.

4. Consumers will start monetizing their own data through various mediums.

5. Leadership ranks and roles will be transformed, with primary corporate roles such as Chief Data Officer, Chief Analytics Officer and Chief Intelligence Officer becoming dominant roles driving IT and Business.

6. Artificial Intelligence, Robotics and Cognitive Computing will be at the top of the hype curve as companies start exploring and piloting AI and cognitive solutions, while Big Data and IoT move toward the plateau.

7. M&A will be on the rise, with several acquisitions in IoT, Digital/Mobile and Security solutions.

8. Cloud will become the new standard and, in many cases, the first choice of platform.

9. Business-led IT purchase decisions (solution, software, and project) will be on the rise. In some companies, IT may be decentralized to address speed and agility requirements.

10. The industry will move more toward pre-packaged/prebuilt solutions, including point solutions that build toward transformation projects, instead of large in-house builds.

Cognitive Computing: From hype and pilots to mainstream implementation

Cognitive Computing started in the late 1960s with initial innovation around supercomputers; however, the decision-making capability, simulation and self-learning intelligence were limited. With major advances in computing power, cognitive behavioral science and computational intelligence, researchers in Cognitive Computing have made significant advancements. With advances in medical science and neuropsychology, computer scientists were able to study the mechanics of the human brain, which allowed them to build computational models patterned after the human mind. These models associate various attributes and parameters from past experiences within cognitive systems. Cognitive computing and Decision Science were born as scientists developed computers that operated at a higher rate of speed and accuracy than the human brain.

As per IBM, Cognitive computing systems learn and interact naturally with people to extend what either humans or machines could do on their own. They help human experts make better decisions by penetrating the complexity of Big Data. Big Data growth is accelerating as more of the world’s activity is expressed digitally. Not only is it increasing in volume, but also in speed, variety and uncertainty. Most data now comes in unstructured forms such as video, images, symbols and natural language; a new computing model is needed for businesses to process and make sense of it, and to enhance and extend the expertise of humans. Rather than being programmed to anticipate every possible answer or action needed to perform a function or set of tasks, cognitive computing systems are trained using artificial intelligence (AI) and machine learning algorithms to sense, predict, infer and, in some ways, think.

Cognitive computing reassesses the nature of relationships between multiple variables and environmental factors. It can help assess the nature of the relationship between people and today’s increasingly pervasive digital environment. Cognitive systems may play the role of assistant or coach for the user, or they may act virtually autonomously in many problem-solving situations. The boundaries of the processes and domains these systems will affect are still elastic and emergent. They must learn as attributes and variables change, and as questions and decisions evolve. They must resolve ambiguity and tolerate unpredictability. They must be engineered to feed on dynamic data in real time, or near real time. They must interact easily with users so that those users can define their needs comfortably. They may also interact with other processors, devices and cloud services, as well as with people. They must aid in defining a problem by asking questions or finding additional source input if a problem statement is ambiguous or incomplete. They must “remember” previous interactions in a process and return information that is suitable for the specific application at that point in time. They must understand, identify and extract contextual elements such as meaning, syntax, time, location, appropriate domain, regulations, the user’s profile, process, task and goal. They may draw on multiple sources of information, including both structured and unstructured digital information, as well as sensory inputs (visual, gestural, auditory or sensor-provided). The output from cognitive systems may be prescriptive, suggestive, instructive or simply entertaining.

Cognitive informatics is a cutting-edge and multi-disciplinary research field that tackles the fundamental problems shared by modern informatics, computation, software engineering, AI, computational intelligence, cybernetics, cognitive science, neuropsychology, medical science, systems science, philosophy, linguistics, economics, management science, and life sciences. The development and the cross fertilization between the aforementioned science and engineering disciplines have led to a whole range of emerging research areas known as cognitive informatics.

Sample Cognitive Computing Use Cases

  1. Shopping: Cognitive computing systems’ ability to evaluate and generate hypotheses will help retail industries to find patterns, correlations and insights in mountains of unstructured and structured data. Watson’s app development platform is already moving into this physical-virtual space. The startup, Fluid, has layered Watson on top of its Expert Personal Shopper app for retail brands. Watson will be your personal shopping assistant. Store associates will also have similar intelligent tech providing them instant product information, customer loyalty data, sales histories, user reviews, blogs and magazines, so that when you do need to talk with another human, they know exactly how to help.
  2. Medical: Evidence-based learning, hypothesis generation and natural-language skills could help medical professionals make key decisions in patient diagnosis and treatment. The objective is to give doctors and surgeons a quick way to access diagnostic and treatment options based on up-to-date research.
  3. Banking: In fraud detection, financial institutions could have cognitive tools that enable them to go beyond analyses of cardholders’ credit transaction histories; cognitive computing might provide them with new “associational” intelligence, such as when an individual is most likely to make purchases, what he is likely to buy, and under what circumstances.
  4. Finance: Benefits from Cognitive Computing will also be seen by financial advisors, including individuals who manage their own portfolios, as the technology brings together relevant, current and personalized information and remembers questions and preferences.
  5. Weather Forecasting and Planning: Weather forecasting can benefit from Cognitive Computing and big data analytics. For instance, IBM’s Deep Thunder, a research project that creates precise, local weather forecasts, can predict severe storms in a specific area up to three days before the event. This early-warning system gives local authorities and residents enough time to make preparations.

As per Deloitte University Press – Cognitive analytics can help address some key challenges. It can improve prediction accuracy, provide augmentation and scale to human cognition, and allow tasks to be performed more efficiently (and automatically) via context-based suggestions. For organizations that want to improve their ability to sense and respond, cognitive analytics offers a powerful way to bridge the gap between the promise of big data and the reality of practical decision making.

IBM has calculated that the market size for cognitive computing services in the private sector is in the neighborhood of $50 billion. At present, there are very few vendors in the field. While IBM has announced the creation of the Watson Group to commercialize cognitive computing and Google has acquired the AI startup DeepMind, there are few companies in the space. Much of the work is still happening at the university level or within research organizations. Cognitive computing is still early from a commercialization perspective; it is likely another three to five years before industry-wide adoption and impact on a wide range of companies. For a while at least, cognitive computing will fill a niche in industries where highly complex decision making is the norm, such as healthcare and financial markets. Eventually, it will become a normal tool in every corporate toolbox, helping to augment human decision making.

Top 10 Big Data and Analytics Predictions for 2015

Predictions for 2015

Forrester forecasts that Hadoop will become an “enterprise priority” in the next 12 months, and International Data Corp. has just gazed into its own crystal ball and sees a future where spending on Big Data analytics is set to grow three times faster in 2015.

Here’s our prediction of the Top 10 Big Data and Analytics trends for 2015:

  1. Rise of Chief Data Officer and Data Scientist positions
  2. Cloud Computing and Cloud Data Warehouses will grow in adoption
  3. Data Visualization tools with advanced visualization capabilities will become very popular with business users
  4. Data Integration Hubs at most large and mid-scale enterprises will expand to include unstructured data and incorporate advanced and predictive analytics using machine learning and advanced capabilities such as Watson
  5. Integration between Enterprise Mobile Apps and Big Data technology will be on the rise
  6. Organizations will move from data lakes to processing data platforms; self-service Big Data with predictive analytics and advanced visualizations will go mainstream
  7. Companies and organizations will struggle to find data engineer and data scientist talent, and many legacy ETL/Integration and EDW resources will fill the gap
  8. The Internet of Things (IoT) will be the next critical focus for data/analytics services
  9. Data Providers will evolve rapidly into new business models
  10. Data Privacy and Security will gain momentum in implementation

While we can’t know for sure whether each of these things will come true, we do know that the world of big data is changing. It’s no longer just about having access to data and the ability to store it, but about the ability to achieve actionable results with data through predictive and prescriptive analytics. Only time will tell how this evolves, but if you aren’t leveraging data to compete and win, it’s time to get on board. The big data and analytics market will reach $125 billion worldwide in 2015, according to IDC, and if you are not investing or planning to invest, you are already behind your competitors or losing your spot as an A player.