
    Cognitive Computing: What, Why and Where is it trending?

    In today’s world, cognitive computing is a buzzword encompassing a plethora of technological approaches that automatically extract concepts and relationships from data, understand their meaning, and learn independently.

    What is Cognitive Computing?

    The term ‘Cognitive Computing’ was coined by IBM researchers while working on Watson for the ‘Jeopardy!’ challenge. Cognitive computing involves a computational model that is made to learn a problem and simulate human thinking by training it on data relevant to that problem. By training on data with labeled outputs, the model or network learns a set of parameters, or weights. The fully trained model can then be deployed to make inferences, using the learned weights for operations such as classification, detection, and more. Machine Learning (ML) and Deep Learning (DL) algorithms are used to create these models. Neural networks, the building blocks of deep learning, can be considered the cognitive system’s brain. These Artificial Intelligence (AI) based systems, a combination of computer science and cognitive science, extract insights from data and solve problems to optimize human workflows, using self-learning algorithms that rely on data mining, visual recognition, speech recognition, natural language processing, sentiment analysis, etc.
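
    As an illustration of the train-then-infer workflow described above, the following Python sketch trains a small neural network on the public MNIST digit dataset using Keras; the layer sizes and epoch count are arbitrary choices for the example, not a prescription.

        from tensorflow import keras

        # Labeled data: images (inputs) paired with digit labels (outputs).
        (x_train, y_train), (x_test, y_test) = keras.datasets.mnist.load_data()
        x_train, x_test = x_train / 255.0, x_test / 255.0  # scale pixels to [0, 1]

        # The "weights" mentioned above live inside these layers.
        model = keras.Sequential([
            keras.layers.Flatten(input_shape=(28, 28)),
            keras.layers.Dense(128, activation="relu"),
            keras.layers.Dense(10, activation="softmax"),  # one score per digit class
        ])
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])

        model.fit(x_train, y_train, epochs=2)     # training adjusts the weights
        predictions = model.predict(x_test[:5])   # inference reuses the learned weights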

    In recent years, cognitive computing has been evolving with uninterrupted momentum, primarily for three reasons.

    • Internet and data abundance: Exabytes of data in various formats like text, images, video, and audio, strewn across the Internet, make a very robust training source. Even the human brain, which inherits part of its ability to categorize things genetically, learns by seeing things many times. As Kevin Kelly aptly put it, “Massive databases, self-tracking, web cookies, online footprints, terabytes of storage, decades of search results, Wikipedia, and the entire digital universe became the teachers making AI smart.”
    • Parallel computing and cheap hardware: Training deep networks with many layers requires huge processing power. This computational power is available in the form of GPUs, familiar as the special chips widely used in PCs and video game consoles. GPUs can speed up the training of neural networks nearly a hundredfold (a minimal device-selection sketch follows this list). NVIDIA realized this potential and created GPUs such as the TITAN for desktop applications, the DGX-1 and Tesla series as data center solutions, and the Jetson TX1, TX2, and DRIVE PX series for embedded application development. GPU-accelerated cloud services such as Amazon Web Services, Microsoft Azure, Google Cloud, and IBM Cloud have also gained popularity.
    • New and better algorithms: Reliable and robust algorithms extract insights and relations from the data avalanche. New mathematical models for building networks continue to be introduced, while universities worldwide keep researching more robust networks to improve accuracy and performance.
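
    The device-selection sketch referenced above, in PyTorch: the same network trains on a CPU or a GPU, and only the device handle changes. The layer sizes and input batch are synthetic stand-ins.

        import torch
        import torch.nn as nn

        # Use a GPU when one is present; otherwise fall back to the CPU.
        device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

        model = nn.Sequential(nn.Linear(1024, 256), nn.ReLU(),
                              nn.Linear(256, 10)).to(device)
        batch = torch.randn(64, 1024, device=device)  # synthetic stand-in batch

        loss = model(batch).sum()
        loss.backward()  # gradients are computed on the GPU when one is available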

    Tools & Technologies

    In order to create a flourishing environment for cognitive computing, software frameworks and toolkits for structured and unstructured data are very important. High-performance computing hardware, flexible interfaces, Big Data and cloud integration, and rapid model deployment are the other requisite factors. As a result, toolkits like IBM Watson, Google DeepMind, and Microsoft Cognitive Services are gaining universal attention from developers for providing an ecosystem in which to develop cognitive computing applications. Technologies such as Elasticsearch, NoSQL, Hadoop, Kafka, and Spark also form part of the cognitive system, executing on various cloud platforms and handling both dynamic real-time data and static historical data.
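
    As a sketch of how such a system might ingest dynamic real-time data, the snippet below consumes JSON records from a Kafka topic using the kafka-python client; the topic name and broker address are assumptions for illustration only.

        import json
        from kafka import KafkaConsumer

        consumer = KafkaConsumer(
            "sensor-readings",                   # hypothetical topic name
            bootstrap_servers="localhost:9092",  # hypothetical broker address
            value_deserializer=lambda v: json.loads(v.decode("utf-8")),
        )

        for message in consumer:
            reading = message.value
            # Hand each record to the model-scoring or storage layer here.
            print(reading)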

    As growing data volumes lead to increased computing workloads, the software ecosystem must be able to leverage the power of distributed computing and heterogeneous computing based on CPUs, GPUs, DSPs, and FPGAs. Practically, ML software must be able to work with distributed data and also distribute its workload over many machines so that it can scale out as needed. It should also have a flexible interface for integrating various analytic languages like R, Python, and Scala, so that business users can visualize predictive performance and model characteristics.
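
    A minimal sketch of distributing an ML workload over a cluster, using Spark’s Python interface (PySpark) and its MLlib library; the input path and column names here are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.ml.feature import VectorAssembler
        from pyspark.ml.classification import LogisticRegression

        spark = SparkSession.builder.appName("distributed-training-sketch").getOrCreate()

        # Hypothetical labeled dataset; Spark partitions it across the cluster.
        df = spark.read.csv("hdfs:///data/training.csv", header=True, inferSchema=True)

        assembler = VectorAssembler(inputCols=["f1", "f2", "f3"], outputCol="features")
        train = assembler.transform(df)

        # The fit runs across the cluster's executors, not a single machine.
        model = LogisticRegression(featuresCol="features", labelCol="label").fit(train)
        model.transform(train).select("label", "prediction").show(5)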

    Use case scenarios

    Since the data from which computational models learn and infer usually comes in the form of sound, time series, text, images, and video, DL networks can efficiently identify patterns in unstructured data. Audio data, for instance, is used for voice recognition and voice search applications.
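
    As an example, a few lines with the SpeechRecognition library turn a recorded audio file into text; the file name is a placeholder, and the call delegates transcription to Google’s free web speech API.

        import speech_recognition as sr

        recognizer = sr.Recognizer()
        with sr.AudioFile("sample.wav") as source:  # placeholder audio file
            audio = recognizer.record(source)       # read the whole file

        # Delegates transcription to Google's free web speech API.
        print(recognizer.recognize_google(audio))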

    Based on time-series data, various DL applications can be developed for log analysis and risk detection in data centers, security, and finance, and for enterprise resource planning in manufacturing and supply chains. Analysis of textual data using deep learning can be used for sentiment analysis, augmented search, theme detection, threat detection, and fraud detection. These find application in the CRM, social media, finance, government, and insurance sectors.
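
    A minimal sentiment-analysis sketch with scikit-learn (a classical ML pipeline rather than a deep network, but the same train-then-predict idea); the tiny hand-labeled corpus is invented purely for illustration.

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        # Toy hand-labeled corpus standing in for real review data.
        texts = ["great product, works well", "terrible support, very slow",
                 "love the battery life", "broke after one week"]
        labels = [1, 0, 1, 0]  # 1 = positive, 0 = negative

        clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
        clf.fit(texts, labels)

        print(clf.predict(["the device is great"]))  # likely [1] on this toy corpus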

    End-to-end solutions leverage cognitive computing and big data analytics to expedite business process lifecycles through automation and simplification of a host of activities that currently involve complex decision-making. Conversational software robots embedded in such solutions process data from diverse sources, delivering actionable insights to enterprise users. These self-learning bots evolve continuously by assimilating precedents, thus making troubleshooting and automation more targeted and accurate over time.

    Computer vision examples include face detection, object detection, machine vision, and photo clustering, widely used in automotive, healthcare, social media, and consumer electronics. Video applications include motion detection and real-time threat detection, applied in the surveillance and security, aviation, drone, robotics, and automotive domains.
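
    As a concrete face-detection example, the classical Haar-cascade detector that ships with OpenCV can be run in a few lines; “photo.jpg” is a placeholder input image.

        import cv2

        # Haar-cascade face model bundled with OpenCV.
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

        img = cv2.imread("photo.jpg")                 # placeholder input image
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)  # detector works on grayscale
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

        for (x, y, w, h) in faces:                    # box each detected face
            cv2.rectangle(img, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.imwrite("faces.jpg", img)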

    Organizations having large data centers and computer networks can use data mining on log files to detect any threats. Similarly, vehicle manufacturers and fleet operators use deep learning to mine sensor data to predict part and vehicle failure. Companies with large and complex supply chains can use this technology to predict delays and bottlenecks in production. Machine intelligence systems can be used to analyze phenotypic and genetic images. Cognitive strategy and insights can be applied to the claims process to provide claims reviewers with greater insight into each case.
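
    A sketch of the kind of anomaly detection such threat- and failure-prediction systems rely on, using scikit-learn’s IsolationForest on synthetic sensor readings invented for the example.

        import numpy as np
        from sklearn.ensemble import IsolationForest

        # Synthetic sensor readings: mostly normal values plus a few injected outliers.
        rng = np.random.default_rng(0)
        normal = rng.normal(loc=50.0, scale=2.0, size=(500, 1))
        outliers = np.array([[80.0], [15.0], [95.0]])
        readings = np.vstack([normal, outliers])

        detector = IsolationForest(contamination=0.01, random_state=0).fit(readings)
        flags = detector.predict(readings)  # -1 marks suspected anomalies
        print(readings[flags == -1].ravel())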

    Challenges

    The transparency and trustworthiness of data-driven decision-making is the primary challenge of cognitive computing. Because results depend on the quality and quantity of available training data, private and public data must often be combined, and this should be done without loss of confidentiality and while honoring privacy. To harness the full power of cognitive computing, enterprises need to eliminate functional silos and make data available across the enterprise for machines to interact with and learn from.

    ELE Times Bureau
    https://www.eletimes.com