
Cloud Computing Expo raises awareness of change


This week’s Cloud Computing Expo in New York isn’t among the IT industry’s largest gatherings, but it’s one of the most important. With nary a CFO or CIO among the attendees, this is a gathering aimed squarely at those of you working down in the trenches of cloud application development, testing, and operations. Walk into almost any educational session and you’ll hear about IoT, Industrial IoT, storage (and how to deal with huge amounts of it), testing, microservices, and containers.

One shift we are likely to see over the next few years is toward cognitive computing, where the nature of the data dictates how it should be handled. For decades, we’ve designed and built applications that process data in predetermined ways, based on our expectations of what the data looks like or should look like. What I’ve learned is that we are in a new era, where forces like social media, text messaging, multimedia, and sensor readings deliver data to applications that is not just unstructured but wildly variable in content and format. Cognitive software needs to be ready when the data says, “here’s what you need to do with me.” That is the essence of cognitive computing: the data directs how the application should function. Traditional applications, by contrast, are built on the past and cannot anticipate the future.

As cloud consultant Judith Hurwitz put it in her presentation, “cognitive computing is a problem-solving approach that uses hardware or software to approximate the form or function of natural cognitive processes.” In practice, that means designing systems around the data and letting the data lead the logic. Such systems are built to morph as they learn from the ever-changing patterns in the data. And here’s a good one: cognitive systems assume there is not just one correct answer; they are probabilistic in nature, forming hypotheses based on the data itself. At its core, this is machine learning, where a system’s performance improves through exposure to patterns in the data rather than through explicit program code.
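To make that concrete, here is a minimal Python sketch (the class name, labels, and sample snippets are all invented for illustration; this is not anything Hurwitz showed). Nothing in the code states a rule about any particular kind of data: the behavior comes from the snippets it ingests, and the answer comes back as weighted hypotheses rather than a single hard-coded result.

from collections import Counter

# Toy "data leads the logic" classifier: no topic-specific rules are written here.
# Whatever behavior emerges comes entirely from the labeled snippets it ingests.
class DataLedClassifier:
    def __init__(self):
        self.word_counts = {}        # label -> Counter of words seen with that label
        self.doc_counts = Counter()  # label -> number of snippets seen
        self.vocabulary = set()

    def ingest(self, text, label):
        """Update the model as each new labeled snippet arrives."""
        words = text.lower().split()
        self.word_counts.setdefault(label, Counter()).update(words)
        self.doc_counts[label] += 1
        self.vocabulary.update(words)

    def hypotheses(self, text):
        """Return a probability for every label instead of one fixed answer."""
        words = text.lower().split()
        scores = {}
        for label, counts in self.word_counts.items():
            total = sum(counts.values())
            score = self.doc_counts[label] / sum(self.doc_counts.values())
            for word in words:
                # Laplace-smoothed estimate of how likely this word is under the label
                score *= (counts[word] + 1) / (total + len(self.vocabulary))
            scores[label] = score
        norm = sum(scores.values())
        return {label: round(s / norm, 3) for label, s in scores.items()}

model = DataLedClassifier()
model.ingest("temperature sensor reading out of range", "sensor alert")
model.ingest("vibration sensor spiked overnight", "sensor alert")
model.ingest("customer tweet praising the new mobile app", "social chatter")
model.ingest("angry forum post about checkout downtime", "social chatter")

# The same code gives different answers as different data is ingested.
print(model.hypotheses("sensor reading spiked during the night"))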

One thing I found interesting is that several presentations at Cloud Computing Expo touched on this idea of cognitive computing without ever using that particular label. As one presenter said, when you have an influx of text-based data, the processing applications must be able to differentiate between “feet” as a unit of length and “feet” as those things at the ends of your legs. Same word, different meaning. Cognitive systems can learn from such patterns, usages, and anomalies, and they can morph or evolve as more data is ingested and analyzed.
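The “feet” example lends itself to the same treatment. Here is a rough sketch, again with invented sentences and labels (scikit-learn is assumed purely for brevity; the presenter named no tools), showing how a model trained only on surrounding context words can learn to separate the two senses.

# Disambiguate "feet" (unit of length vs. anatomy) from surrounding context words.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Invented training sentences; in practice these would come from ingested text.
sentences = [
    "the cable run is thirty feet long",
    "the rack sits six feet from the wall",
    "my feet hurt after walking the expo floor",
    "he was back on his feet after the long flight",
]
senses = ["length", "length", "anatomy", "anatomy"]

disambiguator = make_pipeline(CountVectorizer(), MultinomialNB())
disambiguator.fit(sentences, senses)

print(disambiguator.predict(["the antenna is forty feet tall"])[0])   # -> length
print(disambiguator.predict(["she stayed on her feet all day"])[0])   # -> anatomy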

For the applications that you develop, does this thinking make sense? Are you building cognitive facets into your work so that you can do something, anything, with data that seemingly has no historical antecedents? Share your thoughts about the rapidly advancing field of cognitive computing. We’d like to hear from you.
