The great Danish physicist Niels Bohr once observed that “prediction is very difficult, especially if it’s about the future.” Particularly in the ever-changing world of technology, today’s bold prediction is liable to prove tomorrow’s historical artifact. But thinking ahead about wide-ranging technology and market trends is a useful exercise for those of us engaged in the business of partnering with entrepreneurs and executives who are building the next great company.
Moreover, let’s face it: gazing into the crystal ball is a time-honored, end-of-year parlor game. And it’s fun.
So in the spirit of the season, I have identified five big data themes to watch in 2015. As a marketing term or industry description, big data is so omnipresent these days that it doesn’t mean much. But it is pretty clear that we are at a tipping point. The global scale of the Internet, the ubiquity of mobile devices, the ever-declining costs of cloud computing and storage, and an increasingly networked physical world create an explosion of data unlike anything we’ve seen before.
The creation of all of this data isn’t as interesting as the possible uses of it. I think 2015 may well be the year we start to see the true potential (and real risks) of how big data can transform our economy and our lives.
Big Data Terrorism
The recent Sony hacking case is notable because it appears to be the first state-sponsored act of cyber-terrorism in which a company has been successfully threatened under the glare of the national media. I’ll leave it to the pundits to argue whether Sony’s decision to postpone releasing an inane farce was prudent or cowardly. What’s interesting is that the cyber-terrorists instilled real fear in Sony by publicly releasing internal enterprise data, including salaries, email conversations and information about actual movies.
Every Fortune 2000 management team is now thinking: Is my data safe? What could happen if my company’s data is made public and how could my data be used against me? And of course, security software companies are investing in big data analytics to help companies better protect against future attacks.
Big Data Becomes a Civil Liberties Issue
Data-driven decision tools are no longer the sole domain of businesses; they are now helping Americans choose the school, doctor or employer that is best for them. Similarly, companies are using data-driven software to find and hire the best employees or to choose which customers to focus on.
But what happens when algorithms encroach on people’s privacy, their lifestyle choices and their health, and get used to make decisions based on their race, gender or age — even inadvertently? Our schools, companies and public institutions all have rules about privacy, fairness and anti-discrimination, with government enforcement as the backstop. Will privacy and consumer protection keep up with the fast-moving world of big data’s reach, especially as people become more aware of the potential encroachment on their privacy and civil liberties?
Open Government Data
Expect governments to continue making public-sector data more “liquid” and useful – and expect companies to put that data to creative use. The public sector is an important source of the data that private companies use in their products and services.
Take Climate Corporation, for instance. Open access to weather data powers the company’s insurance products and Internet software, which helps farmers manage risk and optimize their fields. Or take Zillow as another example. The successful real estate media site uses federal and local government data, including satellite photography, tax assessment data and economic statistics to provide potential buyers a more dynamic and informed view of the housing market.
Precision Medicine
Even as we engage in a vibrant discussion about the need for personal privacy, “big data” pushes the boundaries of what is possible in health care. Whether we label it “precision medicine” or “personalized medicine,” these two aligned trends — the digitization of the health care system and the introduction of wearable devices — are quietly revolutionizing health and wellness.
In the not-too-distant future, doctors will be able to create drugs and treatments tailored to your genome, your activity level and your actual health. After all, how the average patient generically reacts to a particular treatment regimen isn’t that relevant; I want the single best course of treatment (and outcome) for me.
Health IT is already a booming space for investment, but clinical decisions are still mostly based on guidelines, not on hard data. Big data analytics has the potential to disrupt the way we practice health care and change the way we think about our wellness.
Digital Learning, Everywhere
With over $1.2 trillion spent annually on public K-12 and higher education, and with student performance failing to meet the expectations of policy makers, educators and employers are still debating how to fix American education. Some reformers hope to apply market-based models, with an emphasis on testing, accountability and performance; others hope to elevate the teaching profession and trigger a renewed investment in schools and resources.
Both sides recognize that digital learning, inside and outside the classroom, is an unavoidable trend. From Massive Open Online Courses (MOOCs) to adaptive learning technologies that personalize the delivery of instructional material to the individual student, educational technology thrives on data. From names that you grew up with (McGraw Hill, Houghton Mifflin, Pearson) to some you didn’t (Cengage, Amplify), companies are making bold investments in digital products that do more than just push content online; they’re touting products that fundamentally change how and when students learn and how instructors evaluate individual student progress and aid their development. Expect more from this sector in 2015.
Now that we’ve moved past mere adoption to implementation and utilization, 2015 will undoubtedly be big data’s break-out year.
The Single Most Terrifying Trend Facing Google
Two and a half years ago we wrote a post headlined “Forget Apple, Forget Facebook: Here’s The One Company That Actually Terrifies Google Execs.”
That company? Amazon.
Google is a search company, but the searches it makes money from are the searches people do before they are about to buy something online.
These commercial searches make up about 20% of total Google searches. Those searches are where the ads are.
Two and a half years ago we wrote, “What Googlers worry about in private is a growing trend among consumers to skip Google altogether, and to just go ahead and search for the product they would like to buy on Amazon.com, or, on mobile in an Amazon app.”
We noted that, according to ComScore, “the trend is real.” Searches on Amazon.com were up 73% year over year.
Well, we checked back with ComScore recently, and the news remains bad for Google. Desktop search queries on Amazon increased 47% between September 2013 and September 2014, according to ComScore.
Even worse for Google, that number doesn’t tell the whole story.
In the past two and a half years, the number of mobile internet users surpassed desktop internet users.
[Chart: desktop vs. mobile internet users, 2014 (ComScore)]
On mobile, using Google as a starting point when you want to buy something makes even less sense.
Think about it. Why go through these steps?
Open your web browser on your phone.
Google search “bike gloves.”
Scan some text links.
Click on a link to go to a product page at some e-commerce store.
Click to add the item to your cart.
Input your credit-card info.
Type in your address.
Select the shipping preferences you want to pay for.
When you can just …
Open the Amazon app on your phone.
Search “bike gloves.”
Click one button to buy the product with your usual credit card, and have it shipped to your usual address free.
Two and a half years ago, we wrote that Google’s Amazon nightmare would get scarier if Amazon’s Kindle Fire tablets and (rumored) phones ever got wide adoption.
That hasn’t happened yet. Kindle Fire sales are pretty bad. But earlier this month, Amazon CEO Jeff Bezos made it clear in an onstage interview at our BI Ignition conference that he’s not giving up on the project.
Bad news for Google execs trying to get eight hours a night.
Nicholas Carlson is the author of “Marissa Mayer and the Fight to Save Yahoo!”
From natural language processing to image recognition, a variety of technologies, each suited to different purposes, make up today’s smart-machines landscape, Thomas H. Davenport writes in CIO Journal. Perhaps someday we will have a system to recommend the best smart-machine technology for a desired application. Until then, “smart humans” have a role to play.
But what specific technologies are we dealing with here, and what the heck do we call them? “Artificial intelligence” has been bandied about for a while, and it’s become an umbrella term for a lot of different tools. Perhaps its only problem is that it’s old, and we have heard so many times of its rise (and fall) that many have grown weary and skeptical of the term.
The newest umbrella term is “cognitive computing,” suggesting that we are finally developing computers that can mimic the human brain. The only problem with this term comes when you have an extended conversation with a neuroscientist, and you realize that we still know very little about how the brain works. There are plenty of articles attesting to our ignorance. So to refer to these smart machines as examples of cognitive computing is not really very smart.
The truth is that there are a variety of technologies that comprise the current landscape of smart machines (the overview term that I find least problematic). Each is suited to a different set of purposes, and it’s pretty rare that more than one is integrated in a particular application. Unlike the human brain, which can perform a variety of cognitive tasks, a particular computer system can typically do only one type of task. Here’s an incomplete alphabetical list of the systems that can undertake cognitive tasks:
Analytics—Tools that do mathematical or statistical analysis on structured data—typically numbers. These have grown in power and sophistication over recent years, and can now be used to drive a variety of decision types. They are also increasingly embedded in other systems and business processes.
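As a minimal sketch of analytics driving a decision, the following uses only the Python standard library; the sales figures and the one-standard-deviation threshold are illustrative, not drawn from any real system:

```python
# Descriptive analytics on structured data: summarize a numeric series
# and use a simple statistical threshold to drive a decision.
import statistics

monthly_sales = [102.0, 98.5, 110.2, 95.1, 120.4, 101.3]  # hypothetical data

mean = statistics.mean(monthly_sales)
stdev = statistics.stdev(monthly_sales)

# Decision embedded in a process: flag any month more than one standard
# deviation above the mean as unusually strong.
strong_months = [s for s in monthly_sales if s > mean + stdev]
print(f"mean={mean:.1f}, stdev={stdev:.1f}, strong={strong_months}")
```

In practice, analysis like this is increasingly embedded inside other systems rather than run by hand, which is the point the entry above makes.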
Complex event processing—This type of system takes as inputs a variety of real-time data sources and types about events, and then combines them to determine their significance or to take action. It takes in data, transforms it as necessary, analyzes it to detect trends and patterns, and takes necessary actions. CEP systems are widely used in algorithmic stock trading, in which a system might monitor a variety of economic and social indicators, and then determine that a stock trade would be economically beneficial. CEP is also used to detect credit card fraud—ideally before it is successfully committed.
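The CEP pattern described above can be sketched as: consume a stream of events, combine them, and act when a combination becomes significant. The card swipes, cities and one-hour rule below are entirely hypothetical:

```python
# Toy complex-event-processing sketch: flag a card when two swipes
# occur in different cities less than an hour apart.
events = [  # (card_id, city, timestamp_in_minutes)
    ("card-1", "Boston", 0),
    ("card-2", "Chicago", 10),
    ("card-1", "Seattle", 30),   # 30 min after a Boston swipe: suspicious
    ("card-2", "Chicago", 90),
]

last_seen = {}   # card_id -> (city, time) of the most recent swipe
flagged = set()

for card, city, t in events:
    if card in last_seen:
        prev_city, prev_t = last_seen[card]
        if prev_city != city and t - prev_t < 60:
            flagged.add(card)   # pattern detected: take action
    last_seen[card] = (city, t)

print(sorted(flagged))
```

Production CEP engines do the same thing at much higher event rates and with far richer pattern languages, but the input-combine-act loop is the same.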
Image recognition—Early systems for recognizing images were quite limited. But now that computers have become a lot more powerful, they can identify more complex images, including specific faces and types of animals. Google was able, for example, to build a complex image-recognition system that could identify cats in videos.
Machine learning/neural networks—These are somewhat automated approaches to analytics. They create models to fit data and improve as they learn. There are various forms of machine learning approaches, including neural networks, Bayesian classifiers, decision trees, support vector machines, and so forth. The differences between these are generally perceptible only to specialists.
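As a toy illustration of “creating a model to fit data,” here is a nearest-centroid classifier, one of the simplest learning approaches; the points and labels are made up for the example:

```python
# Nearest-centroid classification: "fit" computes one mean point per
# class; "predict" labels a new point by its closest class centroid.
def fit(points, labels):
    centroids = {}
    for label in set(labels):
        members = [p for p, l in zip(points, labels) if l == label]
        # Average each coordinate across the class's members.
        centroids[label] = tuple(sum(c) / len(members) for c in zip(*members))
    return centroids

def predict(centroids, point):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda lbl: dist2(centroids[lbl], point))

points = [(1.0, 1.0), (1.2, 0.8), (5.0, 5.0), (4.8, 5.2)]
labels = ["low", "low", "high", "high"]

model = fit(points, labels)
print(predict(model, (4.5, 4.9)))
```

Neural networks, Bayesian classifiers and support vector machines all follow the same fit-then-predict shape; as the entry above notes, the differences among them matter mostly to specialists.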
Natural language processing—These tools take text or speech as input, and increasingly can “read” or extract meaning from it. IBM Corp.’s original Watson falls into this category, as does Apple Inc.’s Siri. Since much of human experience is represented in language, this is a powerful category, and the one most likely to be described as “cognitive computing.”
Rules and business rules—Rules express logic in a structured language—typically an “if/then” structure. They were the primary programming approach for so-called “expert systems,” a branch of artificial intelligence. Business rules express operational approaches to business in this structure. A business rule might specify how customers are to be treated (e.g., a customer returning an item doesn’t have to go through a credit check), or when certain quantitative thresholds are reached (e.g., a mortgage loan can be given if the loan-to-value ratio for the house in question is less than 80%).
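A business-rules engine can be sketched as a list of condition/action pairs evaluated in order; the field names and the loan-to-value threshold below are illustrative, not any lender’s actual policy:

```python
# A minimal if/then rules engine: each rule pairs a condition (a
# predicate over a case) with the action to take when it matches.
rules = [
    (lambda c: c.get("type") == "return",
     "skip credit check"),
    (lambda c: c.get("type") == "mortgage" and c.get("ltv", 1.0) < 0.80,
     "approve loan"),
]

def evaluate(case):
    # Fire the first rule whose condition matches the case.
    for condition, action in rules:
        if condition(case):
            return action
    return "refer to a human"

print(evaluate({"type": "mortgage", "ltv": 0.65}))
```

The appeal of this structure is that the logic stays legible to business people, not just programmers, which is why expert systems were built this way.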
These and other categories of smart machines can now address almost any topic on which there is data or recorded expertise. They’re all useful, but they’re not universally useful. Since each tool is suited only to a particular purpose, managers increasingly need to be familiar with the tools and how they fit particular applications. If you’ve got a problem that can be structured in a set of rules, you don’t want to hire Watson for that job. If you have a bunch of data in rows and columns, a rules engine won’t help you much.
One thing that human brains—at least some of them—are good at is seeing the big picture. That can include looking over the variety of technology alternatives for cognitively-oriented situations, and selecting the right one. We could probably have a system—I am envisioning a set of rules—that asks the business user a set of questions about the desired application, and then recommends a particular technology. But we don’t have that yet. So in the short run we will have to rely not on smart machines, but on smart humans, for this purpose.
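The recommender Davenport envisions could itself be a small rule set. The questions and technology mappings below are my own illustrative guesses at what such rules might look like, not an established taxonomy:

```python
# A toy rule-based recommender: ask about the application, suggest a
# smart-machine technology from the categories discussed above.
def recommend(answers):
    if answers.get("input") == "text":
        return "natural language processing"
    if answers.get("input") == "images":
        return "image recognition"
    if answers.get("real_time_events"):
        return "complex event processing"
    if answers.get("expert_logic_known"):
        return "business rules"
    if answers.get("structured_data"):
        return "machine learning / analytics"
    return "ask a smart human"

print(recommend({"input": "text"}))
print(recommend({"structured_data": True}))
```

Even a sketch this crude shows why the selection problem is tractable: the categories really do map to distinct kinds of input and purpose.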
Thomas H. Davenport is a Distinguished Professor at Babson College, a Research Fellow at the Center for Digital Business, Director of Research at the International Institute for Analytics, and a Senior Advisor to Deloitte Analytics.