Don’t just call it “Big Data”
Whenever a new idea or method of improving a company proves very successful, many people begin applying the term to everything so that their company will look up to date on the latest technology. Unfortunately, this has begun to occur with big data analysis as well. Several companies have started calling all of their data analysis "big data." Some do this simply because they don't know better; others do it because they want to appear even with their competitors.
Once a new idea is successful and everyone starts using it, the term begins to lose some of its meaning. This isn't because big data isn't a great tool; rather, so many misuse the terminology that many people end up overlooking the significance of real big data analysis.
Many experts have even started to turn against the term "big data" because of its misuse. I found a great article on this topic, and some of the expert statements are below:
- "Every so often a term becomes so beloved by media that it moves from 'instructive' to 'hackneyed' to 'worthless,' and Big Data is one of those terms…." – Roger Ehrenberg
- "Every product by every vendor supports big data… and every 'industry leader' with every talk needs to include the phrase in the title of their talk and repeat it as many times as possible. So every data warehouse pitch is rehashed as a big data pitch; every data governance, master data management, OLAP, data mining, everything is now big data." – Rob Klopp
- "Big data as a technological category is becoming an increasingly meaningless name." – Barry Devlin
You can read more from this article, which supports my point, at:
http://blogs.sas.com/content/corneroffice/2012/02/03/is-big-data-over-hyped/
It's the buzzword effect. Granted, it could be that they simply don't know that their data analysis isn't big data.
The best example I can think of is the "High-Definition" marketing that most people are still oblivious to. Manufacturers consider a picture with at least 720 lines of vertical resolution to be "HD". But what people perceive as HD isn't due to resolution; it's largely due to bitrate. Simply put, bitrate is the amount of video information delivered per second. Maybe you've noticed how some "HD" channels vary in quality. Fun fact: most of the time cable companies offer you more "FREE HD CHANNELS!", it's because they reallocated the bitrate, which in turn affects the overall HDness.
More bitrate = more HD goodness.
Couple that with high resolution and you have yourself every HDTV show floor.
One could say that big data has turned into what high definition has become: not in the sense of fooling customers, but in the sense of keeping face among your peers.
Interestingly, bitrate could represent the actual analysis of the data. This doesn't have much to do with the article, but it wraps up my tangent/metaphor quite well.
Any television can claim to be high definition, but it's their bitrates that matter.
Any company can claim to do big data analysis, but it's the information inferred that matters.
There's correlation in there somewhere.