Did you know that this year the amount of global digital data generated is expected to reach about 2.7 zettabytes? That number represents a 48 percent increase from 2011. Most people have trouble comprehending the size of a terabyte, let alone a zettabyte. But those of us in the biz are finding...
In the financial services industry, thousands of users regularly rely on fast, accurate access to data in order to run business-critical reports and amend databases. That’s why it’s imperative to business success and user efficiency that financial institutions have...
The concepts of Big Data were around well before the introduction of the Hadoop file system. Back when I was in college (ancient times, according to my family) and pursuing my degree in astrophysics, I wrote a thesis on ‘stellar spectroscopy’ – researching the spectrum of data from...
There have been many dramatic improvements in hardware in recent years. Factors such as processor speed and design and network capacity are making shortages of hardware resources feel very pre-2011. Not to mention falling prices that don’t appear to have hit bottom just yet. But with improving...
Over the years, companies can build up some pretty junky architecture, cobbled together from a combination of existing bulk-load tools such as the BCP Utility, dbload or FTP servers. Not to mention that compressing, FTP-ing, converting, shredding, parsing, optimizing, loading and validating bulk data is...