Generally speaking, to understand how Big Data works, it is a good idea to know the history and definition of the term and what it comprises.  Even then, it is probably a good rule of thumb to use the word sparingly: demystify and uncomplicate the message, and keep your definition clear.  But since it is a buzzword, and because it is a solution centered on data and the cloud, I have included it on this site.

Big Data and Its History

The idea of gathering, storing, and analyzing large amounts of data is not new; it is a task that had been going on long before 2001, when analyst Doug Laney described Big Data in terms of the three Vs: Volume, Velocity, and Variety.


Volume is the amount of data from a variety of sources, including business transactions, social media, and sensor or machine-to-machine data.  Storage systems of the past could not handle these sheer volumes, and storing the data was simply not economically feasible.


Velocity refers to the unprecedented speed at which data streams arrive today and the promptness with which they must be handled.  Streams such as RFID tags, smart meters, video, and images require collection, cleaning, and staging for near-real-time use.
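The collect-clean-stage flow described above can be sketched in a few lines. This is a minimal illustration, assuming a simple in-memory stream of smart-meter readings; the record format, field names, and validity rule are all hypothetical, not taken from any real system.

```python
# Hypothetical stream of smart-meter readings (illustrative data only).
def readings():
    yield {"meter": "A1", "kwh": 1.2}
    yield {"meter": "A1", "kwh": -5.0}   # invalid sensor value
    yield {"meter": "B7", "kwh": 0.8}

def clean(stream):
    # "Cleaning": drop records that fail a basic validity check
    # before they are staged for downstream use.
    for record in stream:
        if record["kwh"] >= 0:
            yield record

# "Staging": materialize the cleaned stream for near-real-time use.
staged = list(clean(readings()))
print(len(staged))  # the invalid reading has been dropped
```

In a real pipeline the generator would be replaced by a message queue or stream processor, but the shape of the work, validate each record as it arrives and stage only the good ones, is the same.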


Keep the three Vs in mind: Big Data is the sheer volume of data, whether structured, semi-structured, or unstructured, captured at velocity and in variety, then cleansed and staged for use.  But it is not the amount of data that is important here.  What an organization does with the data is the focus.  Organizations can collect data and analyze it to find cost reductions and other opportunities.

Big Data Importance

The importance of Big Data lies in probing it for insights that help an organization make better decisions and improve its strategy.

Types of Data

Structured Data

Structured data conforms to a schema that defines the fields of the data and how the data is managed and maintained.  Data types include numeric, currency, alphabetic, name, date, and address.  Data governance determines the schema and the restrictions on data input for structured data.  This is where data quality comes into the picture: avoiding redundancy and dealing with deduplication.  Data quality is critical to Business Intelligence and Business Analytics.
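A short sketch of what a schema and its governance rules look like in practice. The table name, fields, and constraint here are hypothetical examples, using SQLite only because it ships with Python; any relational database would work the same way.

```python
import sqlite3

# Hypothetical "customers" schema: field names, data types, and
# constraints are declared up front, as data governance requires.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        email       TEXT NOT NULL UNIQUE,  -- UNIQUE guards against duplicates
        balance     REAL DEFAULT 0.0
    )
""")
conn.execute("INSERT INTO customers (name, email) VALUES (?, ?)",
             ("Ada Lovelace", "ada@example.com"))
conn.commit()

# The UNIQUE constraint enforces one aspect of data quality (deduplication):
try:
    conn.execute("INSERT INTO customers (name, email) VALUES (?, ?)",
                 ("A. Lovelace", "ada@example.com"))
except sqlite3.IntegrityError:
    print("duplicate rejected")
```

The second insert is rejected by the database itself, which is the point: with structured data, quality rules can live in the schema rather than in every application that touches the data.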

Unstructured Data

Unstructured data is defined as any data that is not easily classified or does not fit into the schema of structured data.  Some examples of unstructured data include photos, video, satellite imagery, mobile data, internal chat logs, internet logs, website data, social media, and the list goes on.
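Because unstructured data has no schema, any structure must be extracted after the fact. A minimal sketch, assuming a made-up web-server log line; the log format and the fields pulled out are illustrative, not from any particular server.

```python
import re

# Hypothetical log line: free text with no declared schema.
log_line = '203.0.113.7 - - [21/Dec/2015:10:15:32] "GET /index.html HTTP/1.1" 200'

# Extract a few fields of interest with a pattern written for this format.
match = re.search(r'^(\S+) .* "(\w+) (\S+) [^"]*" (\d{3})', log_line)
if match:
    ip, method, path, status = match.groups()
    print(ip, method, path, status)
```

This is the essential contrast with structured data: here the "schema" exists only in the extraction code, and a log line in a slightly different format would silently fail to match.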

Big Data Experts in DFW Meet Here

In the Dallas-Fort Worth Metroplex

Come join us for discussions, presentations, and networking on use cases, design, technologies, and all things NoSQL and Big Data! The group also serves as a HUG (Hadoop Users Group) for DFW.

