We developed a “generic” custom big data software solution offering a wide range of data collection configuration options. These options supported legal web scraping gated by website approval keys, similar to how Google Webmaster Tools verifies site ownership, as well as REST interfaces for posting name/value pairs with pass-through authentication. Data could also be collected via form posts, flat files such as .CSV or .XLS, SQL data transfers (via SQL SSMS user access), and various types of APIs. The SQL database structure was proprietary, utilizing SQLCLR to ensure that tables, queries, design details, and working secrets were not exposed.
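The approval-key idea can be sketched roughly as follows. This is an illustrative assumption only, not the actual implementation: the HMAC scheme, the `SECRET` value, and the function names are all hypothetical, but they show how a server-issued key published by a site owner can be checked before any scraping or posted data is accepted.

```python
import hmac
import hashlib

# Hypothetical server-side secret; in practice this would be stored securely.
SECRET = b"server-side-secret"

def issue_key(site_domain: str) -> str:
    """Derive an approval key that the site owner publishes
    (e.g. in a meta tag or a well-known file) to prove ownership."""
    return hmac.new(SECRET, site_domain.encode(), hashlib.sha256).hexdigest()

def verify_key(site_domain: str, presented_key: str) -> bool:
    """Accept a collection request only if the presented key matches
    the key derived for that domain (constant-time comparison)."""
    return hmac.compare_digest(issue_key(site_domain), presented_key)

key = issue_key("example.com")
print(verify_key("example.com", key))  # True: key matches the domain
print(verify_key("other.com", key))    # False: key was issued for a different domain
```

Deriving the key from the domain keeps the check stateless: no per-site key table is needed, only the server-side secret.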
We created various Windows services to perform unattended, automated data collection. These services processed the high-velocity, complex, and variable data, refining it through three levels of analytical phases (unstructured data analytics). Once the data was prepared, custom software services performed calculations such as means and averages, with high and low value thresholds optionally flagged as ignores. Upon completion of this step, custom views were populated, aided by solution-specific modules. Views supported various forms of filtering (date range, regional, type, etc.). In summary, this “generic” custom big data solution provided in-depth automated analysis of disparate yet related data, using advanced blending algorithms and rules to identify trends, medians, probabilities, and predictive insights. Note: Campbell Software was chosen to develop this solution due to our extensive SQL, analytics, and business reporting design experience. This was a mid-size project (six figures).
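The calculation step described above, with out-of-range values optionally flagged as ignores, might look like the following minimal sketch. The function name, parameters, and return shape are assumptions for illustration; the actual services were proprietary:

```python
from statistics import mean, median

def summarize(values, low=None, high=None, drop_flagged=True):
    """Compute summary statistics over a series, flagging values outside
    the optional low/high thresholds and (by default) ignoring them."""
    def in_range(v):
        return (low is None or v >= low) and (high is None or v <= high)

    # Values outside the thresholds are flagged as "ignores".
    flagged = [v for v in values if not in_range(v)]
    kept = [v for v in values if in_range(v)] if drop_flagged else list(values)
    return {"mean": mean(kept), "median": median(kept), "flagged": flagged}

stats = summarize([10, 12, 11, 95, 9], high=50)
# flagged -> [95]; mean and median are computed over [10, 12, 11, 9]
print(stats)
```

Flagging rather than silently dropping the outliers preserves them for the downstream views, where a filter can decide whether to show or hide them.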