SAN MATEO, Calif.--Richard Sherman, founder of Athena IT Solutions, which provides business intelligence, data integration and data warehouse consulting and training, has published two new white papers describing a new generation of analytical sandboxes and analytical hubs and their emerging role in enabling agile analytics.
The white papers, sponsored by Composite Software, are a follow-on to Sherman’s previously published “A Better Way to Fuel Analytical Needs,” in which Sherman writes that the potential value of business analytics and the amount of data to fuel it are expanding significantly, yet business intelligence and data-integration backlogs are constraining enterprises attempting to tap that value.
Sherman believes that two architectural frameworks, analytical sandboxes and analytical hubs, form the foundation for the kind of agile, self-service data integration required when developing new analytics. The new papers focus on the specific business needs and technology solutions—including data virtualization—for implementing analytical sandboxes and hubs. “Analytics Best Practices: The Analytical Sandbox” and “Analytics Best Practices: The Analytical Hub” are the latest additions to the Composite Software Data Virtualization Leadership Series™, a forum for articles and webcasts by leading IT industry analysts and advanced data virtualization practitioners.
Sherman, who has more than 25 years of experience in BI solutions and is an author and frequent speaker on IT topics, writes that because of the constant influx of data and the ever-changing business environment, business analysts need to access increasingly diverse data sets, both inside and outside their organizations. The answer, Sherman asserts, is analytical sandboxes and hubs, a new paradigm that addresses the multiple-query challenges of situational business analytics and provides enterprise-scale processing, storage and networking capabilities while avoiding the pitfalls of makeshift data shadow systems.
According to Sherman, “Data-integration capability will expand beyond traditional ETL to include data virtualization, which enables organizations to expand the data used in their analysis without requiring that it be physically integrated. Companies do not have to get IT involved (via business requirements, data modeling, ETL and BI design) every time data needs to be added. This iterative and agile approach supports data discovery more productively for both business and IT.”
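The iterative approach Sherman describes can be illustrated with a toy sketch (hypothetical data and function names; this is not Composite’s platform or API): a second source is folded into the analysis by joining it at query time, with no ETL cycle and no physical copy into a warehouse.

```python
# Toy illustration of the data-virtualization idea: a virtual,
# on-demand join across two independent sources, with no physical
# replication or staging. All data and names are hypothetical.

# Source 1: customer records, e.g. from a CRM system
crm_customers = [
    {"customer_id": 1, "name": "Acme Corp", "region": "West"},
    {"customer_id": 2, "name": "Globex", "region": "East"},
]

# Source 2: order totals, e.g. from an ERP system
erp_orders = [
    {"customer_id": 1, "total": 1200.0},
    {"customer_id": 2, "total": 450.0},
    {"customer_id": 1, "total": 300.0},
]

def virtual_revenue_by_region():
    """Join the two sources at query time; nothing is copied or consolidated."""
    region_of = {c["customer_id"]: c["region"] for c in crm_customers}
    revenue = {}
    for order in erp_orders:
        region = region_of.get(order["customer_id"], "Unknown")
        revenue[region] = revenue.get(region, 0.0) + order["total"]
    return revenue

print(virtual_revenue_by_region())  # {'West': 1500.0, 'East': 450.0}
```

Adding a third source would mean only writing another query-time join, rather than a round of business requirements, data modeling, ETL and BI design—the iterative pattern the quote above describes.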
“These papers are an essential resource for anyone working with analytics, especially when faced with the challenges of big data,” said Robert Eve, Composite Software Executive VP of Marketing. “We are pleased to underwrite this trio of papers, which clearly detail how organizations can achieve business agility through the implementation of analytical sandboxes and hubs utilizing data virtualization.”
Composite Software, Inc. is the data virtualization market leader. The Composite Data Virtualization Platform’s streamlined approach to data integration helps organizations gain more insight from their data, respond faster to ever-changing analytics and BI needs, successfully evolve their data management approach and save 50-75 percent over data replication and consolidation.
Composite Software is privately held, with corporate headquarters in San Mateo, CA.
To contact Composite, please call 650-227-8200, visit us on the Web at http://www.compositesw.com, or follow us on Twitter http://twitter.com/compositesw. To learn more about data virtualization, visit the DV Café microsite, the Data Virtualization Channel and the Data Virtualization Leadership Blog.
Composite Software is a registered trademark of Composite Software, Inc. Copyright © Composite Software, Inc. 2013.