On May 7, IBM announced extended functionality for its highly popular offering, the IBM® DB2® Analytics Accelerator, through a set of program temporary fixes (PTFs) and documentation changes. These changes, available during May and June, add several anticipated capabilities, including reduced data latency, and extend the offering's storage capacity to more than 1.2 PB. The following is an overview of these new capabilities, which will be further enhanced in the next release of the offering.
The new Incremental Update capability allows tables within the DB2 Analytics Accelerator to be updated continually throughout the day: changes are read from the database log of an IBM DB2 for z/OS® database and applied to the DB2 Analytics Accelerator. With this feature enabled, queries off-loaded to the DB2 Analytics Accelerator operate against a near-real-time version of the data, eliminating the need to reload data into the accelerator to obtain a current version. Organizations can use Incremental Update when the workload being accelerated requires the most recent version of the DB2 data. Although many situations will still require full and partition-based refreshes, Incremental Update does not replace the refresh process but rather augments it.
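As a conceptual illustration only, and not the product's actual mechanism, log-based incremental update boils down to reading change records from a database log and replaying them against a copy of the table. All names in this sketch (`ChangeRecord`, `apply_changes`, the `id` key) are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ChangeRecord:
    """A hypothetical captured log entry for one row change."""
    table: str
    operation: str   # "INSERT", "UPDATE", or "DELETE"
    row: dict

def apply_changes(target: dict, records) -> None:
    """Replay captured log records against an in-memory copy of a table,
    keyed by the row's primary key."""
    for rec in records:
        key = rec.row["id"]
        if rec.operation == "DELETE":
            target.pop(key, None)
        else:  # INSERT or UPDATE both overwrite the row image
            target[key] = rec.row

# Example: replaying two log records against an accelerator-side copy
copy = {1: {"id": 1, "qty": 5}}
log = [ChangeRecord("ORDERS", "UPDATE", {"id": 1, "qty": 7}),
       ChangeRecord("ORDERS", "INSERT", {"id": 2, "qty": 3})]
apply_changes(copy, log)
print(copy)  # {1: {'id': 1, 'qty': 7}, 2: {'id': 2, 'qty': 3}}
```

The point of the pattern is that only changed rows cross the wire, which is why the accelerator copy stays near-real-time without a full reload.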
Support has been extended to include the entire IBM Netezza® 1000 data warehouse appliance product line. The DB2 Analytics Accelerator now scales from a one-quarter size cabinet to 10 cabinets of Netezza 1000 appliances. This change reflects customers' demands for larger accelerators to meet the needs of the higher-capacity data warehouses used with IBM System z® servers. The DB2 Analytics Accelerator V2, initially announced in October 2011, supported a one-quarter, one-half or full-size cabinet, which limited the accelerator to 96 processors with an effective storage of 128 TB. With the substantially extended scaling capability, customers can now take advantage of an accelerator that ranges from 24 processors with 32 TB of effective storage up to 960 processors with effective storage of 1,280 TB.
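The scaling figures above follow directly from the per-cabinet numbers in the announcement (96 processors and 128 TB of effective storage per full cabinet); the constant and function names below are illustrative:

```python
# Per full Netezza 1000 cabinet, as stated in the announcement
PROCESSORS_PER_CABINET = 96
EFFECTIVE_TB_PER_CABINET = 128

def capacity(cabinets: float) -> tuple[int, int]:
    """Return (processors, effective storage in TB) for a cabinet count."""
    return (int(cabinets * PROCESSORS_PER_CABINET),
            int(cabinets * EFFECTIVE_TB_PER_CABINET))

print(capacity(0.25))  # smallest configuration: (24, 32)
print(capacity(10))    # largest configuration: (960, 1280)
```

The largest configuration's 1,280 TB of effective storage is the "over 1.2 PB" figure cited in the announcement.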
For organizations looking to migrate their operational data stores, data marts or data warehouses onto System z, or to add more capability to an existing decision system, IBM offers a new, combined solution. This complete solution for decision support uses a single System z server that combines high availability, unmatched security and industry-leading query speed within a mixed-workload environment. There are three compelling reasons for companies to consider this option. First, all of the components have been pre-integrated and tested to ensure a simple and fast deployment at the customer’s location. Second, each component has been sized and configured based upon the amount of data to be used and the workload that will be run against that data, assuring the customer of a solution that meets their needs. Finally, it has been solution-priced to provide a cost-effective, high-performance decision system that can outperform the competition while remaining within budgetary constraints.
DB2 for z/OS is an SAP-certified database, and the DB2 Analytics Accelerator is a natural extension to the SAP NetWeaver Business Warehouse (SAP NetWeaver BW) data warehouse solution. Tests being performed at SAP's facility in Germany are showing dramatic decreases in elapsed time for SAP NetWeaver BW ad hoc reporting. Organizations looking to transparently speed up their SAP environment will find the DB2 Analytics Accelerator a fast, easy-to-deploy solution that meets their needs. With plug-and-play capabilities that require no change to the existing environment, all queries are sent to DB2 without changes. Beyond the speed improvement, it also reduces processing cost by moving data processing off System z to the accelerator and freeing up storage space used for indexing.