Bank Zachodni WBK S.A.
Big Data Implemented for Client Event Processing.
About Bank Zachodni WBK S.A.
Bank Zachodni WBK S.A. is one of the largest and most innovative financial institutions in Poland.
Bank Zachodni WBK S.A. is also one of the most vibrant and fastest-growing banks in Poland today, meeting the needs of millions of personal customers, small and medium-sized enterprises, and large corporations. It offers comprehensive financial services of the highest standard, supported by modern banking technology.
The main shareholder of Bank Zachodni WBK S.A. is Santander, the number one bank in the Eurozone and the eleventh biggest in the world in terms of market capitalization.
Solution
- Agile
The Bank Zachodni WBK S.A. development team had a clear view of the architecture needed to solve the problem and a solid background in the HDP and SAS components.
Together with the SAS vendor, Bank Zachodni WBK S.A. formed a small agile team. Short iterative development cycles, strong support from Hortonworks consultants, and frequent product previews made it easy to meet deadlines.
- Architecture
The solution is based on the Lambda architecture, which takes advantage of both batch and stream processing methods. This approach balances latency, throughput, and fault tolerance by using batch processing to provide comprehensive and accurate views of batch data, while simultaneously using real-time stream processing to provide views of online data (a minimal code sketch of this layering appears after the Development section below).
- Development
During 7 weeks of development, the team focused on the Speed layer, which consisted of Kafka and Storm components. Data sources that provide clickstream data were extended with additional Logback appenders that send data directly to Kafka without changing the systems' source code (a sketch of such an appender appears below). The Serving layer was introduced at the end of the project, with HBase as storage and Hive as the tool for ad-hoc queries and for building views for SAS Visual Analytics.
The core of the streaming codebase was implemented with reuse in mind, so that it can also serve batch processing when loading historical data. Spring Dependency Injection in Storm made the topology implementation algorithm-agnostic: for example, changing the fraud detection algorithm from blacklisting to PMML is transparent from the topology's point of view (sketched below). Meanwhile, a second team implemented the Batch layer to feed Visual Analytics with data from the Enterprise Data Warehouse. The image below presents how clickstream data (events and Oracle Jolt calls) move from one component to another in the Speed layer and how the Batch layer feeds the SAS components.
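A minimal, self-contained sketch of the Lambda layering described in the Architecture section above, using per-customer event counts purely as an illustration; the class and method names are assumptions made for this sketch, not the bank's actual codebase.

import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the Lambda split: the batch layer recomputes an accurate
// per-customer event count over all history, the speed layer keeps a
// low-latency count of events not yet covered by the last batch run,
// and the serving layer merges both views when answering a query.
public final class LambdaLayersSketch {

    // Batch layer: full recomputation over historical events.
    static Map<String, Long> batchView(List<String> historicalCustomerIds) {
        Map<String, Long> view = new HashMap<>();
        for (String id : historicalCustomerIds) {
            view.merge(id, 1L, Long::sum);
        }
        return view;
    }

    // Speed layer: incremental update for each freshly arriving event.
    static void updateRealtimeView(Map<String, Long> realtimeView, String customerId) {
        realtimeView.merge(customerId, 1L, Long::sum);
    }

    // Serving layer: accurate batch view plus the real-time delta.
    static long query(Map<String, Long> batchView, Map<String, Long> realtimeView, String customerId) {
        return batchView.getOrDefault(customerId, 0L) + realtimeView.getOrDefault(customerId, 0L);
    }

    public static void main(String[] args) {
        Map<String, Long> batch = batchView(List.of("A", "A", "B"));
        Map<String, Long> realtime = new HashMap<>();
        updateRealtimeView(realtime, "A"); // a new event arriving after the batch run
        System.out.println(query(batch, realtime, "A")); // prints 3
    }
}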
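A minimal sketch of a Logback appender that forwards log events to Kafka, assuming only the standard Logback and Kafka client APIs; the class name, topic, and broker address are illustrative and not the bank's actual configuration. Because appenders are attached through logback.xml, the source systems only need a configuration change, not a code change.

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

// Hypothetical appender that publishes every log event to a Kafka topic.
public class KafkaClickstreamAppender extends AppenderBase<ILoggingEvent> {

    // Illustrative defaults; normally supplied from logback.xml via the setters below.
    private String bootstrapServers = "localhost:9092";
    private String topic = "clickstream";

    private KafkaProducer<String, String> producer;

    @Override
    public void start() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        producer = new KafkaProducer<>(props);
        super.start();
    }

    @Override
    protected void append(ILoggingEvent event) {
        // Asynchronous, fire-and-forget send keeps logging overhead low for the source system.
        producer.send(new ProducerRecord<>(topic, event.getFormattedMessage()));
    }

    @Override
    public void stop() {
        if (producer != null) {
            producer.close();
        }
        super.stop();
    }

    public void setBootstrapServers(String bootstrapServers) { this.bootstrapServers = bootstrapServers; }
    public void setTopic(String topic) { this.topic = topic; }
}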
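A minimal sketch of how a fraud-detection algorithm could be kept swappable behind an interface injected into a Storm bolt, assuming a recent Apache Storm release (org.apache.storm packages); the interface, class, and field names are illustrative assumptions, not the bank's code, and the Spring wiring that would supply the detector when the topology is built is omitted.

import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

import java.io.Serializable;
import java.util.Set;

// The topology only depends on this interface, so replacing the blacklist
// implementation with a PMML-based scorer does not touch the bolt or the wiring.
interface FraudDetector extends Serializable {
    boolean isSuspicious(String accountId);
}

// Simple blacklist implementation; a PMML-backed scorer would implement the same interface.
class BlacklistFraudDetector implements FraudDetector {
    private final Set<String> blacklistedAccounts;

    BlacklistFraudDetector(Set<String> blacklistedAccounts) {
        this.blacklistedAccounts = blacklistedAccounts;
    }

    @Override
    public boolean isSuspicious(String accountId) {
        return blacklistedAccounts.contains(accountId);
    }
}

// The bolt receives the detector from outside (e.g. a Spring context that builds
// the topology), so it stays agnostic of the concrete detection algorithm.
public class FraudDetectionBolt extends BaseBasicBolt {
    private final FraudDetector detector;

    public FraudDetectionBolt(FraudDetector detector) {
        this.detector = detector;
    }

    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        String accountId = input.getStringByField("accountId");
        collector.emit(new Values(accountId, detector.isSuspicious(accountId)));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        declarer.declare(new Fields("accountId", "suspicious"));
    }
}

In a setup like this, only the bean definition for FraudDetector would change when moving from blacklisting to PMML scoring, which is what makes the switch transparent from the topology's point of view.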
Results
After 7 weeks of development, the solution is ready to be deployed to the production environment. The solution scales horizontally: each cluster member adds a parallel data processing flow. Metrics will be gathered after rollout, but based on historical data and performance tests, the system will be able to analyze events, especially wire transfer events, within 6.5 milliseconds of an event being sent to Kafka. Thanks to the unified logging architecture, the solution can easily be extended with new sources of data.
Endorsement
Hortonworks Professional Services have reviewed this document and confirm that its contents comply with Hortonworks Best Practices. This endorsement applies only to the unaltered PDF document digitally signed below. It represents a moment-in-time observation; therefore, any changes in requirements, technical developments, or other factors can affect its validity over time.
Challenge:
As Bank Zachodni WBK S.A. expanded banking into new communication channels, reporting and analysis became more complex. The need to constantly monitor site reliability, check transactions for fraud, and analyze user behavior in a multichannel environment showed that there was a place for a Big Data platform.
Solution:
SAS® Visual Analytics
Visual Statistics
LASR Server
Visit Bank Zachodni WBK on their site.