Christian Rolf

Big data is the bridge between business intelligence and modern business analysis.

By using massive data sets, submarkets can be identified and strategies developed for targeting them. Historically, this would have required several man-years of analysis. Now it can be done efficiently in real time using high-performance parallel algorithms designed for the cloud. With access to large volumes of data, statistical analysis comes into its own, freeing up companies’ domain experts to analyse new markets rather than supporting old models.
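As a minimal sketch of the kind of statistical submarket identification described above, the following clusters customers by a single metric (monthly spend) using a simple 1-D k-means; the data and function names are illustrative assumptions, not part of any Open Parallel product.

```python
from statistics import mean

def kmeans_1d(values, k=2, iters=20):
    """Cluster 1-D values into k groups (assumes k >= 2)."""
    # Spread the initial centroids evenly between min and max.
    lo, hi = min(values), max(values)
    centroids = [lo + i * (hi - lo) / (k - 1) for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest centroid.
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        # Recompute each centroid as the mean of its cluster.
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Hypothetical monthly spend figures: two submarkets emerge,
# a budget segment (~11) and a premium segment (~205).
spend = [10, 12, 11, 200, 210, 205]
centroids, clusters = kmeans_1d(spend, k=2)
```

Real workloads would of course use distributed implementations of such algorithms, but the statistical idea is the same at any scale.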

The main challenge that big data brings is the need to move fast. Acquiring the in-house skills needed to create and deploy large-scale analysis and simulation is a slow process. Many big data tools are readily available, and often open-source. However, the skills required to build a truly scalable system are more important than ever, as performance translates immediately to dollars.

Modern programming paradigms such as functional programming, with its proven scalability in cloud computing, help ensure cost-efficient performance for big data. Functional programming can be parallelised efficiently because pure functions are free of side effects: a result depends only on its inputs, so computations can be distributed across cores or machines without coordination overhead. Its mathematical foundations also allow seamless integration between algorithms and economic models.
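The parallelisation point above can be sketched in a few lines: a pure function mapped over independent inputs can be farmed out to a worker pool safely. The function and data here are hypothetical examples, not a specific Open Parallel model.

```python
from concurrent.futures import ThreadPoolExecutor

def revenue_projection(monthly_sales):
    # Pure function: the result depends only on the argument and no
    # shared state is mutated, so concurrent calls cannot interfere.
    return sum(monthly_sales) * 12 / len(monthly_sales)

# Hypothetical per-segment sales samples.
segments = [[10.0, 12.0, 11.0], [200.0, 210.0, 205.0]]

with ThreadPoolExecutor() as pool:
    # Each segment is processed independently; the same pattern scales
    # to process pools or cluster frameworks without code changes to
    # the pure function itself.
    projections = list(pool.map(revenue_projection, segments))
# projections == [132.0, 2460.0]
```

Because `revenue_projection` is pure, swapping the thread pool for a process pool or a distributed map is a one-line change, which is exactly why functional style maps so cleanly onto cloud-scale execution.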

Open Parallel is uniquely positioned in the APAC region to ensure that big data analysis is both based on the right models and scales well. Massively parallel execution is part of Open Parallel’s expertise portfolio. Combined with our business experience and access to mathematical talent, we can fast-track your company into the big data age.