DATA INTELLIGENCE

Big Data Analytics

Tackling massive, multi-structured data involves knowing how to collect, decipher and process Big Data, so as to activate the levers of growth and performance in enterprises, whatever their size or economic sector.

It also involves constructing new Business Models to ensure their durability and development.

Finally, it involves creating sustainable competitive advantages by exploiting, on the one hand, the reserves of knowledge stemming from the detailed analysis of new sources of data, and, on the other hand, the anticipatory, even predictive, capabilities developed on the back of such analysis.

  • Valorize massive data by transforming them into usable knowledge
  • Enhance analytical capabilities so as to be better at anticipating and predicting
  • Improve the performance of next-generation digital devices
  • Optimize and develop Business Intelligence architectures

As a leader in Big Data and the co-founder of the École polytechnique's Data Scientist chair, Keyrus possesses the business knowledge, analytical expertise and technological mastery essential to the success of your Big Data projects. 

OUR OFFERS

  • Data-Driven Innovation

    We assist our clients in their efforts to evangelize analytics, so as to facilitate the emergence of innovative ideas around the valorization of data. Our approach is anchored in the reality of each enterprise and focuses on concrete cases, so as to deliver tangible results.

    • Launch and lead Data-driven Innovation initiatives
    • Collect, utilize, analyze and valorize Data
    • Construct ecosystems and use cases

  • Big Data Architectures & TCO Optimization

    Keyrus helps organizations implement the indispensable foundations of Big Data by designing and deploying next-generation architectures for data valorization. The Group analyzes the specific context and reality of each enterprise, so as to formulate appropriate and profitable solutions by leveraging prior investments and the enterprise's data assets.

    • Design and deploy Big Data Roadmaps
    • Evaluate and recommend innovative technologies
    • Prepare Business Cases, undertake cost-benefit analysis (TCO)

  • Data Science Consulting

    Keyrus is conscious that the paradigm change imposed by massive, multi-structured data requires very specialized scientific knowledge and technological skills. Through its partnership with the École polytechnique, it therefore recruits and trains Data Science professionals to assist its clients in designing and implementing efficient Big Data algorithms.

    • Develop advanced and tailor-made algorithms (machine learning)
    • Put in place highly upgradable predictive and prescriptive analytics solutions: scalability, elasticity, pay-per-use

  • Agile Big Data Laboratory

    Keyrus advises the enterprise in defining its processes for validating Big Data Analytics approaches based on concrete and accessible use cases. Attentive to the specific nature of each enterprise, the Group devises with its clients a realistic Roadmap promising quantifiable and measurable economic value.

    • Demonstrate value and associated technological choices
    • Implement an "agile" Big Data experimentation laboratory
    • Identify effective data ecosystems

  • Big Data Service Factory

    By offering rapid prototyping, Keyrus's algorithmic Big Data platform considerably shortens implementation times for your advanced analytics projects. It delivers services as a PaaS (Platform as a Service). Aimed at both Data Science experts and business users, depending on the need, it is highly scalable and runs all types of algorithms and data visualizations, whatever the language.

    • Propose agile prototyping and Proof of Value (a demonstration of the value created, supported by figures)
    • Show a cost-controlled, rapid increase in analytical capabilities
    • Present upgradable analytical solutions: scalability and elasticity
    • Construct models payable according to use (Big Analytics in the Cloud) 

REFERENCES

    • Bank of America Merrill Lynch

      Forecast time-dependent information (such as notional volumes, revenues, spreads and exchange rates), sliced by appropriate dimensions (such as channel, broker code or client name), by utilizing historical data:

      • Batch forecasting of an unlimited number of time series
      • Automatic choice of the best-fitting ARIMA or ETS model based on AIC back-testing
      • Customizable forecast horizon and data aggregation (day, week, month, quarter, year)
      • Determination of forecast accuracy (MAE, MAPE) and prediction intervals
      • STL trend + season + cycle + noise decomposition (noise-to-information ratio)
      • Signal analysis by means of frequency decomposition (Fourier)

    • BNP Paribas Fortis

      Improve the customer experience by serving personalised content in real time based on a visitor's behaviour, and by targeting prospects as well as identified and non-identified customers, to ultimately increase sales conversion. Transforming semi-structured data into structured data and linking the various data sources required extensive preparation work.
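
      The semi-structured-to-structured step mentioned above can be sketched in a few lines: nested JSON clickstream events are flattened into flat records that can then be joined with other sources. The field names below are hypothetical, not the bank's actual schema.

```python
# Sketch: flatten semi-structured (JSON) clickstream events into flat,
# structured records. Field names ("visitor", "events", "page", "ts")
# are illustrative assumptions, not an actual schema.
import json

raw = ('{"visitor": "v42", "events": ['
       '{"page": "/loans", "ts": 1}, {"page": "/cards", "ts": 2}]}')

def flatten(doc: str) -> list[dict]:
    """Turn one nested event document into one flat row per event."""
    obj = json.loads(doc)
    return [
        {"visitor": obj["visitor"], "page": e["page"], "ts": e["ts"]}
        for e in obj["events"]
    ]

rows = flatten(raw)
print(rows[0])  # {'visitor': 'v42', 'page': '/loans', 'ts': 1}
```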

    • Borealis

      Borealis is a leading provider of innovative solutions in the fields of polyolefins, base chemicals and fertilizers. Plastic market prices are quite volatile. Keyrus implemented a solution to improve understanding of margin behaviour across multiple data sources, building an accurate algorithm with convincing results using Python, R and Weka.

    • Carrefour

      At Carrefour Belgium, Keyrus supports the marketing department with customer insights such as segmentations and analyses of promotional offers, based on loyalty card data. Using the Keyrus deliverables, Carrefour creates personalized offers and afterwards analyzes the performance of those offers.

    • Daikin

      Absenteeism forecast: Daikin wanted to be able to easily forecast the number of people who would be present on site over a three-month period. Keyrus built an algorithm based on the HR files that efficiently forecasts which workers will be present or absent, and built a user-friendly web-based interface so that HR and production employees can use the tool without any IT knowledge. All they need to do is feed the model with files from their own internal systems.
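
      As a sketch of this kind of presence forecasting, a simple classifier can be trained on HR-style features. The features and synthetic data below are illustrative assumptions, not Daikin's actual HR files or model.

```python
# Sketch: predict worker presence from simple HR-style features with a
# logistic-regression classifier. The two features (tenure in years,
# past absence rate) and the synthetic labels are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# 200 workers, features: [tenure_years, past_absence_rate]
X = rng.uniform([0.0, 0.0], [20.0, 0.5], size=(200, 2))
# synthetic ground truth: low past absence rate -> present (1)
y = (X[:, 1] < 0.25).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
print(round(model.score(X, y), 2))  # training accuracy on the toy data
```

      A real deployment would of course validate on held-out periods rather than training accuracy, and serve predictions behind the web interface described above.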

    • Eneco

      We supported Eneco in better defining their campaigns by improving customer targeting. We also provided support in identifying dissatisfied customers likely to churn, so as to better address their concerns. The key challenge of this project was collecting and integrating the multiple data sources: the data is spread across multiple operational systems, requiring full integration to build a 360° view of the customer.

    • International banking group

      Categorization of transactions to help identify key clients, moments of life in the customer lifecycle, … Keyrus explored the different data sources and found ways to combine them so as to perform the categorization accurately, improving on the previously existing model. This is a key element in future data-driven projects: identification of key clients, identification of moments of life in the customer life cycle, and additional input for machine learning algorithms.

    • Major player in beverage industry

      Connect with the consumer in order to better understand the customer journey, customer preferences and sentiments. Drive product innovation based on customer insights and big data analytics. Sentiment analysis of social media, blogs and other online consumer perceptions of specific ingredients of beverages. Dashboard and reporting of analysis to support daily operations (worldwide). Collect, structure and analyze external surveys to better define consumer journey and usage attitudes related to drinking occasions.
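
      Sentiment analysis of the kind described above can be sketched with a minimal lexicon-based scorer. The word lists below are illustrative assumptions; real projects typically use full lexicons or trained models.

```python
# Sketch: minimal lexicon-based sentiment scorer for short consumer texts.
# The word lists are illustrative assumptions, not a production lexicon.
POSITIVE = {"refreshing", "great", "love", "smooth", "tasty"}
NEGATIVE = {"bitter", "bad", "hate", "flat", "artificial"}

def sentiment(text: str) -> float:
    """Return a score in [-1, 1]: (#positive - #negative) / #matched words."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    matched = pos + neg
    return 0.0 if matched == 0 else (pos - neg) / matched

print(sentiment("Love the new flavour, really refreshing"))  # 1.0
print(sentiment("Too bitter and a bit artificial"))          # -1.0
```

      Aggregating such scores per ingredient or per beverage over time is what feeds the worldwide dashboards mentioned above.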

    • StepStone

      To help StepStone expand its offering, improve the customer experience and increase converted leads, Keyrus implemented, as a first in Belgium, a SAS on Hadoop solution. This made it possible to load a significant volume of web data for data discovery, to process that data at scale, to carry out data preparation using SAS DI with quality checks, to build data visualizations, and to test existing models so as to retain the successful ones.

    • UCB

      Extend the libraries in R to ensure compatibility with Netezza and build a data analysis framework on top of them. The first goal was to fix compatibility issues between dplyr and Netezza and to create a general-purpose data processing framework. The second goal was to develop a data analysis framework for a specific dataset on Parkinson's patients; here, a high-level interface was created that generates inputs for machine learning tasks.

    • Veolia

      Building on the roll-out of smart meters and its smartphone application, Veolia wanted to increase the loyalty of its clients. Based on the volume of data produced by the smart meters, Keyrus developed a convincing way to handle the data with its RAYS platform and to create a high-performance model using machine learning algorithms. The business transformation lies in developing an application that helps customers manage their energy and water consumption in real time. The models cross-reference customers' data: their previous consumption, the consumption of their neighborhood and the consumption of their category.