SUCCESS CASES

We deliver high value-added, high-performance IT services and projects to our customers, with quality assurance and proven results. Explore our success cases and see how we achieved them!

FINANCIAL SERVICES INDUSTRY

With Leega Artificial Intelligence, companies can manage financial services more effectively, using tools that bring innovation to payment automation, profit calculation, and internal cost reduction, along with a clear, effective view of the organization's financial health.

DATA LAKE HUB PROJECT

PROJECT SCOPE:

Implementation of a Data Lake to serve as a data hub between legacy systems, improving data organization and providing data enrichment and availability for multiple uses.

 

TECHNOLOGIES USED:

Pub/Sub, Dataflow, Spanner, App Engine, Cloud Endpoints.

TECHNICAL SOLUTION:

We maintain hot copies of the transactional databases for data reading, qualification, and organization for later use, alongside historical databases, in addition to enriching the data with internal sources.
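As an illustrative sketch (not the project's actual code), a hub like this reads records from a hot copy of a transactional source, enriches them with internal reference data, and forwards the result to downstream consumers. In the real project this ran on Pub/Sub and Dataflow; here plain Python stands in for the pipeline stages, and all field and source names are hypothetical:

```python
# Minimal sketch of the hub's enrich-and-forward step.
# Field names and sample data are hypothetical.

def enrich(record, reference):
    """Join a transactional record with internal reference data."""
    enriched = dict(record)
    extra = reference.get(record["customer_id"], {})
    enriched.update(extra)  # e.g. add segment, risk score, etc.
    return enriched

hot_copy = [{"customer_id": "c1", "amount": 120.0}]
reference = {"c1": {"segment": "retail"}}

hub_output = [enrich(r, reference) for r in hot_copy]
print(hub_output)
# [{'customer_id': 'c1', 'amount': 120.0, 'segment': 'retail'}]
```

In the managed-service version, `enrich` would be a Dataflow transform and `hub_output` would be published back to Pub/Sub for the consuming systems.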

 

HOURS VOLUME/PROJECT TIME:

4,000 hours / 7 months (Dec/2020)

RETAIL INDUSTRY

To stand out in the market, retail companies are aware of the importance of adopting specific technological tools, such as Analytics and Big Data, which enhance operations by collecting data and information relevant to the business.

DATA LAKE PROJECT
MIGRATION OF DW TERADATA TO BIGQUERY

PROJECT SCOPE:
The challenge was to implement a cloud-based information environment, containing a Data Lake (Raw data) and Data Marts (consolidated).

HOURS VOLUME/PROJECT TIME:
13,400 hours / 14 months (Apr/2020)

TECHNICAL TEAM:
Project Manager; Functional Analysts; SW Leega Factory (Data Engineer, Front end Developers).

TECHNICAL SOLUTION:
The customer's existing DW (Teradata) and DM (SQL Server) environments were migrated to the Google Cloud environment in AS-IS format, and the consumption tools were adapted to read the data in the new environment. Teradata DW scope: 250 tables, 50 views, 298 queries, 77 procedures. The Data Lake was implemented using Google Cloud Storage and BigQuery for data storage; ETL processes were created using Cloud Composer and BQ Load, while the dashboard application was kept in Qlik.
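An AS-IS migration of this size typically starts from a mechanical mapping of Teradata column types to BigQuery types when regenerating the DDL for the 250 tables. The sketch below shows the general idea only; the mapping is a common, simplified subset, not the project's actual conversion rules:

```python
# Simplified Teradata -> BigQuery type mapping for regenerating
# table DDL in an AS-IS migration. Illustrative subset only.
TYPE_MAP = {
    "INTEGER": "INT64",
    "BYTEINT": "INT64",
    "SMALLINT": "INT64",
    "DECIMAL": "NUMERIC",
    "VARCHAR": "STRING",
    "CHAR": "STRING",
    "DATE": "DATE",
    "TIMESTAMP": "TIMESTAMP",
}

def convert_column(name, teradata_type):
    # Strip precision/length, e.g. "DECIMAL(18,2)" -> "DECIMAL".
    base = teradata_type.split("(")[0].strip().upper()
    bq_type = TYPE_MAP.get(base, "STRING")  # conservative fallback
    return f"{name} {bq_type}"

print(convert_column("order_id", "INTEGER"))     # order_id INT64
print(convert_column("total", "DECIMAL(18,2)"))  # total NUMERIC
```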

DATA LAKE PROJECT
MIGRATION OF DW TERADATA TO AZURE

PROJECT SCOPE:

Perform the data migration from the Teradata DB to Azure Synapse, including the review/adjustment of the extraction/loading processes (SAS DI, SAS Guide, DataStage, and procedures) implemented by the Analytics area.

TECHNICAL SOLUTION:

We reviewed and transcribed the ETL processes and procedures to point to Azure Synapse, applying the implemented reference architecture while preserving the business rules and their high performance.

SAS DI/Guide: 295
IBM DataStage: 730
Teradata procedures: 70
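One recurring task when transcribing Teradata workloads to Azure Synapse is normalizing Teradata-specific syntax, such as the `SEL` and `DEL` keyword abbreviations, which Synapse's T-SQL does not accept. A toy example of such a rewrite rule follows; the real transcription work covered far more cases (e.g. `QUALIFY` clauses, date literals), and these rules are illustrative only:

```python
import re

# Toy transcription rule: expand Teradata's SEL/DEL keyword
# abbreviations to the full keywords accepted by Azure Synapse.
RULES = [
    (re.compile(r"\bSEL\b", re.IGNORECASE), "SELECT"),
    (re.compile(r"\bDEL\b", re.IGNORECASE), "DELETE"),
]

def transcribe(teradata_sql):
    sql = teradata_sql
    for pattern, replacement in RULES:
        sql = pattern.sub(replacement, sql)
    return sql

print(transcribe("SEL * FROM sales WHERE region = 'SP'"))
# SELECT * FROM sales WHERE region = 'SP'
```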

HOURS VOLUME/PROJECT TIME:

4,000 hours / 3 months

TECHNICAL TEAM:

Scrum Master; Functional Analysts; SW Leega Factory (Azure Specialists, DataStage, SAS, and Data Engineers).

RESULTS:

We reviewed and transcribed the processes, aiming at:
Application of the implemented reference architecture.
Guaranteed rule maintenance and high performance.

HEALTHCARE INDUSTRY

Healthcare companies need to have quality control in all their operations, and solutions such as Leega Data Analytics and Artificial Intelligence provide more technology and innovation to data management in these institutions, actively supporting their development.

 

ADVANCED ANALYTICS PROJECT WITH ML (PRODUCT BASKET RECOMMENDATION)

PROJECT SCOPE:
In this project, our goal was to implement a Cloud Data Analytics Platform, creating the Data Lake layers: a Raw Data layer to store raw data; a Data Science Sandbox to store the statistical models and databases used by Data Scientists; a Production layer to store data consumed by the Web Portal system; and an Analytics and Data Visualization layer to store the processed and consolidated data that serves as the source for queries. The project also included developing the data ingestion and orchestration processes.
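The basket-recommendation model itself is not detailed here; as a hedged illustration of the general idea, a minimal item co-occurrence recommender (not the project's actual statistical model, with invented sample baskets) can be sketched as:

```python
from collections import Counter
from itertools import combinations

# Minimal co-occurrence "product basket" recommender sketch.
# Counts how often item pairs appear in the same basket, then
# recommends the most frequent companions of a given item.
baskets = [
    {"gauze", "saline", "gloves"},
    {"gauze", "gloves"},
    {"saline", "syringe"},
]

pair_counts = Counter()
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        pair_counts[(a, b)] += 1

def recommend(item, top_n=2):
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return [other for other, _ in scores.most_common(top_n)]

print(recommend("gauze"))  # ['gloves', 'saline']
```

In production such a model would be trained in the Data Science Sandbox and its results promoted to the Production layer for the portal to consume.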

 

TECHNICAL SOLUTION:
A data integration layer was created using Google Cloud BigQuery, and statistical models were built on the Google Cloud AI Notebook platform.

HOURS VOLUME/PROJECT TIME:

2 months (May/2020)


TECHNICAL TEAM:

Project Manager, SW Leega Factory (Data Engineer and Data Scientist).

 

RESULTS:

Raw data storage, statistical models and databases used by data scientists.
Development of data ingestion and orchestration processes.
Transformation of the customer's information maturity by strengthening the analytics culture across the organization and extracting value from data assets in line with the business strategy.

INSURANCE INDUSTRY

Insurance companies need to rely on partners who know the culture, market, and business in which they operate. In this regard, Leega works to transform data into information and to offer powerful analytical tools for efficient management in the insurance industry.

ANALYTICS PROJECT

PROJECT SCOPE:

Implement Analytics products to track partners' engagement in customer-defined collaboration programs and to follow up on sales and marketing indicators, among other products. The project also provided information on the activity goals of partners on the program portal and generated indicators that help marketing analysts assess partners' engagement in the program.

 

TECHNICAL SOLUTION:

We cross-referenced data from various systems to create indicators. The data was made available for analysis through queries and dashboards, and advanced analytics was implemented with the work of a data scientist to uncover new insights. We implemented processes to feed the DW (Google BigQuery), created data load processing using Cloud Computing / Dataproc / Cloud Composer and ML Engine, and developed dashboards using Tableau and Google Data Studio.
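As an illustrative sketch of the kind of engagement indicator produced (the field names and formula below are hypothetical, not the customer's actual metrics, and in the project this logic lived in BigQuery rather than Python):

```python
# Hypothetical partner-engagement indicator: share of program
# activities completed versus the goal defined for each partner.
partners = [
    {"partner": "A", "activities_done": 8, "activity_goal": 10},
    {"partner": "B", "activities_done": 3, "activity_goal": 12},
]

def engagement_rate(p):
    """Fraction of the activity goal the partner has completed."""
    return round(p["activities_done"] / p["activity_goal"], 2)

indicators = {p["partner"]: engagement_rate(p) for p in partners}
print(indicators)  # {'A': 0.8, 'B': 0.25}
```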

HOURS VOLUME/PROJECT TIME:

8,400 hours / 10 months (Jan/2020)

 

TECHNICAL TEAM:

Agile Squad: Scrum Master; Data Engineer; Data Visualization Developer; Data Scientist.

 

RESULTS:

Provision of information on the goals and objectives achieved in the activities performed by partners on the program portal.
Generation of indicators for marketing analysts to assist in analyzing partners' engagement in the program.
Use of dashboards and advanced analytics to discover new insights.

TECHNOLOGY INDUSTRY

With digital transformation, technology companies have proven even more adept at using digital tools to generate opportunities and growth in their areas of expertise. However, they need to adapt to new consumer demands and offer increasingly satisfying customer experiences.

 

DATA LAKE PROJECT

PROJECT SCOPE:

Migration of the transactional application from the Azure Cloud environment to Google Cloud. Our challenge was to implement a cloud information environment containing a consolidated DW with customer information about the application, create OCR processes to capture invoices, and build Machine Learning models to evaluate customer profiles and deviations in contract consumption. Data was made available for consumption through dashboards.

 

TECHNICAL SOLUTION:

The DW was implemented using Google BigQuery for data storage; ETL processes were created using Cloud Composer and Dataflow; and dashboards were created in Google Data Studio, embedded in the application. In addition, we built OCR processes using Google Cloud AI APIs (Vision API) and Machine Learning processes using Google ML Engine and Dataproc.
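The Vision API returns the recognized text; a post-processing step then pulls structured fields out of it. A toy sketch of that step follows, with a hard-coded sample standing in for the API response; the regex patterns and invoice layout are invented for illustration:

```python
import re

# Toy post-processing of OCR text for an invoice. In the project
# the text came from Google Cloud Vision API; patterns and the
# sample layout below are illustrative only.
ocr_text = """INVOICE NO: 12345
DATE: 2019-11-02
TOTAL: 1,250.90"""

def parse_invoice(text):
    """Extract structured fields from raw OCR text."""
    fields = {}
    number = re.search(r"INVOICE NO:\s*(\d+)", text)
    total = re.search(r"TOTAL:\s*([\d.,]+)", text)
    if number:
        fields["number"] = number.group(1)
    if total:
        fields["total"] = float(total.group(1).replace(",", ""))
    return fields

print(parse_invoice(ocr_text))  # {'number': '12345', 'total': 1250.9}
```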

HOURS VOLUME/PROJECT TIME:

3,000 hours / 6 months (Dec/2019)

 

TECHNICAL TEAM:

Project Manager; Functional Analysts; SW Leega Factory (Data Engineer, Front end Developers).

 

RESULTS:

Provision of customer information and creation of OCR processes to capture invoices.
Evaluation of customer profiles and deviations in contract consumption.
Data availability through dashboards.


INDUSTRY SECTOR

The lack of specific technologies is still one of the factors preventing industries across all sectors from staying ahead of their competitors. These companies need to invest more in security, connectivity, Artificial Intelligence, and innovative tools to stand out in the market.

 

DATA LAKE PROJECT

PROJECT SCOPE:

Implement a Data Lake environment to store training data and create performance indicators for evaluation by authorized dealers.

 

TECHNICAL SOLUTION:

The DW was implemented using Google BigQuery for data storage; ETL processes were created using Cloud Composer and Dataflow; and dashboards were created in Google Data Studio and Power BI.
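As a sketch of the indicator step, aggregating training completion per dealer (the records and metric below are hypothetical; in the project the data lived in BigQuery and fed the Data Studio / Power BI dashboards):

```python
from collections import defaultdict

# Hypothetical training records for authorized dealers.
records = [
    {"dealer": "D1", "course": "safety", "completed": True},
    {"dealer": "D1", "course": "sales", "completed": False},
    {"dealer": "D2", "course": "safety", "completed": True},
]

# Tally completed vs. total courses per dealer.
completion = defaultdict(lambda: {"done": 0, "total": 0})
for r in records:
    stats = completion[r["dealer"]]
    stats["total"] += 1
    stats["done"] += int(r["completed"])

# Performance indicator: completion rate per dealer.
indicators = {
    dealer: round(s["done"] / s["total"], 2)
    for dealer, s in completion.items()
}
print(indicators)  # {'D1': 0.5, 'D2': 1.0}
```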

HOURS VOLUME/PROJECT TIME:

1,500 hours / 3 months (Mar/2020)

 

TECHNICAL TEAM:

Project Manager; Functional Analysts; SW Leega Factory (Data Engineer, Front end Developers).

 

RESULTS:

Training data storage.
Creation of performance indicators for the evaluation of authorized dealers.