hub
oxt
arrow
dataConceptTitle
The fast-changing world of web development has expanded our skill set and made us experienced in all major UI frameworks, which makes your application look nice and modern while staying fast.
golangIcon
Golang
mysqlIcon
MySQL
reactIcon
React
CLIENT
Dataconcepts
TEAM MEMBERS
4
TIMELINE
5 months
SERVICES
Development
aboutTitle
The customer is a digital marketing agency. They use their own custom CRM for newsletters via SMS and email, plus a data aggregation system that sourced data via API. Once they started a fruitful partnership with data collection agencies, the volume of incoming data grew so large that it caused performance issues in the target CRM, which was unable to handle about 2 million requests per day.

We came on board as a dedicated team to build a V2 data aggregation service that would act as a middle-layer enricher between the agencies' data and the customer's CRM.
CHALLENGE
They asked us to keep their application's business logic but make it generic and scalable. The result was a custom data workflow configurator service that manages incoming data: storing, enriching, and pushing it to target endpoints without being tied to one specific customer's CRM. We also reworked the architecture to make it fast, scalable, and less resource-demanding, which improved stability and reduced infrastructure expenses.
numberIcon1
Approach
Their approach and architecture were not optimal, so the system was hard to adjust, slow, and resource-demanding
numberIcon2
Cost
The AWS cloud architecture cost about $6,000/month
numberIcon3
Database
AWS Lambdas were misused, and the database was not properly normalized or indexed
dataConceptCard
DEVELOPMENT
We decided to move from AWS to a dedicated server to gain more freedom in budget and customization. We used Golang and MySQL to develop a scalable, microservice-based backend. We chose Golang to maximize server hardware utilization through concurrency and multithreading, and combined this with load and stress tests to get the most out of code optimization. We achieved flawless operation through a series of stress tests and code improvements.
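As an illustration of the concurrency approach, a Go worker pool fans incoming records out to a fixed number of goroutines; the record type and the `process` function below are hypothetical stand-ins for the real enrichment logic, which is not public:

```go
package main

import (
	"fmt"
	"sync"
)

// process simulates enriching one incoming record (hypothetical stand-in).
func process(record int) int {
	return record * 2
}

func main() {
	const workers = 4
	jobs := make(chan int)
	results := make(chan int)

	// A fixed pool of workers drains the jobs channel concurrently.
	var wg sync.WaitGroup
	for w := 0; w < workers; w++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			for r := range jobs {
				results <- process(r)
			}
		}()
	}

	// Producer: feed the pool, then signal no more work.
	go func() {
		for i := 1; i <= 8; i++ {
			jobs <- i
		}
		close(jobs)
	}()

	// Close results once every worker has finished.
	go func() {
		wg.Wait()
		close(results)
	}()

	sum := 0
	for r := range results {
		sum += r
	}
	fmt.Println(sum) // 72: the sum of 2*(1..8)
}
```

Sizing the pool to the machine's core count is what lets a dedicated server be driven near full utilization under stress testing.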
numberIcon1
Backend capability
The first challenge was to build a backend capable of ingesting data from 2+ million requests per day, filtering, enriching, and operating on 200+ million entries, and balancing uploads and downloads using different strategies and workflows.
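The filter/enrich flow described above can be sketched as a small channel pipeline in Go; the two stages and the string records are simplified assumptions for illustration, not the production workflow:

```go
package main

import (
	"fmt"
	"strings"
)

// filterStage drops records that fail a predicate (here: empty strings).
func filterStage(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for r := range in {
			if r != "" {
				out <- r
			}
		}
	}()
	return out
}

// enrichStage transforms each record; uppercasing is a stand-in
// for a real enrichment step.
func enrichStage(in <-chan string) <-chan string {
	out := make(chan string)
	go func() {
		defer close(out)
		for r := range in {
			out <- strings.ToUpper(r)
		}
	}()
	return out
}

func main() {
	src := make(chan string)
	go func() {
		for _, r := range []string{"alice", "", "bob"} {
			src <- r
		}
		close(src)
	}()
	// Compose stages; records stream through without buffering everything.
	for r := range enrichStage(filterStage(src)) {
		fmt.Println(r)
	}
}
```

Because each stage is its own goroutine reading from a channel, stages can later be replicated or reordered per workflow, which is the kind of flexibility a generic configurator needs.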
numberIcon2
Transfer & convert
The second challenge was to transfer and convert ~500 million records into a common database structure.
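A conversion at that scale is typically done in bounded batches to keep memory predictable; the sketch below assumes hypothetical `LegacyRecord` and `CommonRecord` shapes, since the real schemas are not public:

```go
package main

import "fmt"

// LegacyRecord and CommonRecord are hypothetical shapes for illustration.
type LegacyRecord struct {
	Phone, Mail string
}

type CommonRecord struct {
	Contact string
	Channel string
}

// convert maps one legacy record to the common structure,
// preferring email over SMS when both are present (an assumption).
func convert(r LegacyRecord) CommonRecord {
	if r.Mail != "" {
		return CommonRecord{Contact: r.Mail, Channel: "email"}
	}
	return CommonRecord{Contact: r.Phone, Channel: "sms"}
}

// migrateBatch converts one slice of legacy records; in production this
// would run per fixed-size batch read from the old database and bulk-insert
// the results, so only one batch is ever held in memory.
func migrateBatch(batch []LegacyRecord) []CommonRecord {
	out := make([]CommonRecord, 0, len(batch))
	for _, r := range batch {
		out = append(out, convert(r))
	}
	return out
}

func main() {
	batch := []LegacyRecord{
		{Phone: "+15550100"},
		{Mail: "a@example.com"},
	}
	for _, c := range migrateBatch(batch) {
		fmt.Println(c.Channel, c.Contact)
	}
}
```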
numberIcon3
Migrate data
The third challenge was to migrate the legacy data flow logic to the new generic workflow structure.
IMPACT
numberIcon1
Reduced expenses
We reduced infrastructure expenses 30-fold, from $6,000 to $200 per month, by moving from AWS to a dedicated server and optimizing the code.
numberIcon2
More transparency
We added transparency to the data with advanced statistics, multi-criteria filters, and logs.
numberIcon3
Further scaling
We made further scaling much easier thanks to the optimal microservice architecture.
dataConceptJobs
Contact us if you have any projects in mind
Service
Project Budget