Trademo empowers customers to make critical supply chain decisions backed by deep insights, uncover new commercial opportunities, ensure compliance with trade regulations, and build operational supply chain resilience.
However, as the business expanded and more customers adopted the platform, Trademo could not scale its operational efficiency. Every new feature or process introduced numerous configuration errors, undermining the platform's delivery.
- Manual setup of their AWS infrastructure could not provide the on-demand scale needed to accommodate a growing customer base.
- A sub-optimal CI/CD flow created both development and performance bottlenecks.
- Scattered dashboards with limited visibility left little scope for early detection of issues and lengthened the mean time to resolution (MTTR).
- Managing tens of terabytes of data, critical for real-time tracking, demanded a high degree of precision.
- Poor security posture and network configuration.
- To leverage the benefits of multi-cloud flexibility, they aimed to replicate their AWS infrastructure on GCP.
- The platform needed future-proofing to make it easy to integrate new processes as the business evolved.
- They needed to standardize multi-cloud deployments to prevent service disruptions caused by global configuration changes.
- Seamless GKE setup and streamlined deployments across all environments on GCP via BuildPiper
- Managed multi-cloud setup of their environments – Preprod, Staging, and Production across AWS and GCP
- Ensured consistency and efficiency by implementing CI/CD deployments across all AWS environments: an AMI of the application is built once and used to deploy to each environment
- Implemented 360-degree monitoring with a centralized dashboard that integrated with Prometheus, Grafana, and Alert Manager
- Implemented a centralized logging solution with the EFK stack – Elasticsearch, Fluentd, and Kibana
- Created a reliable data pipeline with open-source tools: Sentry, Jupyter Notebook, JupyterLab, Apache Airflow, ArangoDB, RabbitMQ, and Strapi
- Enabled multi-factor authentication via IAM in AWS and service accounts in GCP, along with an OpenVPN setup
- Enabled data storage in Amazon S3 for snapshots of the Elasticsearch cluster, MongoDB, and Postgres, along with Airflow DAGs and RDS backups
- Performed a complete security audit to uncover loopholes and non-compliant procedures
- Implemented WAF for application security
- Deleted unnecessary AMIs and snapshots, and right-sized server volumes based on actual usage
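As an illustration of the 360-degree monitoring above, a Prometheus alerting rule of roughly this shape could feed Alert Manager and the centralized Grafana dashboard. The metric name, threshold, and labels here are illustrative assumptions, not taken from Trademo's actual configuration:

```yaml
groups:
  - name: platform-health
    rules:
      - alert: HighErrorRate
        # Hypothetical metric: fraction of HTTP requests returning 5xx over 5 minutes
        expr: >
          sum(rate(http_requests_total{status=~"5.."}[5m]))
            / sum(rate(http_requests_total[5m])) > 0.05
        for: 10m
        labels:
          severity: critical
        annotations:
          summary: "More than 5% of requests are failing"
```

Rules like this give the early detection that scattered dashboards could not, since Alert Manager routes the firing alert before users report the glitch.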
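The EFK-based centralized logging can be sketched with a minimal Fluentd configuration that tails application logs and ships them to Elasticsearch, with Kibana reading from the same indices. The paths, tag, and host below are hypothetical, and the output stage requires the fluent-plugin-elasticsearch plugin:

```
<source>
  @type tail
  path /var/log/app/*.log
  pos_file /var/log/fluentd/app.pos
  tag app.logs
  <parse>
    @type json
  </parse>
</source>

<match app.**>
  @type elasticsearch
  host elasticsearch.logging.svc
  port 9200
  logstash_format true
</match>
```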
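For the S3-backed Elasticsearch snapshots, a snapshot repository is typically registered through the snapshot API (`PUT _snapshot/<repository-name>`) with a body like the following. The bucket, region, and path are placeholders, and the cluster needs the repository-s3 plugin (bundled with recent Elasticsearch versions):

```json
{
  "type": "s3",
  "settings": {
    "bucket": "example-es-snapshots",
    "region": "us-east-1",
    "base_path": "production/elasticsearch"
  }
}
```

Scheduled snapshots against such a repository are what make the S3 storage above a recoverable backup rather than a one-off copy.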
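The AMI and snapshot cleanup reduces to a simple retention rule: delete images that are older than a retention window and not referenced by any instance. A minimal, hypothetical sketch of that selection logic (the real cleanup would pass the returned IDs to the AWS deregister/delete APIs):

```python
from datetime import datetime, timedelta

def stale_amis(amis, in_use_ids, retain_days=30, now=None):
    """Return IDs of AMIs older than retain_days that no instance references.

    amis: list of dicts with "id" and "created" (naive UTC datetime).
    in_use_ids: set of AMI IDs currently referenced by instances.
    """
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=retain_days)
    return [
        a["id"]
        for a in amis
        if a["created"] < cutoff and a["id"] not in in_use_ids
    ]
```

Keeping the rule pure like this makes it easy to test before wiring it to the cloud APIs, which matters when the action is irreversible deletion.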
Reduced deployment time by nearly 100%
Standardized deployments across multi-cloud environments – zero global configuration changes
Gained fine-grained control over scaling events
Ensured zero data loss and seamless processing of the tens of terabytes of data critical to real-time tracking updates
Attained significant cloud bill reduction through compounding cost-saving measures
Easy integration of any service with a few clicks via BuildPiper’s guided UI