As startups in India and across the world continue to build inspiring products for a diverse set of problems, the need to scale their apps to deliver delightful user experiences grows with them. They follow the inherent culture of moving fast and breaking things, building the MVP quickly to validate the problem-solving and business strategy. Premature optimization is consciously avoided until the product has proven its mettle. Once the idea is gracefully adopted, the journey from 1 to 100 begins. This article is the story of why and how the Experience Engineering Team migrated from AWS S3 to Cloudinary to serve optimized media to our mobile and web apps.
When an idea is born, the app around it is built in the fastest and crudest way possible, and the LazyPay apps were no different. Media such as images, videos, and animations (Lottie) served in the apps were stored on AWS S3 and consumed by the apps directly through S3 URLs. This brute-force approach had problems at many levels.
Performance Bottlenecks: Although we fronted our S3 buckets with AWS CloudFront for caching and security gains, the apps still received media at the exact sizes we stored, leading to higher bandwidth consumption.
Screen Size and Rendering Issues: To feel highly responsive, APIs should serve responses with millisecond latency. Unfortunately, this is rarely the case, especially for media. It becomes even more painful when a user on a mobile device requests an asset and an unwanted cache miss slows down screen rendering, making the app feel unusable.
Page Load Time: Since the media being served was not optimized, it directly inflated our page load time, delaying the user's interaction with our product offerings. This was even worse for users on a 4G network in a low-connectivity zone, or while roaming.
Heavy Maintenance: To support multiple clients and optimize asset delivery, we were uploading multiple variants of the same image based on the platform (Android/iOS/web) and device/browser density. For example, for a banner promoting Xpress Loan on the app, we would upload 7 variants of the image, covering densities up to xxxhdpi on Android and scales up to 3x on iOS.
Unstructured Asset Management: Since every client handled assets according to its own needs, there was no order to the files being uploaded, creating a mess of buckets and folders everywhere.
Cloudinary as the Media Optimizer
AWS S3 is the right choice for cloud storage of assets, and we are sticking with it. However, we have laid out a consistent process around asset management and brought true order to it. And to reduce the network footprint of our apps and enhance their performance, we have invested heavily in adopting Cloudinary across all of our clients.
Cloudinary is a powerful media hosting and optimization platform that delivers high-quality media without much maintenance effort.
From S3 via Cloudinary
Connecting S3 to Cloudinary is pretty straightforward:
1. Create a Media Source on Cloudinary for the S3 bucket.
2. Create an Optimization Profile on Cloudinary and attach the media source to it.
3. Provide the base transformation on the profile as one of the custom transformations.
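Once the media source is connected, the same S3 object keys become addressable through Cloudinary's delivery host. A minimal sketch of that mapping is below; the cloud name (`lazyexample`) and host pattern are hypothetical placeholders, not the real LazyPay configuration.

```typescript
// Sketch: map an S3 object key to a Cloudinary delivery URL once the bucket
// is connected as a media source. "lazyexample" is a placeholder cloud name.
function cloudinaryUrl(s3Key: string, cloudName: string = "lazyexample"): string {
  // Strip any leading slashes so the key appends cleanly to the host.
  return `https://${cloudName}.mo.cloudinary.net/${s3Key.replace(/^\/+/, "")}`;
}
```

The key that clients used to fetch straight from S3 stays the same; only the host changes, which keeps the migration mechanical.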
Ease of Maintenance
The best part of using Cloudinary is the near-zero maintenance effort. Recall that we were uploading 7 variants of an image to support multiple clients. Not anymore! With Cloudinary, we now upload only one master image, which serves all our clients without compromising app performance.
Cloudinary can serve different variants of the master image on the fly via transformations. We have created named transformations for all of our clients to not only serve the optimized asset according to the device/browser but also shorten the image URLs. These transformations have helped significantly in lowering our maintenance efforts and increasing our productivity.
On Android, we have 5 named transformations mapping to mdpi, hdpi, xhdpi, xxhdpi, and xxxhdpi, created based on display metrics.
On iOS, we have 3 named transformations mapping to 1x, 2x, and 3x, created based on the scale factor.
On the web, we have 3 named transformations mapping to small, medium, and large, created based on aspect ratio.
The LazyPay Android app requests an image on a medium display density device -> the Android app dynamically appends the transformation (tx=t_droid_xhdpi) to the requested Cloudinary image URL -> Cloudinary checks its cache for the transformed image -> on a cache hit it serves the image; otherwise it transforms the master image on the fly, caches the result, and then serves it.
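The client-side half of this flow can be sketched as follows. Only `t_droid_xhdpi` appears above; the other transformation names extend the same naming convention as an assumption, and the dpi cut-offs are Android's standard density buckets.

```typescript
// Pick the named transformation for the device's display density.
// Bucket boundaries follow Android's standard densities
// (mdpi=160, hdpi=240, xhdpi=320, xxhdpi=480, xxxhdpi=640).
function transformationFor(densityDpi: number): string {
  if (densityDpi <= 160) return "t_droid_mdpi";
  if (densityDpi <= 240) return "t_droid_hdpi";
  if (densityDpi <= 320) return "t_droid_xhdpi";
  if (densityDpi <= 480) return "t_droid_xxhdpi";
  return "t_droid_xxxhdpi";
}

// Append the transformation as the tx query parameter on the asset URL.
function withTransformation(imageUrl: string, densityDpi: number): string {
  const sep = imageUrl.includes("?") ? "&" : "?";
  return `${imageUrl}${sep}tx=${transformationFor(densityDpi)}`;
}
```

Because the transformation is just a query parameter, the same master URL can be shared across clients while each device requests exactly the variant it can render.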
Cloudinary’s out-of-the-box image optimizations helped in further reducing data consumption.
An image stored as a png can automatically be served by Cloudinary in a more optimal format, chosen based on the client (Android/iOS/Web).
Cloudinary’s multi-CDN helps in delivering the images faster and more reliably.
Bytes saved are bytes gained for processing other data-intensive APIs within the app. Using Cloudinary helped us save 70% of the bytes compared to the originals. For example, an image of 100 KB on S3 required only 30 KB when served via Cloudinary.
Apart from the byte savings, we also monitored the performance of assets served through Cloudinary and compared them with AWS S3 using Firebase Performance Monitoring. The insights showed that image URLs served through Cloudinary performed about 20% faster than those served from S3.
Go-to-market through controlled feature rollout 🚀
Migration to Cloudinary was a big change for us, and we wanted to tread carefully before adopting it on all platforms. Launching it in one shot could have left users seeing no assets at all on the app, the worst possible experience. The reasons could be many: a dirty setup, missing media, heavy load, etc. That's why we gated the feature and rolled it out incrementally through Firebase Remote Config. Cloudinary's error reporting facilitated the process by surfacing the asset URLs returning HTTP 404s and 400s. We kept stabilizing the error count and increased the rollout only once the error graph had flattened out. While doing so, we also uncovered some new and age-old issues.
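The rollout gate itself can be sketched as below. The flag name `use_cloudinary_assets` and both hosts are hypothetical placeholders; in the real client the boolean would come from Firebase Remote Config, but it is passed in directly here so the routing logic stays testable.

```typescript
// Placeholder hosts: a hypothetical Cloudinary cloud and the legacy
// CloudFront/S3 origin. Neither is the real LazyPay configuration.
const CLOUDINARY_BASE = "https://lazyexample.mo.cloudinary.net";
const LEGACY_BASE = "https://legacy-cdn.example.com";

// Route an asset path to Cloudinary or the legacy origin based on the
// Remote Config flag (assumed name: "use_cloudinary_assets").
function assetUrl(path: string, useCloudinary: boolean): string {
  const base = useCloudinary ? CLOUDINARY_BASE : LEGACY_BASE;
  return `${base}/${path.replace(/^\/+/, "")}`;
}
```

Flipping a single boolean per rollout cohort means a bad error graph can be rolled back instantly without shipping a new app build.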
The HTTP 404 errors on Cloudinary were for assets that had never been uploaded and were missing on S3 as well. This is a typical case of a hibernating bug finally waking up. We had to hunt those assets down, upload them to S3, and fix the errors.
Since we added a blanket transformation across all of our assets uploaded on S3, it also hit assets like GIFs and PDFs, surfacing errors such as:
Error from Cloudinary: Maximum image size is 8 Megapixels. Requested 9.72 Megapixels
We then added a filter to exclude media types like gif and pdf from being transformed.
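The exclusion filter amounts to a simple extension check. Cloudinary applies such rules on the profile side; the client-side guard below is an illustrative assumption of the same idea.

```typescript
// Media types that should bypass the blanket transformation.
const UNTRANSFORMED_TYPES = new Set(["gif", "pdf"]);

// Return true only if the asset's extension is safe to transform.
function shouldTransform(assetPath: string): boolean {
  const ext = assetPath.split(".").pop()?.toLowerCase() ?? "";
  return !UNTRANSFORMED_TYPES.has(ext);
}
```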
We are already serving assets through Cloudinary on all of our mobile apps and are gradually rolling it out to our web apps.
FinTech has many unique challenges, and as we progress, we are sharpening our skills, re-architecting our apps, and solidifying our infrastructure to build for India scale. We are challenging the status quo at every step, all aimed at delivering an awesome user experience. Stay tuned for the next one!