
A pricing team at a mid-sized retailer wanted to understand their competitive position across their catalog. Reasonable question. The answer required checking 50,000 product pages daily. At 30 seconds per page—if you could somehow maintain that pace without breaks—that's 417 hours of work. Every single day.
They never tried. The work remained undone because it couldn't be scoped.
As Box CEO Aaron Levie recently noted:
"We're not talking about efficiency gains. We're talking about work that enterprises never attempted because the economics were impossible."
At TinyFish, we build enterprise web agent infrastructure: systems that run reliable automation across thousands of sites simultaneously. Building that infrastructure shows you exactly why certain monitoring tasks stayed out of reach for human teams.
The web actively fights back. Authentication flows change without warning. A retailer checking competitor pricing finds that a site working perfectly yesterday now requires two-factor verification. Or the login page restructures. Or regional variations mean the flow works in the US but breaks in Japan. Each site has dozens of these edge cases.
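To make one of those edge cases concrete, here's a minimal sketch, assuming Python and Playwright, of a login step that detects a sudden 2FA wall instead of failing silently. The selectors and the TwoFactorRequired flow are illustrative assumptions, not our production code.

```python
# A minimal sketch, assuming Python and Playwright; the selectors and the
# TwoFactorRequired exception are hypothetical, not a real TinyFish API.
from playwright.sync_api import Page, TimeoutError as PlaywrightTimeout

class TwoFactorRequired(Exception):
    """A site that accepted plain credentials yesterday now demands 2FA."""

def login(page: Page, url: str, username: str, password: str) -> None:
    page.goto(url)
    page.fill("input[name='email']", username)     # hypothetical selector
    page.fill("input[name='password']", password)  # hypothetical selector
    page.click("button[type='submit']")
    try:
        # Yesterday this landed on the account dashboard. Today, maybe not.
        page.wait_for_selector("#dashboard", timeout=5_000)
    except PlaywrightTimeout:
        # Don't fail silently: detect the new 2FA interstitial and flag it.
        if page.locator("input[autocomplete='one-time-code']").count() > 0:
            raise TwoFactorRequired(url)
        raise  # the login page restructured some other way; send to review
```

The handler itself is trivial. The problem is that every site needs dozens of handlers like it, and each one decays as the site changes underneath it.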
Product matching creates its own nightmare. The same item appears with different descriptions, images, and naming conventions across hundreds of stores. Even AI-driven matching starts at 80-90% accuracy, requiring human validation to approach 100%. Multiply that across thousands of SKUs and the validation work alone becomes impossible to resource.
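Here's a sketch of how that 80-90% regime plays out in practice, using rapidfuzz as a stand-in scorer; the thresholds and the sample titles are invented for illustration.

```python
# A sketch of threshold-gated matching; rapidfuzz, the thresholds,
# and the sample titles are all assumptions for illustration.
from rapidfuzz import fuzz

AUTO_ACCEPT = 95   # confident enough to treat as the same product
NEEDS_REVIEW = 80  # plausible match; route to a human validator

def match_titles(ours: str, theirs: str) -> str:
    score = fuzz.token_set_ratio(ours, theirs)  # 0-100 similarity
    if score >= AUTO_ACCEPT:
        return "match"
    if score >= NEEDS_REVIEW:
        return "review"  # this queue is what refuses to scale across SKUs
    return "no_match"

print(match_titles(
    "Sony WH-1000XM5 Wireless Noise Canceling Headphones",
    "WH1000XM5/B Sony Wireless Headphones, Black",
))  # likely lands in the review band: close, but not identical
```

The "review" branch is the trap. Every match that falls between the thresholds needs a human judgment, and at thousands of SKUs across hundreds of stores, that queue grows faster than any team can drain it.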
The data fragments. Price lives in one platform, sentiment in another, stock levels buried in outdated reports. You need to pull from desktop sites, mobile sites, apps—each requiring different technical approaches. The work isn't just large. It's architecturally impossible for human teams.
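A hedged sketch of what stitching those fragments together looks like; the ProductSnapshot schema and the three source payloads are assumptions, not a real pipeline.

```python
# A sketch of the merge problem; the schema and the three source payloads
# (desktop scrape, review platform, inventory report) are hypothetical.
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProductSnapshot:
    sku: str
    price: Optional[float] = None      # scraped from the desktop site
    sentiment: Optional[float] = None  # pulled from a review platform
    stock: Optional[int] = None        # parsed out of an inventory report

def merge(sku: str, *partials: dict) -> ProductSnapshot:
    snapshot = ProductSnapshot(sku=sku)
    for partial in partials:               # each source has its own shape
        for field_name, value in partial.items():
            if value is not None and hasattr(snapshot, field_name):
                setattr(snapshot, field_name, value)
    return snapshot

print(merge("SKU-123",
            {"price": 49.99},     # desktop scrape, this morning
            {"sentiment": 0.72},  # review platform, yesterday
            {"stock": 14}))       # inventory report, last week
```

Notice the timestamps in the comments: even a clean merge is a merge of data collected at different moments, which is where the next problem starts.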
But the temporal dimension breaks everything. Amazon adjusts prices every few minutes. Even if you could check multiple times daily, a full pass across every vendor would take months, and by the time you finished, the information you collected at the start would already be wrong. You're always working with stale data because the collection cycle never completes.
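The cycle math follows directly from the figures in the opening; the single-analyst, eight-hour-day framing is just for scale.

```python
# Back-of-the-envelope cycle math using the figures above;
# the single-analyst, eight-hour-day assumptions are illustrative.
PAGES = 50_000
SECONDS_PER_PAGE = 30
WORKDAY_SECONDS = 8 * 3600  # one analyst, no breaks

total_hours = PAGES * SECONDS_PER_PAGE / 3600
cycle_days = PAGES * SECONDS_PER_PAGE / WORKDAY_SECONDS
print(f"{total_hours:.0f} hours per pass")        # 417 hours
print(f"{cycle_days:.0f} working days per pass")  # ~52 days: months, not minutes
```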
So retailers made pricing decisions blind. They ran promotions without competitive context. They managed inventory based on quarterly snapshots. Not because they didn't understand the value of real-time data, but because comprehensive monitoring "can seem impossible."
When enterprises realize they could actually know things they've been operating blind on, something shifts. The conversation stops being about automation capabilities and becomes about recognition: "Wait, we've been making decisions without this information?"
The work that stayed undone was never a mystery. It existed; the information was theoretically available. But until infrastructure made collecting it economically viable, certain questions simply couldn't be answered. The numbers never added up.