Web crawler: Building a futuristic data service for a major financial operator
Synechron’s client is a financial services operator and data provider that spent a significant amount of time gathering data from EDGAR filings, S-1, 10-K, 10-Q, ESG reports, corporate actions, other filings and official company reports to inform its indexes, ETFs and corporate data sales. This information gathering was manual or, at best, semi-automated. As technology advanced, the client was looking to make the process more efficient by bringing automation to data acquisition.
Synechron was engaged by the client to automate data acquisition and help build a central repository and metadata store. Based on a list of 100 sites identified for the pilot, Synechron built a smart web crawler to acquire documents from the web for the central repository and validate the accuracy of each document against the requirements.
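The case study does not describe the implementation, but the acquisition-and-validation flow can be illustrated with a minimal sketch. The example below assumes one pilot source is the public SEC EDGAR submissions API; the site list, the validation rule and the repository layout are illustrative placeholders, not the client's actual setup.

```python
"""Minimal, illustrative sketch of document acquisition and validation.
The EDGAR endpoint is real; the pilot list, validation rule and local
'central repository' folder are assumptions for demonstration only."""
import json
import pathlib

import requests

# SEC asks automated clients to identify themselves; contact string is a placeholder.
HEADERS = {"User-Agent": "example-crawler contact@example.com"}

# Hypothetical pilot entry: company CIK and the filing types required.
PILOT_SITES = [
    {"cik": "0000320193", "wanted_forms": {"10-K", "10-Q"}},  # Apple Inc., as an example
]

REPO_DIR = pathlib.Path("central_repository")
REPO_DIR.mkdir(exist_ok=True)


def fetch_filing_index(cik: str) -> dict:
    """Pull the submissions index for one company from EDGAR."""
    url = f"https://data.sec.gov/submissions/CIK{cik}.json"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    return resp.json()


def validate(form_type: str, wanted_forms: set) -> bool:
    """Stand-in for the 'accuracy versus requirements' check: here we only
    confirm the filing type is on the required list."""
    return form_type in wanted_forms


def crawl() -> None:
    for site in PILOT_SITES:
        index = fetch_filing_index(site["cik"])
        recent = index["filings"]["recent"]
        for form, accession, doc in zip(
            recent["form"], recent["accessionNumber"], recent["primaryDocument"]
        ):
            if not validate(form, site["wanted_forms"]):
                continue
            # Store a metadata record for the document in the repository.
            record = {
                "cik": site["cik"],
                "form": form,
                "accession": accession,
                "document": doc,
            }
            (REPO_DIR / f"{accession}.json").write_text(json.dumps(record, indent=2))


if __name__ == "__main__":
    crawl()
```

In practice each of the 100 pilot sites would need its own fetch-and-validate logic behind a common interface, with the metadata store replacing the local JSON files used here.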
We provided the following services:
A global team with a laser focus