/ / Technology

Web crawler: Building a futuristic data service for a major financial operator

Synechron’s client is a financial services operator and data provider that spent a significant amount of time gathering the data needed, across EDGAR filings (S-1, 10-K, 10-Q), ESG disclosures, corporate actions, other filings and official company reports, to inform its indexes, ETFs and corporate data sales. This information gathering was manual, or semi-automated at best. As technology advanced, the client looked to bring more efficiency and automation to the data acquisition process.

Synechron was engaged by the client to automate data acquisition and help build a central repository and metadata store. Based on a list of 100 sites identified for the pilot, Synechron built a smart web crawler to acquire documents from the web for the central repository and validate the accuracy of each document against the requirements.
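The validation step described above can be sketched in a few lines. This is a hypothetical illustration only: the required metadata fields, filing types and example values below are assumptions for demonstration, not details of the actual engagement.

```python
import re

# Assumed requirements for the pilot (illustrative, not the real spec).
REQUIRED_FIELDS = {"filing_type", "company", "filing_date", "source_url"}
KNOWN_FILING_TYPES = {"S-1", "10-K", "10-Q", "8-K"}

def validate_document(metadata: dict) -> list:
    """Return a list of validation errors for an acquired document;
    an empty list means the document meets the requirements."""
    errors = []
    missing = REQUIRED_FIELDS - metadata.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    filing_type = metadata.get("filing_type")
    if filing_type and filing_type not in KNOWN_FILING_TYPES:
        errors.append(f"unexpected filing type: {filing_type}")
    date = metadata.get("filing_date", "")
    if date and not re.fullmatch(r"\d{4}-\d{2}-\d{2}", date):
        errors.append(f"bad date format: {date}")
    return errors

doc = {"filing_type": "10-K", "company": "ExampleCorp",
       "filing_date": "2020-02-28", "source_url": "https://example.com/10k"}
print(validate_document(doc))  # [] — document passes validation
```

Documents that fail these checks would be routed to an analyst for review, rather than loaded into the central repository automatically.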

Key Team Member: Vinod Kumar - Specialist, Technology

We provided the following services:


  • Scrapy, an open-source Python web-crawling framework
  • Natural Language Processing (NLP)
  • Amazon Web Services (AWS) Managed Services Cloud-based platform
  • Enterprise Data Model
  • Reporting and Information Aggregation
  • Metadata Solution
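At its core, the crawling step is link discovery: fetch a page, extract the links to filings and reports, and queue them for download. The production system used Scrapy; the standard-library sketch below only illustrates the idea on a hardcoded page, and the URLs are invented examples.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collect absolute URLs from <a href="..."> tags on a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative links against the page's base URL.
                    self.links.append(urljoin(self.base_url, value))

page = ('<html><body><a href="/filings/10-K.pdf">10-K</a>'
        '<a href="https://example.com/esg">ESG report</a></body></html>')
extractor = LinkExtractor("https://example.com/company")
extractor.feed(page)
print(extractor.links)
# → ['https://example.com/filings/10-K.pdf', 'https://example.com/esg']
```

In a framework like Scrapy, this extraction is handled by a spider's parse callback, which yields both the documents to store and the follow-up requests to crawl next.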
/ / Results

Standardized the service delivery model for technology across the organization

  • Reduction in data acquisition time
  • Reduced manual analyst review time
  • Streamlined document review process

A global team with a laser focus

Interested in joining us?

See our current openings

How we’ve helped our clients achieve their transformation goals for other large-scale, global programs