Synechron

/ / TECHNOLOGY

Web crawler
Building a futuristic data service for a major financial operator

Synechron’s client is a financial services operator and data provider that spent a significant amount of time gathering the data it needs from EDGAR filings (S-1, 10-K, 10-Q), ESG disclosures, corporate actions, and other filings and official company reports to inform its indexes, ETFs, and corporate data sales. This information gathering was manual, or semi-automated at best. As technology advanced, the client looked to bring more efficiency and automation to the data acquisition process.

Synechron was engaged by the client to automate data acquisition and help build a central repository and metadata store. Based on a list of 100 sites identified for the pilot, Synechron built a smart web crawler to acquire documents from the web for the central repository and to validate the accuracy of each document against the requirements.
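As a sketch of that validation step, the crawler might check each acquired document's metadata against the pilot requirements before admitting it to the central repository. The field names, accepted filing types, and rules below are illustrative assumptions, not the client's actual schema:

```python
# Illustrative sketch of the document-validation step: check an acquired
# filing's metadata against pilot requirements before it enters the
# central repository. Field names and rules are assumptions.

REQUIRED_FIELDS = {"company", "filing_type", "filing_date", "source_url"}
ACCEPTED_FILING_TYPES = {"S-1", "10-K", "10-Q", "ESG", "CORPORATE_ACTION"}

def validate_filing(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    filing_type = record.get("filing_type")
    if filing_type is not None and filing_type not in ACCEPTED_FILING_TYPES:
        errors.append(f"unsupported filing type: {filing_type}")
    return errors

# A well-formed 10-K record passes; a record missing its date does not.
ok = validate_filing({
    "company": "Example Corp",
    "filing_type": "10-K",
    "filing_date": "2021-02-15",
    "source_url": "https://example.com/filings/10-K.html",
})
bad = validate_filing({"company": "Example Corp", "filing_type": "8-K"})
```

In a Scrapy-based pipeline, a check like this would typically run in an item pipeline, so documents that fail validation are flagged for analyst review instead of being stored silently.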

Key Team Member: Vinod Kumar - Specialist, Technology
/ / WHAT WE DELIVERED

We provided the following services:

 

  • Scrapy, the open-source Python web crawling framework
  • Natural Language Processing (NLP)
  • Amazon Web Services (AWS) Managed Services Cloud-based platform
  • Enterprise Data Model
  • Reporting and Information Aggregation
  • Metadata Solution
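As a rough sketch of how the enterprise data model and metadata solution might represent one acquired document, the record below uses assumed field names; the client's actual model is not public:

```python
from dataclasses import dataclass, field, asdict
from datetime import date

# Hypothetical metadata record for one acquired document; field names
# are illustrative, not the client's actual enterprise data model.
@dataclass
class FilingMetadata:
    company: str
    cik: str                # SEC Central Index Key of the filer
    filing_type: str        # e.g. "10-K", "10-Q", "S-1"
    filing_date: date
    source_url: str
    tags: list[str] = field(default_factory=list)  # e.g. NLP-derived topics

record = FilingMetadata(
    company="Example Corp",
    cik="0000000000",
    filing_type="10-K",
    filing_date=date(2021, 2, 15),
    source_url="https://example.com/filings/10-K.html",
    tags=["annual-report"],
)
# asdict() flattens the record into a plain dict for reporting and aggregation.
row = asdict(record)
```

A uniform record like this is what makes the reporting and information-aggregation layer possible: every crawled source maps into the same shape before it reaches the metadata store.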
/ / RESULTS

Standardized the service delivery model for technology across the organization

  • Reduction in data acquisition time
  • Reduced manual analyst review time
  • Streamlined document review process

We're ready to get started. Are you?

Get in touch and we can connect you with the right people.