About Client
Business Need
- Comprehensive Data Collection: Gather marketing data from over 150 different sources, including APIs, emails, FTP, and more, to provide thorough insights.
- Data Standardization: Convert raw, diverse data into a consistent and meaningful format for analysis.
- Centralized Data Storage: Load transformed data into a central data warehouse for easy access and analysis.
- Scalable Ingestion and Loading: Efficiently scale the ingestion and loading processes to handle growing volumes of data and new data sources.
- Data Privacy: Ensure all personal data is protected and compliant with privacy regulations like CCPA, CPRA, and GDPR.
Challenges
- Variety of Data Sources: Manage many different types of data sources and structures to collect the necessary data. API sources alone vary widely in nature: single API calls, dependent API calls, paginated API calls, rate-limited APIs, and so on.
- Scalability: Develop a system that can easily scale to support increasing data volumes and new client instances with minimal effort.
- Timely Data Delivery: Ensure data is delivered on time, meeting service level agreements.
- Large Data Volumes: Efficiently handle and process large amounts of data without system overloads.
- Long-Term Data Tracking: Accurately track marketing data over time to account for delayed user actions. An ad clicked by a user on 25th Feb may lead to a purchase on 1st March. Simply pulling in data for the latest day every time would not give us accurate results.
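A common way to handle delayed actions is to re-pull a rolling lookback window on every run instead of only the latest day. A minimal sketch follows; the 30-day default below is an illustrative assumption, not the client's actual setting:

```javascript
// Sketch: compute a rolling lookback window for each incremental pull, so
// conversions attributed to older clicks are re-fetched and updated.
// The 30-day default is illustrative; real window sizes vary by vendor.
function lookbackWindow(runDate, lookbackDays = 30) {
  const end = new Date(runDate);
  const start = new Date(end);
  start.setUTCDate(start.getUTCDate() - lookbackDays);
  return {
    start: start.toISOString().slice(0, 10),
    end: end.toISOString().slice(0, 10),
  };
}

// A pull on March 1st re-fetches data back to January 31st, so a purchase
// on March 1st is still attributed to an ad clicked on February 25th.
console.log(lookbackWindow('2024-03-01')); // → { start: '2024-01-31', end: '2024-03-01' }
```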
Our Solution
We’ve crafted a comprehensive data ingestion solution featuring 150+ integrations and low-code workflows for seamless API management. With intelligent rate throttling, sophisticated data preprocessing, and rapid loading mechanisms, the architecture is resilient and scalable, while a secure Clean Room deployment option keeps sensitive data private and handling efficient.
Extensive Integration Support: Support for 150+ integrations spanning search, sales, and analytics endpoints.
Low Code Workflows: Repeatedly writing code to call APIs is cumbersome. We built a low-code platform with ready-made, reusable components covering a variety of API strategies:
- Single API Calls: Effortlessly retrieve relevant data with a single API call.
- Paginated APIs: Streamline data retrieval by efficiently handling paginated responses, ensuring all relevant data is captured.
- Dependent APIs: Manage complex data dependencies by orchestrating sequential API calls to gather comprehensive insights.
- Asynchronous Data Fetching: Seamlessly manage APIs that return data asynchronously by tracking job statuses and fetching data upon completion.
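As an illustration of the paginated strategy, here is a minimal sketch of a reusable component that follows a cursor until the API reports no more pages. `fetchPage` stands in for a real HTTP client; the mocked pages are purely illustrative:

```javascript
// Sketch of the paginated-API strategy: follow a cursor until the API
// reports no more pages. `fetchPage` is a stand-in for a real HTTP client.
async function fetchAllPages(fetchPage) {
  const rows = [];
  let cursor = null;
  do {
    const page = await fetchPage(cursor); // { rows: [...], nextCursor: string|null }
    rows.push(...page.rows);
    cursor = page.nextCursor;
  } while (cursor !== null);
  return rows;
}

// Mocked three-page API for illustration.
const pages = {
  p1: { rows: [1, 2], nextCursor: 'p2' },
  p2: { rows: [3, 4], nextCursor: 'p3' },
  p3: { rows: [5], nextCursor: null },
};
fetchAllPages(async (cursor) => pages[cursor ?? 'p1'])
  .then((rows) => console.log(rows)); // → [ 1, 2, 3, 4, 5 ]
```

The same loop shape extends to dependent APIs: the response of one step simply becomes the input of the next call.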

System Diagram
- Versatile Authentication: Support API key integration, OAuth 2.0, basic authentication, and complex scenarios like Amazon SP-API. Securely store and dynamically retrieve API credentials for robust data security.
- Intelligent Rate Limit Management: Employ retries with exponential backoff and counting-semaphore mechanisms to stay within rate limits and ensure uninterrupted data flow.
- Sophisticated Data Preprocessor with Streaming Capabilities: Tackle diverse formats like JSON, CSV, JSONL, and fixed-width files, incorporating transformations such as datetime conversions, regex replacements, mathematical operations, and more. Additionally, support streaming to efficiently process large files during transformation.
- Fast Data Loading: Rapidly load data into the data warehouse and implement deduplication mechanisms to ensure data integrity.
- Enhanced Data Privacy: For utmost data security and privacy, we offered a clean room deployment option that keeps sensitive data in the customer’s cloud.
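The retry-with-exponential-backoff approach mentioned above can be sketched as follows; delays and retry counts are illustrative, and in practice they are configurable per API:

```javascript
// Sketch of retry with exponential backoff for rate-limited APIs.
// The delay doubles on each failed attempt: base, 2x, 4x, ...
async function withBackoff(fn, { retries = 5, baseMs = 100 } = {}) {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= retries) throw err; // give up after the last retry
      const delay = baseMs * 2 ** attempt;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
}

// Example: a call that fails twice with HTTP 429 before succeeding.
let calls = 0;
withBackoff(async () => {
  calls++;
  if (calls < 3) throw new Error('429 Too Many Requests');
  return 'ok';
}, { baseMs: 1 }).then((result) => console.log(result, calls)); // → ok 3
```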
Architecture Overview

The architecture utilizes a blend of AWS services and open-source technologies to establish a resilient and scalable Data Ingestion Framework. Through integration with Redis, MySQL, and AWS services like S3, Redshift, ECS, and CloudWatch, the system ensures smooth data ingestion. Below, we outline the key components of the framework:
Low-Code Step Executor
- Interpret and execute a JSON-driven configuration outlining the data flow across multiple steps.
- Employ MySQL to store customer-specific information that serves as input to the workflow.
- Enable the execution of straightforward JavaScript expressions based on the response from each step.
- Capable of conducting paginated API calls and handling dependent API calls.
- Configurable parameters for request throttling.
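To make the idea concrete, here is a minimal sketch of a JSON-driven step executor. The `{{steps.x.y}}` template syntax, component names, and workflow shape are illustrative assumptions, not the platform's actual schema:

```javascript
// Resolve simple template expressions like {{steps.acct.accountId}}
// against the accumulated workflow context (simplified: string results only).
function resolve(template, ctx) {
  return template.replace(/\{\{(.+?)\}\}/g, (_, path) =>
    path.trim().split('.').reduce((obj, key) => obj[key], ctx));
}

// Execute steps in order; each step names a reusable component and can
// reference inputs or earlier step responses in its arguments.
async function runWorkflow(config, components, inputs) {
  const ctx = { inputs, steps: {} };
  for (const step of config.steps) {
    const args = {};
    for (const [k, v] of Object.entries(step.args)) {
      args[k] = typeof v === 'string' ? resolve(v, ctx) : v;
    }
    ctx.steps[step.id] = await components[step.component](args);
  }
  return ctx;
}

// Illustrative two-step (dependent) workflow: fetch an account id, then its report.
const components = {
  getAccount: async ({ customer }) => ({ accountId: `acct-${customer}` }),
  getReport: async ({ account }) => ({ rows: [account, 'row1'] }),
};
const config = {
  steps: [
    { id: 'acct', component: 'getAccount', args: { customer: '{{inputs.customer}}' } },
    { id: 'report', component: 'getReport', args: { account: '{{steps.acct.accountId}}' } },
  ],
};
runWorkflow(config, components, { customer: '42' })
  .then((ctx) => console.log(ctx.steps.report.rows)); // → [ 'acct-42', 'row1' ]
```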
Credential Manager
- Leverage the AWS KMS service to encrypt secrets like passwords, API keys, and refresh tokens, and store them in MySQL.
Resource Manager
- Implemented a distributed counting semaphore using Redis for job management, ensuring a limited number of jobs per API.
- Utilized account-based locks for vendors with shared accounts and customer-based locks for vendors with individual customer accounts.
- Each job acquired a lock upon execution, preventing additional jobs from starting and conserving resources.
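The semantics of the counting semaphore can be sketched in-memory as below; in the actual system this state lives in Redis so that workers on different hosts observe the same counts:

```javascript
// In-memory sketch of the counting semaphore's semantics. In production the
// counter lives in Redis so jobs on different hosts share the same limit.
class CountingSemaphore {
  constructor(limit) {
    this.limit = limit;   // max concurrent jobs for one API / account
    this.held = 0;
    this.waiters = [];
  }
  async acquire() {
    if (this.held < this.limit) { this.held++; return; }
    await new Promise((resolve) => this.waiters.push(resolve)); // park the job
    this.held++;
  }
  release() {
    this.held--;
    const next = this.waiters.shift();
    if (next) next(); // wake the oldest waiting job
  }
}

// At most two jobs per API run concurrently; the third waits for a release.
const sem = new CountingSemaphore(2);
(async () => {
  await sem.acquire();
  await sem.acquire();
  const third = sem.acquire().then(() => console.log('third job started'));
  sem.release(); // a running job finishes, unblocking the third
  await third;
})();
```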
Data Preprocessor
- Utilized a configuration-driven data preprocessor for transforming data and files into standard CSV format on S3.
- Configurations stored in MySQL specified data column expectations and extraction methods for the Data Preprocessor input.
- Employed Node.js streams to efficiently manage large datasets without memory accumulation.
Data Loader
- Harnesses the power of Redshift’s COPY command to swiftly load large CSV files, ensuring optimal performance.
- Implements a deduplication mechanism to eliminate duplicate records from the final dataset, enhancing data integrity.
- Prevents potential deadlocks by employing a Redis-based semaphore approach, ensuring exclusive access to tables during data loading.
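The load-and-deduplicate pattern can be sketched as below: COPY into a staging table, insert only rows absent from the target, then clear staging. The table names, key columns, and IAM role are placeholder assumptions, not the client's actual values:

```javascript
// Sketch: generate the Redshift statements for a staged, deduplicated load.
// All identifiers below are illustrative placeholders.
function buildLoadStatements({ target, staging, s3Uri, iamRole, keyColumns }) {
  const joinCond = keyColumns.map((c) => `t.${c} = s.${c}`).join(' AND ');
  return [
    // 1. Bulk-load the CSV from S3 into staging via COPY (fast, parallel).
    `COPY ${staging} FROM '${s3Uri}' IAM_ROLE '${iamRole}' FORMAT AS CSV IGNOREHEADER 1;`,
    // 2. Insert only rows whose keys are not already in the target table.
    `INSERT INTO ${target} SELECT s.* FROM ${staging} s ` +
      `LEFT JOIN ${target} t ON ${joinCond} WHERE t.${keyColumns[0]} IS NULL;`,
    // 3. Clear staging for the next batch.
    `TRUNCATE ${staging};`,
  ];
}

const stmts = buildLoadStatements({
  target: 'marketing.ad_events',
  staging: 'marketing.ad_events_staging',
  s3Uri: 's3://example-bucket/ingest/ad_events.csv',
  iamRole: 'arn:aws:iam::123456789012:role/redshift-copy',
  keyColumns: ['event_id'],
});
stmts.forEach((s) => console.log(s));
```

Running the three statements inside one transaction, while a Redis-based semaphore holds the table lock, gives both deduplication and deadlock avoidance.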
Clean Room
- A secure deployment environment within the customer’s cloud infrastructure to safeguard sensitive data.
- Data is initially ingested into the customer’s cloud, undergoes encryption of personally identifiable information (PII), and is subsequently transferred to our cloud environment for further processing.
Business Impact
- Scalable Solution: Supports growing data needs and new client requirements with ease.
- Enhanced Decision-Making: Provides a comprehensive, accurate, and up-to-date view of marketing performance, empowering better media investment decisions.
- Improved Compliance: Ensures all data handling practices are in line with the latest privacy regulations, reducing the risk of legal issues.
- Increased Agility: The flexible and scalable architecture allows for quick adaptation to new data sources and changing business needs.
- Competitive Advantage: The ability to integrate and analyze data from a vast array of sources gives the customer’s clients a significant edge in understanding and optimizing their marketing efforts.
- Cost Efficiency and Scalability: Optimizing costs and ensuring scalability to handle growing data volumes efficiently.

Technology

Tech Prescient was very easy to work with and was always proactive in their response.

The team was technically capable, well-rounded, nimble, and agile. They had a very positive attitude to deliver and could interpret, adopt and implement the required changes quickly.

Amit and his team at Tech Prescient have been a fantastic partner to Measured.

We have been working with Tech Prescient for over three years now and they have aligned to our in-house India development efforts in a complementary way to accelerate our product road map. Amit and his team are a valuable partner to Measured and we are lucky to have them alongside us.

We were lucky to have Amit and his team at Tech Prescient build the CeeTOC platform from the ground up.

Having worked with several other services companies in the past, the difference was stark and evident. The team was able to meaningfully collaborate with us during all the phases and deliver a flawless platform which we could confidently take to our customers.

We have been extremely fortunate to work closely with Amit and his team at Tech Prescient.

The team will do whatever it takes to get the job done and still deliver a solid product with utmost attention to detail. The team’s technical competence on the technology stack and the ability to execute are truly commendable.
