Job ID: | J318731 |
Job Title: | Senior Data Engineer |
Client: | To Be Discussed Later |
Location: | New York City, New York |
Contract Duration: | 4 Months |
Hourly Rate: | $78/hr Corp-to-Corp Contract |
Experience Level: | 8+ Years |
Interview Type: | Phone |
Domain Exposure: | Banking & Finance, IT/Software |
Work Authorization: | US Citizen, Green Card, H-1B, GC-EAD, H4-EAD, L2-EAD, TN Visa, E-3 Visa |
Preferred Employment: | Corp-To-Corp Contract, 1099 Contract |
Current Status: | Open |
LOCATION: • New York City
CAN THE POSITION BE REMOTE? • No. Work will be done on-site at the client location.
CONTRACT LENGTH: • 4 Months
START DATE: • On or before 2/13
CLIENT BRIEFING: Caserta Concepts is looking for 3 candidates with industry experience in the financial services / equities space to join their project team. Below is the proposed CONCEPTUAL solution architecture for the project:
• Ingestion
  o Pull: FTP retrieval
  o Push: Enable secure push to AWS S3 bucket (if needed)
• Bucket Structure
  o Landing, Archive, Test, Config, Reject, Datalake
• ETL Processing
  o Spark (via Databricks)
  o Job Control, Job Dependencies, Job Tracking
  o Metadata catalog
  o Data processing design patterns
• Orchestration, Workflow & Automation
• Architecture Components
  o AWS S3, EC2
  o Databricks (Spark, Orchestration, Workflow & Job Scheduling)
  o AWS Redshift (Data presentation)
  o Query & UI – TBD, potentially QuickSight, Looker, etc.
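The ETL outline above calls for job control with explicit job dependencies and tracking. As a rough illustration only (the job names and the dependency graph are hypothetical, not taken from the briefing), a dependency-aware run order for such a pipeline can be derived with a standard-library topological sort:

```python
from graphlib import TopologicalSorter

# Hypothetical job dependency graph sketching the ingestion/ETL flow
# described above: each key runs only after all its prerequisites finish.
jobs = {
    "land_ftp_files": set(),               # Pull: FTP retrieval into Landing
    "validate_files": {"land_ftp_files"},  # route bad files to Reject
    "archive_files": {"validate_files"},   # copy originals to Archive
    "spark_transform": {"validate_files"}, # ETL processing (Spark via Databricks)
    "load_redshift": {"spark_transform"},  # load the data presentation layer
}

# static_order() yields every job exactly once, respecting all dependencies.
run_order = list(TopologicalSorter(jobs).static_order())
print(run_order)
```

In practice this scheduling would live in the Databricks workflow/job-scheduling layer rather than hand-rolled code; the sketch only shows the dependency-ordering idea.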
Senior Data Engineer
Responsibilities:
• Perform process analysis, source system analysis, and data profiling
• Translate solution requirements to design
• Define solution processes and data transformations and implement utilizing Spark
• Implement technologies and tools in support of development of the M&A Shareholder Analytics platform

Qualifications:
• Must have financial services / equities / data exposure
• Must have a minimum of 8 years’ experience