| Job ID: | J317166 |
| Job Title: | Hadoop Data Integration Developer |
| Client: | To Be Discussed Later |
| Location: | SAN FRANCISCO, California |
| Contract Duration: | 7 Months |
| Hourly Rate: | $42/hr Corp-to-Corp Contract |
| Experience Level: | 7+ Years |
| Interview Type: | Phone + Skype |
| Domain Exposure: | IT/Software |
| Work Authorization: | US Citizen, Green Card, H-1B, GC-EAD, H4-EAD, OPT-EAD, L2-EAD, TN Visa, E-3 Visa, CPT |
| Preferred Employment: | Corp-To-Corp Contract, 1099/ Contract |
| Current Status: | Open |
Local candidates preferred. Non-local candidates must be willing to cover their own interview travel expenses and relocation costs.

Role: Hadoop Data Integration Developer
No. of Positions: 5
Duration: 7 months
Rate: $42/hr on C2C (MAX)
Location: SAN FRANCISCO, CA
Job Description:
Hadoop Data Integration Developer / Engineer.
- Experience with big-data technologies such as Hadoop/Hive, MongoDB, or other NoSQL data stores.
- Strong experience with traditional RDBMS systems such as Oracle and Teradata.
- 3+ years of Java ETL / data integration experience.
- 2+ years of experience working with a Hadoop ecosystem distribution: Cloudera, Hortonworks, or MapR (preferred).
- Comfortable working in a Linux environment.
- Experience with scripting: shell scripting, Python, etc.
- Profile and analyze large volumes of source-system data, including structured and semi-structured data.
- Work with data originators to analyze gaps in the collected data.
- Expert-level SQL coding/querying skills are a must.
- Conduct ETL performance tuning, troubleshooting, and support.
- Must be comfortable working in a fast-paced, flexible environment and take the initiative to learn new tools quickly.
- Strong understanding of Unix operating systems and Unix security paradigms.
- Excellent communication skills and experience with first-level support.