Position: AWS Data Lake Architect
Location: Columbus, OH
Duration: Long Term

Required Skills for the Position:
- AWS Redshift Architect with very strong Redshift design and implementation experience. Must have architected solutions that move disparate data from multiple sources into a data lake and curate the data for consumption in operational and analytical reporting.
- Very familiar and comfortable with data modeling (logical and physical) as well as best practices for S3 and Redshift database performance optimization design.
- Experience with Informatica IICS would be great.
- Ideally based in the Raleigh area, but some level of travel is acceptable if necessary.
- A minimum of 10 years of experience building and architecting enterprise-class large data warehouses and ETL, both on premise and in the cloud.
- At least 5 years of hands-on experience with Big Data cloud platforms such as AWS Redshift (must have) and Google Cloud Platform (GCP).
- Strong experience with Extract, Transform, Load (ETL) or ELT data ingestion and data pipelines.
- Strong experience with data modeling, design patterns, and building highly scalable Big Data solutions and distributed applications.
- Experience with programming/scripting languages such as Java, Python, Scala, or R (any combination).
- Hands-on development experience with open-source big data components such as Hadoop (Hive, Pig, Spark, Kafka, Sqoop, etc.); Amazon Redshift is a must have, and Google BigQuery is preferred.
- Expertise with multiple AWS services and hands-on AWS experience, with a minimum of one to two referenceable implementations at enterprise scale.
- Extensive knowledge of designing and configuring AWS services for data warehouse migration from on premise to the cloud.
- Strong understanding of and experience with AWS IaaS and PaaS services such as RDS, Redshift, EC2, S3, and Lambda.
- Hands-on with the core AWS platform and security architecture, including service account design, virtual private cloud and network design, subnets, and segmentation strategies.
- Good understanding of AWS security services such as identity and access management (IAM), role policies, Key Management Service (KMS), and audit logging.
- Experience with data integration technologies like Informatica Cloud Services or SnapLogic is a huge plus.
- Candidates should possess a Bachelor's degree and/or have a minimum of 12 years of industry experience, including 8 years of metadata management, data governance, and data management experience.