This job is right for you if you like:
- Solving problems that make a real difference in people’s lives and wellbeing.
- Rapid growth and the ability to make a personal, direct impact on strategy/execution.
- Guiding innovative products, services, and processes from initial concept to user adoption.
- Rockstar teammates: an unparalleled team with decades of prior work experience in artificial intelligence, software systems, molecular biology, clinical oncology, clinical and regulatory operations, and related fields
xCures helps patients and their doctors beat advanced cancer. We operate an AI-assisted platform that provides highly tailored, up-to-date information regarding the optimal therapies to consider in specific patient cases. In doing so, xCures’ platform prospectively generates real-world evidence for clinical studies and decentralized trials.
About the role
Reporting to the VP of Engineering, the AWS Data Architect is a member of the Data Science and Engineering team. This role will require expertise in the design, creation, management, and business use of large datasets, across a variety of data platforms, both internal and external.
The AWS Data Architect will work with Software Engineers and the business leaders responsible for different xCures data assets to understand data requirements and build ETL pipelines that ingest data into structures suited to analysis, data science, and reporting. The successful candidate will be an authority on designing, implementing, and operating stable, scalable, cost-effective solutions that flow data from production systems into a data lake. Above all, this position calls for a passion for working with very large-scale datasets and a genuine enthusiasm for bringing data together to answer business questions and drive development and growth.
Essential Duties and Responsibilities:
- Work with multiple business teams to develop, optimize, and maintain xCures Data Science and Engineering data assets.
- Design, construct, and test data models, including large-scale data repositories optimized for reporting and data as a service that may influence or drive architectural changes.
- Architect and implement AWS services and features such as S3, Redshift, Lambda, Glue, Data Pipeline, CloudFormation, and EMR to support efficient data ingestion, processing, transformation, and storage.
- Coordinate activities with data source application owners to ensure optimum integration, data integrity, and data quality.
- Configure and optimize AWS resources to ensure efficient data processing and analysis, including performance tuning, workload management, and query optimization.
- Implement and maintain data governance and security measures, including encryption, access controls, and compliance with relevant regulations (e.g., GDPR, HIPAA).
- Develop and maintain scripts and automation tools for deploying, monitoring, and maintaining AWS-based data infrastructure.
- Ensure production service levels, performance quality, and resolution of data load failures.
- Translate business requirements into ETL designs and mapping specifications, with an emphasis on maintaining accurate information on data provenance.
- Build and maintain AWS infrastructure components such as VPCs, subnets, security groups, IAM roles, and policies to support data architecture initiatives.
- Coordinate with the xCures security team to ensure all data asset security requirements are met.
- Collaborate with xCures business teams to plan new features, as needed.
- Participate in planning and scoping meetings for future projects.
- Document and communicate technical designs, processes, and guidelines to ensure effective knowledge sharing and collaboration within the team.
- Mentor and train fellow team members on technologies, design patterns, and best practices.
- Keep abreast of industry trends and present findings to team, leadership, and stakeholders.
- Research and resolve issues in a timely manner, identifying root cause and implementing sound technical resolutions.
- Other duties may be assigned.
Worksite Location: Fully remote; occasional travel
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.
Required Skills and Qualifications:
- Proven experience as a Data Architect or Data Engineer with a strong focus on managing configuration and workloads on AWS.
- In-depth understanding of AWS cloud services and architecture, particularly for data engineering and analytics use cases.
- Expertise in deploying and managing AWS resources using infrastructure-as-code tools such as CloudFormation or Terraform.
- Proficiency in programming languages such as Python, Java, Scala, or TypeScript for data processing, scripting, and automation tasks.
- Hands-on experience with relational database systems such as PostgreSQL, MySQL, or SQL Server.
- Hands-on experience with AWS data services like RDS, DynamoDB, S3, Redshift, Glue, Athena, EMR, and Data Pipeline.
- Familiarity with data warehousing concepts and experience with data modeling techniques.
- Solid understanding of SQL and experience with query optimization and performance tuning in AWS environments.
- Knowledge of data governance, security, and compliance practices in cloud environments.
- Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
- Excellent communication and interpersonal skills, with the ability to explain complex technical concepts to both technical and non-technical stakeholders.
Education and/or Experience:
Bachelor’s degree from a four-year college or university plus 4 years of relevant experience, or a master’s degree in Computer Science; or an equivalent combination of education, training, and experience. A BS in Computer Science is preferred.
- Experience working with large-scale databases and with the collection and organization of real-time event streaming data.
- Experience with dimensional, entity-relationship, tabular, and OLAP data modeling.
- A proven track record in delivering in an agile environment, while managing multiple priorities.
- Practical experience with Continuous Integration/Continuous Deployment (CI/CD).
- Experience with Git/Github or comparable distributed version control system.
- Experience working with AWS CodePipeline.
- AWS certifications such as AWS Certified Big Data – Specialty, AWS Certified Data Analytics – Specialty, or AWS Certified Solutions Architect are a plus.
If you possess the required skills and have a passion for leveraging AWS technologies to architect scalable and robust data solutions, we would love to hear from you. Join our team and contribute to shaping the future of our data infrastructure with cutting-edge technologies and innovative approaches.
xCures offers a flexible and affordable benefits program designed to help you be well, including medical, dental, & vision coverage, vacation & sick time, holiday pay, and a 401(k) plan.
Salary range: $120,000 – $180,000
xCures is an equal opportunity employer valuing workforce diversity.
To apply, please send your cover letter and resume to: firstname.lastname@example.org