Software Development Engineer, BDT Compute-Engine

Amazon Jobs | Detroit, MI


Posted Date 10/04/2021
Description

Job summary
The BDT/eCF team is looking for an engineer with a solid technical background to join the engineering team. Amazon’s technology connects millions of businesses of all sizes to hundreds of millions of customers across the Amazon.com marketplaces worldwide. Our platform, at Amazon scale, enables customers to process native SQL, machine learning, and other functional transformations using Apache Spark, Scala, Java, and Python, with plans to use Apache Beam runners, Flink, Ray, and related technologies to build unified compute for batch, streaming, and ML processing. These transformations execute over schema’d data stored in S3 and seamlessly write the resulting curated datasets out to front-end caches such as DynamoDB, Redis, and Elasticsearch. We also enable the same set of functional transforms over streaming data, allowing customers to transition seamlessly between streaming, batch, cache, and analytics as needed to meet customer demand.

The successful candidate will have a background in developing distributed systems, solid technical ability, good communication skills, and the motivation to achieve results in a fast-paced environment.

By submitting your application here, you can apply once to be considered for multiple Software Engineer openings across various Amazon teams. If you pass the initial application review and assessment, you will be asked to submit your career and personal preferences so that our dedicated recruiters can match you to the right role based on these preferences.

Amazon is committed to a diverse and inclusive workplace. Amazon is an equal opportunity employer and does not discriminate on the basis of race, national origin, gender, gender identity, sexual orientation, protected veteran status, disability, age, or other legally protected status.
For individuals with disabilities who would like to request an accommodation, please visit https://www.amazon.jobs/en/disability/us

Key job responsibilities
Build scalable big data engines with a cutting-edge technology and ML stack (Spark, Java, Scala, Notebooks, Python, Flink, Beam, and AWS services such as EMR, EKS, Kinesis, DynamoDB, and SQS), processing and transforming data across data lakes at petabyte scale at Amazon. Work with a team of engineers that relentlessly innovates and pushes the envelope, keeping customers at the center of its universe, continually raising the bar on its high standards and delivering results with velocity.
Specifically, within the BDT-Engine-Triton team, an SDE is responsible for ensuring the team’s software maintains a high bar with respect to quality, security, architecture, and operational excellence. They take the lead on the design and delivery of major features and on re-architecting significant technology components, while engaging with and influencing their team, external teams, partners, and leadership along the way. They identify the root causes of widespread or pervasive issues, including areas where those issues limit innovation and prevent accelerated delivery of projects, while navigating several systems and components they may or may not own. They communicate effectively with their team and others, take calculated risks, anticipate and mitigate long-term risk, and make reasonable tradeoffs when the situation demands it. They mentor less experienced engineers and provide career development opportunities, while offering constructive feedback to their peers. They understand the business impact of decisions and exhibit good judgment when making trade-offs between the team’s short-term technology or operational needs and long-term business needs. Ultimately, they display strong technical acumen and ownership while providing strong leadership for the rest of the team.

A day in the life
A typical day includes attending a daily standup; managing and contributing to your goals, projects, deliverables, and innovations; driving operational excellence; taking a turn on operations duty every 8-10 weeks; and helping improve the customer experience.

About the team
The scope of the primary product, Cradle, involves working at the architecture level, solving ambiguous problems, taking risks and failing fast, and orchestrating larger, more complex projects with partners both internal and external to the organization. Within the product teams, Triton is responsible for the core execution engine and related services and components. Team ownership includes the Dryad Spark Engine (DSE), Spark connectors to Amazon data sources, the Engine Release Label (EaRL) Service, and Dryad Streaming for processing batch and streaming jobs. These services are imperative to the platform’s success: they drive and impact key metrics such as job reliability, adherence to SLAs, accessibility and compatibility with Amazon data sources, and the platform’s overall IMR spend. Cradle executes an average of 1.19 million jobs each week on clusters spread across 10k+ instances. Cradle jobs produce data consumed by data engineers, SDEs, downstream data flows, and S-Team-level reporting processes.

BASIC QUALIFICATIONS

· Programming experience with at least one modern language such as Java, C++, or C#, including object-oriented design
· 1+ years of experience contributing to the architecture and design (design patterns, reliability, and scaling) of new and current systems
· 2+ years of non-internship professional software development experience
· 3+ years of software development experience
· Strong OO analysis, design, and development skills in Java
· Strong verbal and written communication skills
· Works well in a fast-moving team environment and is able to produce solutions with complex dependencies and requirements
· B.S. in Computer Science, a related field, or equivalent work experience

PREFERRED QUALIFICATIONS

· Experience with Spark, Hadoop, REST, Gremlin, TinkerPop, Kafka, Flink, and Beam
· Experience with AWS technologies such as Redshift, EMR, DynamoDB, Kinesis, and EKS
· Experience working with large commercial relational database systems (Oracle, SQL Server)
· Experience developing unit tests using tools such as JUnit, NUnit, or MSTest to verify code quality
· Experience building complex software systems that have been successfully delivered to customers
· Knowledge of professional software engineering practices and best practices for the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
· Strong written and verbal communication skills



Job Type
Full time
