Software Data Engineer
$139,500 - $258,100/year
Role Details
As a Software Data Engineer on the APX Bedrock team, you will be a hands-on contributor building and enhancing the platform services that power data engineering at Apple. You will work closely with senior engineers and architects to deliver reliable, scalable solutions that have a tangible impact on data teams across the organization.

Responsibilities
* Design, develop, and maintain Java-based backend services and APIs that support core platform capabilities, including metadata collection, dataset lifecycle management, and orchestration
* Build and optimize data processing pipelines using distributed computing frameworks
* Design and implement data models that accurately represent complex domain concepts and support efficient querying and storage patterns
* Implement systems for monitoring dataset health, tracking completeness, and enforcing quality standards
* Write clean, well-tested, production-quality code in Java as the primary language, with Scala, Go, or Python as complementary skills
* Own database design decisions, including schema design, indexing strategies, and query optimization for both relational and non-relational data stores
* Participate in operational responsibilities, including on-call rotations, incident response, and system health monitoring
* Collaborate with peer teams across the platform organization to integrate and deliver cohesive capabilities
* Leverage AI-powered development tools to accelerate development and improve code quality
* Engage with internal users to understand their needs and translate feedback into platform improvements

Qualifications
* Bachelor's Degree in Computer Science, Engineering, or equivalent related experience
* 3+ years of hands-on experience building data applications, backend services, or platform services
* Strong, production-level programming experience in Java, including building and maintaining RESTful or RPC-based services
* Solid experience with relational databases, including schema design, data modeling, and query optimization using SQL
* Experience with distributed data technologies such as Apache Spark, Apache Flink, Hadoop, HDFS, or Kafka
* Familiarity with workflow orchestration concepts and tools such as Apache Airflow
* Demonstrated ability to build, test, and operationally maintain reliable software systems in production environments
* Experience with non-relational or NoSQL databases and an understanding of when to apply different data storage paradigms
* Proficiency in Scala, Go, or Python as complementary programming languages
* Familiarity with container orchestration tools such as Kubernetes
* Exposure to AI-powered development tools and practices (e.g., Claude Code, Copilot)
* Experience with data quality monitoring, metadata management, or dataset lifecycle tracking
* Understanding of streaming data architectures and real-time processing patterns
* Experience with event-driven architectures and distributed messaging systems
* Strong communication skills and comfort collaborating across teams
* Curiosity and eagerness to learn new technologies and take on increasing responsibility
About Apple Inc
Apple Inc. is a multinational technology company known for designing and manufacturing consumer electronics, software, and online services, including the iPhone, Mac, iPad, and App Store.
Industry: Consumer Electronics & Software